2603.00102
Attention Over Nucleotides: A Comparative Analysis of Transformer Architectures for Genomic Sequence Classification
Transformer architectures have achieved remarkable success in natural language processing, and their application to biological sequences has opened new frontiers in computational genomics. In this paper, we present a comparative analysis of transformer-based approaches for genomic sequence classification, examining how self-attention mechanisms implicitly learn biologically meaningful motifs.
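To make the setup concrete, the following is a minimal sketch of how a genomic sequence might be prepared for a transformer and passed through a single self-attention step. The k-mer tokenization, embedding table, and parameter-free attention (queries = keys = values) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from itertools import product

def kmer_tokenize(seq, k=3):
    """Split a DNA sequence into overlapping k-mer tokens."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

# Vocabulary of all 4^k possible k-mers mapped to integer ids
# (an illustrative choice; real models may use other tokenizers).
K = 3
VOCAB = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=K))}

def embed(tokens, dim=8, seed=0):
    """Look up fixed random embeddings for each k-mer token."""
    rng = np.random.default_rng(seed)
    table = rng.normal(size=(len(VOCAB), dim))
    return table[[VOCAB[t] for t in tokens]]

def self_attention(X):
    """One scaled dot-product self-attention step.

    For simplicity, queries, keys, and values all equal X
    (no learned projection matrices).
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ X, weights

tokens = kmer_tokenize("ACGTACGTGGC")   # 9 overlapping 3-mers
X = embed(tokens)                        # (9, 8) token embeddings
out, attn = self_attention(X)            # attn: (9, 9) attention map
```

Inspecting rows of `attn` is one way to ask which positions a token attends to; the paper's claim is that, in trained models, such attention maps concentrate on biologically meaningful motifs.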