Higher-Order Modular Attention: Fusing Pairwise and Triadic Interactions for Protein Sequences
arXiv:2603.11133v1 Announce Type: new Abstract: Transformer self-attention computes pairwise token interactions, yet protein sequence-to-phenotype relationships often involve cooperative dependencies among three or more residues that dot-product attention does not capture explicitly. We introduce Higher-Order Modular Attention (HOMA),…
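The truncated abstract does not specify HOMA's formulation, so the following is only a minimal sketch of what a generic triadic (third-order) attention head could look like, assuming a trilinear scoring function that generalizes the pairwise query-key dot product; all names, shapes, and the value-aggregation choice are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch of a triadic attention head (not the HOMA formulation,
# which is not given in the truncated abstract).
import torch
import torch.nn.functional as F

def triadic_attention(x, wq, wk1, wk2, wv1, wv2):
    """x: (L, d) residue embeddings; w*: (d, d_h) projections (illustrative)."""
    q  = x @ wq        # (L, d_h) queries
    k1 = x @ wk1       # (L, d_h) keys for the first partner residue
    k2 = x @ wk2       # (L, d_h) keys for the second partner residue
    v1 = x @ wv1       # (L, d_h) values for the first partner
    v2 = x @ wv2       # (L, d_h) values for the second partner
    L, d_h = q.shape

    # Trilinear score s[i,j,k] = sum_c q[i,c] * k1[j,c] * k2[k,c], scaled as in
    # standard dot-product attention; this extends the pairwise q·k score to
    # a cooperative dependency on two partner residues at once.
    scores = torch.einsum('ic,jc,kc->ijk', q, k1, k2) / d_h ** 0.5

    # Normalize over all (j, k) partner pairs for each query position i.
    attn = F.softmax(scores.reshape(L, L * L), dim=-1).reshape(L, L, L)

    # Aggregate a value built from each residue pair; an elementwise product
    # of the two partner values is one simple choice.
    pair_values = torch.einsum('jc,kc->jkc', v1, v2)       # (L, L, d_h)
    return torch.einsum('ijk,jkc->ic', attn, pair_values)  # (L, d_h)

# Toy usage: 8 residues, 16-dim embeddings.
if __name__ == "__main__":
    L, d, d_h = 8, 16, 16
    x = torch.randn(L, d)
    params = [torch.randn(d, d_h) * d ** -0.5 for _ in range(5)]
    print(triadic_attention(x, *params).shape)  # torch.Size([8, 16])
```

Note that the full (L, L, L) score tensor makes this sketch cubic in sequence length; a practical method would presumably factor or sparsify it, but that detail is not recoverable from the abstract shown here.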
