RMAAT: Astrocyte-Inspired Memory Compression and Replay for Efficient Long-Context Transformers
arXiv:2601.00426v1 Announce Type: cross Abstract: The quadratic complexity of the self-attention mechanism presents a significant impediment to applying Transformer models to long sequences. This work explores computational principles derived from astrocytes, glial cells critical for biological memory and synaptic modulation, as a complementary…
