A Comparative Analysis of Contextual Representation Flow in State-Space and Transformer Architectures
arXiv:2510.06640v2 | Announce Type: replace-cross

Abstract: State Space Models (SSMs) have recently emerged as efficient alternatives to Transformer-Based Models (TBMs) for long-sequence processing with linear scaling, yet how contextual information flows across layers in these architectures remains understudied. We present the…
