Task-Driven Kernel Flows: Label Rank Compression and Laplacian Spectral Filtering

2026-01-04 20:00 GMT

arXiv:2601.00276v1 Announce Type: new
Abstract: We present a theory of feature learning in wide L2-regularized networks showing that supervised learning is inherently compressive. We derive a kernel ODE that predicts a “water-filling” spectral evolution and prove that for any stable steady state, the kernel rank is bounded by the number of classes ($C$). We further demonstrate that SGD noise is similarly low-rank ($O(C)$), confining dynamics to the task-relevant subspace. This framework unifies the deterministic and stochastic views of alignment and contrasts the low-rank nature of supervised learning with the high-rank, expansive representations of self-supervision.
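The claim that the learning signal (and hence the SGD noise) is confined to an $O(C)$-dimensional subspace can be illustrated with a minimal numerical sketch. This is not the paper's derivation, just a toy consistency check under assumed linear-readout structure: for cross-entropy with a readout matrix $W$, every per-sample feature gradient is $W(p_i - y_i)$, so the stacked gradients lie in the column span of $W$ and have rank at most $C$ regardless of the feature dimension. All sizes and the random setup below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, C = 200, 64, 10  # samples, feature dim, number of classes (toy sizes)

W = rng.normal(size=(D, C))        # assumed linear readout: features -> logits
logits = rng.normal(size=(N, C))   # stand-in for h @ W at some training step
P = np.exp(logits)
P /= P.sum(axis=1, keepdims=True)  # softmax probabilities, shape (N, C)
Y = np.eye(C)[rng.integers(0, C, size=N)]  # one-hot labels, shape (N, C)

# Per-sample gradient of cross-entropy w.r.t. the features h_i:
#   dL/dh_i = W @ (p_i - y_i)
# Every such vector lies in the column span of W, so the stacked gradient
# matrix has rank at most C even though it is N x D.
G = (P - Y) @ W.T  # shape (N, D)

rank = np.linalg.matrix_rank(G)
print(rank)  # at most C, far below min(N, D)
```

The same argument applies to the per-sample gradient covariance driving SGD fluctuations: since each gradient sample lives in a $C$-dimensional subspace, the noise covariance over features has rank $O(C)$, matching the abstract's confinement claim.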