arXiv:2502.04832v2 Announce Type: replace-cross Abstract: The total memory capacity (MC) of linear recurrent neural networks (RNNs) has been proven to equal the rank of the corresponding Kalman controllability matrix, and it is almost surely maximal for connectivity and input weight matrices drawn from regular distributions. This fact calls into question the usefulness of MC as a metric for distinguishing the performance of linear RNNs in the processing of stochastic signals. This work shows that the MC of random nonlinear RNNs can take arbitrary values within established upper and lower bounds, depending exclusively on the scale of the input process. This confirms that the existing definition of MC, in both the linear and the nonlinear case, has no practical value.
Original: https://arxiv.org/abs/2502.04832
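
A minimal numerical sketch (not taken from the paper) may help illustrate the two claims above, under a few assumptions of my own: the usual delay-reconstruction definition of MC as a sum over delays of squared correlations between a delayed input and its best linear readout, a tanh nonlinearity for the nonlinear case, and i.i.d. uniform inputs. Part 1 checks that the Kalman controllability matrix [v, Wv, ..., W^{n-1} v] of a random linear RNN x_t = W x_{t-1} + v u_t almost surely has full rank; Part 2 estimates the MC of a tanh RNN with the same weights at several input scales, where the estimate typically varies with the scale.

import numpy as np

rng = np.random.default_rng(0)
n = 10  # state (reservoir) dimension

W = rng.normal(size=(n, n)) / np.sqrt(n)   # random connectivity matrix
v = rng.normal(size=n)                     # random input weight vector

# --- Part 1: rank of the Kalman controllability matrix --------------------
cols, c = [], v.copy()
for _ in range(n):
    cols.append(c)
    c = W @ c
K = np.column_stack(cols)                  # [v, Wv, ..., W^{n-1} v]
print("controllability rank:", np.linalg.matrix_rank(K), "of", n)

# --- Part 2: empirical MC of a tanh RNN vs. input scale -------------------
def estimate_mc(scale, T=20_000, max_delay=2 * n, washout=200):
    """Rough MC estimate: sum over delays k of the squared correlation between
    the delayed input u_{t-k} and its best linear reconstruction from x_t."""
    u = rng.uniform(-scale, scale, size=T)
    x = np.zeros(n)
    X = np.empty((T, n))
    for t in range(T):
        x = np.tanh(W @ x + v * u[t])
        X[t] = x
    X, u = X[washout:], u[washout:]        # discard transient
    mc = 0.0
    for k in range(1, max_delay + 1):
        A, y = X[k:], u[:-k]               # state at t vs. input at t - k
        w, *_ = np.linalg.lstsq(A, y, rcond=None)   # linear readout for delay k
        r = np.corrcoef(A @ w, y)[0, 1]
        mc += r ** 2
    return mc

for scale in (1e-3, 1e-1, 1.0, 10.0):
    print(f"input scale {scale:g}: estimated MC ~ {estimate_mc(scale):.2f}")

Two caveats on this sketch: the controllability matrix is full rank in exact arithmetic but numerically ill-conditioned, so the rank check is only reliable for modest n; and the MC estimate uses plain least squares on a long run, so it is a rough empirical proxy rather than the exact quantity analyzed in the paper.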