N-ReLU: Zero-Mean Stochastic Extension of ReLU
arXiv:2511.07559v1

Abstract: Activation functions are fundamental for enabling nonlinear representations in deep neural networks. However, the standard rectified linear unit (ReLU) often suffers from inactive or "dead" neurons caused by its hard zero cutoff. To address this…
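The truncated abstract gives only the goal: replace ReLU's hard zero with a zero-mean stochastic signal so that inactive units are not permanently silent. Below is a minimal PyTorch sketch under that assumption; the class name NReLU, the Gaussian noise form, and the sigma hyperparameter are illustrative guesses, not details confirmed by the source.

    import torch
    import torch.nn as nn

    class NReLU(nn.Module):
        # Hypothetical sketch: the truncated abstract does not give the
        # exact formula. Assumption: during training, inputs in ReLU's
        # inactive region (x <= 0) emit zero-mean Gaussian noise instead
        # of a hard zero; at evaluation time it reduces to standard ReLU.
        def __init__(self, sigma: float = 0.1):
            super().__init__()
            self.sigma = sigma  # assumed noise scale (hypothetical)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            if self.training:
                noise = torch.randn_like(x) * self.sigma  # E[noise] = 0
                return torch.where(x > 0, x, noise)
            return torch.relu(x)  # deterministic at inference

In this sketch the noise branch carries no gradient with respect to x; whether the paper handles gradients in the inactive region differently cannot be determined from the truncated abstract.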
