Enhancing DPSGD via Per-Sample Momentum and Low-Pass Filtering
arXiv:2511.08841v1 · Announce Type: new

Abstract: Differentially Private Stochastic Gradient Descent (DPSGD) is widely used to train deep neural networks with formal privacy guarantees. However, the addition of differential privacy (DP) often degrades model accuracy by introducing both noise and bias.…
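For context on the noise and bias the abstract refers to, here is a minimal sketch of a single step of standard DPSGD (per-sample gradient clipping followed by Gaussian noise), not the paper's per-sample-momentum or low-pass-filtering variant; the function name, hyperparameter defaults, and use of NumPy are illustrative assumptions:

```python
import numpy as np

def dpsgd_step(params, per_sample_grads, lr=0.1, clip_norm=1.0,
               noise_multiplier=1.0, rng=None):
    """One illustrative DPSGD step (assumed sketch, not the paper's method):
    clip each per-sample gradient to clip_norm (this introduces bias),
    sum, add Gaussian noise scaled to clip_norm (this introduces noise),
    then average and take a gradient step."""
    rng = rng or np.random.default_rng(0)
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    n = len(per_sample_grads)
    noisy_mean = (np.sum(clipped, axis=0)
                  + rng.normal(0.0, noise_multiplier * clip_norm,
                               size=params.shape)) / n
    return params - lr * noisy_mean
```

Setting `noise_multiplier=0.0` isolates the clipping bias: the update is then the average of the clipped (not the true) gradients.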
