Classical and Quantum Speedups for Non-Convex Optimization via Energy Conserving Descent

2026-04-14 19:00 GMT

arXiv:2604.13022v1 Announce Type: cross
Abstract: The Energy Conserving Descent (ECD) algorithm was recently proposed (De Luca & Silverstein, 2022) as a global non-convex optimization method. Unlike gradient descent, appropriately configured ECD dynamics escape strict local minima and converge to a global minimum, making it appealing for machine learning optimization.
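As a rough illustration of the escape mechanism, the sketch below contrasts gradient descent with energy-conserving Hamiltonian dynamics on a toy double well. It uses the generic separable Hamiltonian H(q, p) = p^2/2 + f(q) and a leapfrog integrator as stand-ins; the actual ECD Hamiltonian of De Luca & Silverstein has a different form, and the objective, step size, and initial energy here are all illustrative choices, not taken from the paper.

```python
def f(q):
    # Illustrative positive double well: local minimum near q = -0.96,
    # global minimum near q = +1.04 (the -0.3*q tilt breaks the symmetry).
    return (q**2 - 1.0)**2 - 0.3 * q + 0.4

def grad_f(q):
    return 4.0 * q * (q**2 - 1.0) - 0.3

# Gradient descent started in the left basin stalls at the local minimum.
q = -1.2
for _ in range(20000):
    q -= 1e-3 * grad_f(q)
print(f"gradient descent settles at q = {q:.3f}")  # ~ -0.96, the local minimum

# Energy-conserving dynamics for the toy Hamiltonian H = p^2/2 + f(q),
# integrated with leapfrog. The conserved energy exceeds the barrier top
# (~1.41), so the trajectory crosses the barrier instead of stopping.
# (The true ECD Hamiltonian differs; this is only a toy illustration.)
q, p, dt = -1.2, 1.6, 1e-3
print(f"conserved energy E = {0.5 * p**2 + f(q):.2f} (barrier top ~ 1.41)")
p -= 0.5 * dt * grad_f(q)            # leapfrog half kick
for step in range(200000):
    q += dt * p                      # drift
    p -= dt * grad_f(q)              # kick
    if abs(q - 1.04) < 0.05:         # reached the global-minimum region
        print(f"barrier crossed after {step} steps, q = {q:.3f}")
        break
```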
We present the first analytical study of ECD, focusing in this first installment on the one-dimensional setting. We formalize a stochastic ECD dynamics (sECD) with energy-preserving noise, as well as a quantum analog of the ECD Hamiltonian (qECD), which provides the foundation for a quantum algorithm via Hamiltonian simulation.
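For a feel of what Hamiltonian simulation of a 1D Hamiltonian involves, the sketch below classically emulates the Schrödinger evolution e^{-iHt} by second-order Trotter (split-step Fourier) propagation on a grid. It uses the generic H = p^2/2 + f(x), since the abstract does not specify the form of the qECD Hamiltonian; the grid size, time step, and initial wave packet are illustrative assumptions.

```python
import numpy as np

# Grid discretization of a 1D wavefunction (illustrative parameters).
N, L = 1024, 12.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)   # angular momentum grid

def f(xv):
    # Same illustrative positive double well as above.
    return (xv**2 - 1.0)**2 - 0.3 * xv + 0.4

# Second-order Trotter (split-step Fourier) propagator for H = p^2/2 + f(x):
# half potential kick, full kinetic step in Fourier space, half potential kick.
dt, steps = 0.005, 4000
half_V = np.exp(-0.5j * dt * f(x))
kin = np.exp(-0.5j * dt * k**2)

# Gaussian wave packet started in the local-minimum well near x = -1.
psi = np.exp(-(x + 1.0)**2 / 0.2).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * (L / N))

for _ in range(steps):
    psi = half_V * np.fft.ifft(kin * np.fft.fft(half_V * psi))

prob = np.abs(psi)**2 * (L / N)
# Probability weight that has moved into the global-minimum well (x > 0).
print("P(x > 0) =", float(np.sum(prob[x > 0])))
```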
For positive double-well objectives, we compute the expected hitting time from a local minimum to the global minimum. We prove that both sECD and qECD yield an exponential speedup over their respective gradient-descent baselines: stochastic gradient descent and its quantization. For objectives with tall barriers, qECD achieves a further speedup over sECD.
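To make the hitting-time quantity concrete, here is a Monte Carlo estimate of the expected hitting time for a Langevin-type noisy gradient iteration on the same toy double well, started at the local minimum. The noise scale, target window, and trial count are illustrative assumptions, and this sketches only a generic stochastic-gradient comparator, not sECD itself, whose energy-preserving noise is specific to the paper.

```python
import numpy as np

def grad_f(q):
    # Gradient of the toy double well f(q) = (q^2 - 1)^2 - 0.3*q + 0.4.
    return 4.0 * q * (q**2 - 1.0) - 0.3

rng = np.random.default_rng(0)
eta, sigma = 1e-3, 0.6            # step size and noise scale (illustrative)
q_local, q_global = -0.96, 1.04   # approximate minima of the toy objective
max_steps, trials = 1_000_000, 10

times = []
for _ in range(trials):
    q = q_local
    for t in range(max_steps):
        # Langevin-type noisy gradient step (stand-in for an SGD baseline).
        q += -eta * grad_f(q) + np.sqrt(eta) * sigma * rng.standard_normal()
        if abs(q - q_global) < 0.05:   # first entry into the global-minimum region
            times.append(t)
            break

if times:
    print(f"mean hitting time: {np.mean(times):.0f} steps over {len(times)} trials")
```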