
Stochastic noise can be helpful for variational quantum algorithms

Authors:
Liu, Junyu
Wilde, Frederik
Mele, Antonio Anna
Jiang, Liang
Eisert, Jens
Publication Year:
2022

Abstract

Saddle points constitute a crucial challenge for first-order gradient descent algorithms. In classical machine learning, they are avoided, for example, by means of stochastic gradient descent methods. In this work, we provide evidence that the saddle point problem can be naturally avoided in variational quantum algorithms by exploiting the presence of stochasticity. We prove convergence guarantees and present practical examples in numerical simulations and on quantum hardware. We argue that the natural stochasticity of variational algorithms can be beneficial for avoiding strict saddle points, i.e., those saddle points with at least one negative Hessian eigenvalue. The insight that moderate levels of shot noise can be helpful is expected to add a new perspective to near-term variational quantum algorithms.

Comment: 16 pages, 14 figures, presentation improved, proofs extended
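The mechanism the abstract describes can be illustrated with a minimal classical sketch (this is not the paper's algorithm; the test function, step size, and Gaussian noise scale are all illustrative assumptions). On f(x, y) = x^2 - y^2 the origin is a strict saddle (Hessian eigenvalues +2 and -2). Exact gradient descent started on the stable manifold y = 0 converges straight to the saddle, while the same update with additive noise on the gradient, a stand-in for shot noise, drifts off the manifold and escapes along the negative-curvature direction:

```python
import numpy as np

# f(x, y) = x**2 - y**2 has a strict saddle at the origin:
# Hessian eigenvalues are +2 (along x) and -2 (along y).
def grad(p):
    x, y = p
    return np.array([2.0 * x, -2.0 * y])

def descend(p0, eta=0.1, steps=200, noise=0.0, seed=0):
    """Gradient descent; if noise > 0, add Gaussian noise to each
    gradient evaluation as a crude proxy for shot noise."""
    rng = np.random.default_rng(seed)
    p = np.array(p0, dtype=float)
    for _ in range(steps):
        g = grad(p)
        if noise > 0.0:
            g = g + rng.normal(scale=noise, size=2)
        p = p - eta * g
    return p

# Start exactly on the stable manifold (y = 0): the exact-gradient run
# converges to the saddle, the noisy run escapes along -y**2.
p_plain = descend([1.0, 0.0])
p_noisy = descend([1.0, 0.0], noise=0.05)
print("plain distance to saddle:", np.linalg.norm(p_plain))
print("noisy |y| (escape coordinate):", abs(p_noisy[1]))
```

Any nonzero noise kick off the y = 0 manifold is amplified by the factor (1 + 2*eta) per step, which is the sense in which stochasticity helps first-order methods avoid strict saddles.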

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2210.06723
Document Type:
Working Paper