
Stochastic Cubic Regularization for Fast Nonconvex Optimization

Authors:
Tripuraneni, Nilesh
Stern, Mitchell
Jin, Chi
Regier, Jeffrey
Jordan, Michael I.
Publication Year:
2017

Abstract

This paper proposes a stochastic variant of a classic algorithm, the cubic-regularized Newton method [Nesterov and Polyak 2006]. The proposed algorithm efficiently escapes saddle points and finds approximate local minima for general smooth, nonconvex functions in only $\mathcal{\tilde{O}}(\epsilon^{-3.5})$ stochastic gradient and stochastic Hessian-vector product evaluations, the latter of which can be computed as efficiently as stochastic gradients. This improves upon the $\mathcal{\tilde{O}}(\epsilon^{-4})$ rate of stochastic gradient descent. Our rate matches the best-known result for finding local minima without requiring any delicate acceleration or variance-reduction techniques.

Comment: The first two authors contributed equally.
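For intuition, below is a minimal sketch (Python/NumPy) of one deterministic cubic-regularized Newton step that accesses the Hessian only through Hessian-vector products, in line with the oracle model the abstract describes. It is an illustrative toy under assumed parameter choices (the regularization parameter rho, the inner gradient-descent subproblem solver, and its step size), not the paper's stochastic algorithm, which would replace the exact gradient and Hessian-vector product with mini-batch estimates and use its own subproblem solver and stopping rules.

import numpy as np

def cubic_step(grad, hvp, dim, rho, inner_steps=500, lr=0.1):
    # Approximately minimize the cubic model
    #   m(s) = grad^T s + 0.5 * s^T H s + (rho / 3) * ||s||^3
    # by gradient descent on s, touching H only through hvp(s) = H @ s.
    s = np.zeros(dim)
    for _ in range(inner_steps):
        model_grad = grad + hvp(s) + rho * np.linalg.norm(s) * s
        s -= lr * model_grad
    return s

# Toy problem: f(x) = 0.5 * x^T A x has a saddle point at the origin,
# since A has one negative eigenvalue.
A = np.diag([1.0, -0.5])
x = np.array([1e-3, 1e-3])   # start very close to the saddle
rho = 1.0                    # assumed cubic-regularization parameter

hvp = lambda v: A @ v        # Hessian-vector product oracle (exact here; stochastic in the paper)
for _ in range(10):
    g = A @ x                # a stochastic variant would use a mini-batch gradient estimate
    x = x + cubic_step(g, hvp, dim=2, rho=rho)

print(x)  # the first coordinate shrinks toward 0 while the second grows:
          # the iterate escapes the saddle along the negative-curvature direction

The cubic term keeps each step bounded, so the method can exploit negative curvature to move away from saddle points rather than stalling near them, which is the escaping behavior the abstract refers to.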

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1711.02838
Document Type:
Working Paper