
ACCELERATED STOCHASTIC ALGORITHMS FOR NONCONVEX FINITE-SUM AND MULTIBLOCK OPTIMIZATION.

Authors :
GUANGHUI LAN
YU YANG
Source :
SIAM Journal on Optimization. 2019, Vol. 29 Issue 4, p2753-2784. 32p.
Publication Year :
2019

Abstract

In this paper, we present new stochastic methods for solving two important classes of nonconvex optimization problems. We first introduce a randomized accelerated proximal gradient (RapGrad) method for solving a class of nonconvex optimization problems whose objective function consists of the summation of m components, and show that it can significantly reduce the number of gradient computations, especially when the condition number L/μ (i.e., the ratio between the Lipschitz constant L and the negative-curvature parameter μ) is large. More specifically, RapGrad can require up to O(√m) fewer gradient computations than existing batch nonconvex accelerated gradient methods. Moreover, the number of gradient computations required by RapGrad can be O(m^(1/6) L^(1/2)/μ^(1/2)) (at least O(m^(2/3))) times smaller than that of the best-known randomized nonconvex gradient methods when L/μ ≥ m. Inspired by RapGrad, we also develop a new randomized accelerated proximal dual (RapDual) method for solving a class of multiblock nonconvex optimization problems with linear coupling constraints and certain special structural properties. We demonstrate that RapDual can also save up to a factor of O(√m) block updates over its batch counterpart, where m denotes the number of blocks. To the best of our knowledge, all these complexity results associated with RapGrad and RapDual appear to be new in the literature. We also illustrate the potential advantages of these algorithms through preliminary numerical experiments. [ABSTRACT FROM AUTHOR]
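
For readers unfamiliar with the finite-sum setting the abstract refers to, the following is a minimal illustrative sketch of a generic randomized (stochastic) proximal gradient iteration on an objective of the form f(x) = (1/m) Σ_i f_i(x) plus a simple regularizer. It is not the RapGrad or RapDual method of the paper, whose acceleration and variance-reduction details are only available in the full text; the data, components f_i, step size, and regularizer below are all hypothetical placeholders chosen for simplicity (the paper's setting additionally allows each f_i to be nonconvex with bounded negative curvature).

```python
# Illustrative sketch of a generic randomized proximal gradient step on a
# finite-sum objective f(x) = (1/m) * sum_i f_i(x) + lam * ||x||_1.
# NOT the RapGrad/RapDual method of Lan and Yang; all quantities below
# (components f_i, step size, regularizer) are hypothetical examples.
import numpy as np

rng = np.random.default_rng(0)
m, d = 100, 10                       # number of components, dimension
A = rng.standard_normal((m, d))      # per-component data (hypothetical)
b = rng.standard_normal(m)
lam = 0.1                            # l1 regularization weight (hypothetical)

def grad_i(x, i):
    """Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2."""
    return (A[i] @ x - b[i]) * A[i]

def prox_l1(x, t):
    """Proximal operator of t * lam * ||x||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)

x = np.zeros(d)
step = 0.01                          # constant step size (hypothetical)
for k in range(5000):
    i = rng.integers(m)              # sample one component uniformly at random
    x = prox_l1(x - step * grad_i(x, i), step)   # stochastic prox-gradient step

print("final objective:",
      0.5 * np.mean((A @ x - b) ** 2) + lam * np.abs(x).sum())
```

Each iteration touches only one randomly sampled component gradient instead of all m, which is the source of the per-iteration savings that randomized finite-sum methods such as RapGrad exploit; the paper's contribution lies in combining such sampling with acceleration to obtain the improved complexity bounds quoted above.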

Details

Language :
English
ISSN :
1052-6234
Volume :
29
Issue :
4
Database :
Academic Search Index
Journal :
SIAM Journal on Optimization
Publication Type :
Academic Journal
Accession Number :
144836802
Full Text :
https://doi.org/10.1137/18M1192536