
Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization.

Authors :
Ghadimi, Saeed
Lan, Guanghui
Zhang, Hongchao
Source :
Mathematical Programming. Jan2016, Vol. 155 Issue 1/2, p267-305. 39p.
Publication Year :
2016

Abstract

This paper considers a class of constrained stochastic composite optimization problems whose objective function is given by the sum of a differentiable (possibly nonconvex) component and a certain non-differentiable (but convex) component. To solve these problems, we propose a randomized stochastic projected gradient (RSPG) algorithm, in which a proper mini-batch of samples is taken at each iteration, depending on the total budget of stochastic samples allowed. The RSPG algorithm also employs a general distance function to take advantage of the geometry of the feasible region. The complexity of this algorithm is established in a unified setting, which shows that it achieves nearly optimal complexity for convex stochastic programming. A post-optimization phase is also proposed to significantly reduce the variance of the solutions returned by the algorithm. In addition, based on the RSPG algorithm, a stochastic gradient-free algorithm, which uses only stochastic zeroth-order information, is also discussed. Some preliminary numerical results are provided. [ABSTRACT FROM AUTHOR]
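To make the setting concrete, the sketch below illustrates the kind of iteration the abstract describes: a mini-batch stochastic gradient step for the smooth component, followed by a proximal (projection-like) step for the convex nonsmooth component. The test problem (noisy least squares plus an l1 term), all parameter values, and the use of the Euclidean prox are illustrative assumptions; the paper's actual RSPG algorithm uses a general distance function and a randomized choice of the output iterate, neither of which is reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative composite problem (an assumption, not from the paper):
#   minimize f(x) + h(x), where f(x) = E[0.5 * ||a_i' x - b_i||^2]
#   is smooth and accessed only through sampled mini-batches, and
#   h(x) = lam * ||x||_1 is convex but non-differentiable.
d = 5
x_true = np.zeros(d)
x_true[:2] = [1.0, -2.0]
A = rng.standard_normal((200, d))
b = A @ x_true + 0.1 * rng.standard_normal(200)
lam = 0.1

def stochastic_grad(x, batch):
    """Mini-batch estimate of grad f(x) from the sampled row indices."""
    Ab = A[batch]
    return Ab.T @ (Ab @ x - b[batch]) / len(batch)

def prox_l1(v, t):
    """Soft-thresholding: prox of t * lam * ||.||_1 under the
    Euclidean distance (a special case of the general distance
    functions the paper allows)."""
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

def mini_batch_prox_sgd(n_iters=400, batch_size=20, step=0.01):
    """Plain mini-batch proximal stochastic gradient iteration.
    For brevity this returns the last iterate; RSPG instead returns
    a randomly selected iterate, which is key to its analysis."""
    x = np.zeros(d)
    for _ in range(n_iters):
        batch = rng.choice(len(b), size=batch_size, replace=False)
        g = stochastic_grad(x, batch)
        x = prox_l1(x - step * g, step)
    return x

x_hat = mini_batch_prox_sgd()
```

With these (assumed) settings the iterates approach the planted sparse solution; the mini-batch size and step size here are arbitrary, whereas the paper derives them from the total sampling budget.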

Details

Language :
English
ISSN :
00255610
Volume :
155
Issue :
1/2
Database :
Academic Search Index
Journal :
Mathematical Programming
Publication Type :
Academic Journal
Accession number :
112082281
Full Text :
https://doi.org/10.1007/s10107-014-0846-1