
Universal gradient methods for convex optimization problems.

Authors :
Nesterov, Yu.
Source :
Mathematical Programming. Aug 2015, Vol. 152, Issue 1/2, p381-404. 24p.
Publication Year :
2015

Abstract

In this paper, we present new methods for black-box convex minimization. They do not need to know in advance the actual level of smoothness of the objective function. Their only essential input parameter is the required accuracy of the solution. At the same time, for each particular problem class they automatically ensure the best possible rate of convergence. We confirm our theoretical results with encouraging numerical experiments, which demonstrate that the fast rate of convergence typical of smooth optimization problems can sometimes be achieved even on nonsmooth problem instances. [ABSTRACT FROM AUTHOR]
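The adaptivity described in the abstract comes from a backtracking search on a trial smoothness constant, with the target accuracy entering the step-acceptance test so that no Lipschitz constant needs to be known in advance. Below is a minimal Python sketch of that idea for an unconstrained problem with a Euclidean gradient step; the function names, defaults, and the halving rule are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def universal_gradient_sketch(f, grad, x0, eps, L0=1.0, max_iter=500):
    """Gradient method that backtracks on a trial constant L, accepting a step
    when the quadratic model with an eps/2 slack upper-bounds f.  Because of the
    slack, the test eventually holds even for nonsmooth convex f, so no
    smoothness level is required as input.  Simplified, unconstrained sketch."""
    x = np.asarray(x0, dtype=float)
    L = L0
    for _ in range(max_iter):
        fx, g = f(x), grad(x)
        while True:
            x_new = x - g / L                     # gradient step with trial constant L
            d = x_new - x
            # Accept if the quadratic model (with eps/2 slack) dominates f at x_new
            if f(x_new) <= fx + g @ d + 0.5 * L * (d @ d) + 0.5 * eps:
                break
            L *= 2.0                              # model too optimistic: increase L
        x, L = x_new, L / 2.0                     # halve L so it can also adapt downward
    return x

# Illustrative use on a partly nonsmooth function, |x1| + x2**2,
# feeding a subgradient in place of the gradient.
if __name__ == "__main__":
    f = lambda x: abs(x[0]) + x[1] ** 2
    grad = lambda x: np.array([np.sign(x[0]), 2.0 * x[1]])
    print(universal_gradient_sketch(f, grad, np.array([3.0, -2.0]), eps=1e-3))
```

The key design point this sketch tries to convey is that the accuracy parameter eps appears directly in the acceptance inequality: for smooth functions the backtracking settles near the true Lipschitz constant and the method behaves like an ordinary gradient scheme, while for nonsmooth functions the eps/2 slack keeps the search finite at the cost of a larger effective L.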

Details

Language :
English
ISSN :
0025-5610
Volume :
152
Issue :
1/2
Database :
Academic Search Index
Journal :
Mathematical Programming
Publication Type :
Academic Journal
Accession number :
108355409
Full Text :
https://doi.org/10.1007/s10107-014-0790-0