Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions.
- Source :
- Mathematical Programming; May 2021, Vol. 187, Issue 1/2, p151-193, 43p
- Publication Year :
- 2021
Abstract
- In this paper we study the convergence properties of a family of Nesterov-type inertial schemes, a specific case of the inertial gradient descent algorithm, for smooth convex minimization problems, under additional hypotheses on the local geometry of the objective function F, such as the growth (or Łojasiewicz) condition. In particular, we study the convergence rates of the objective function and of the local variation under these geometric conditions. In this setting we can give optimal convergence rates for this Nesterov scheme. Our analysis shows that in some situations Nesterov's family of inertial schemes is asymptotically less efficient than gradient descent (e.g., when the objective function is quadratic). [ABSTRACT FROM AUTHOR]
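- The comparison in the abstract can be illustrated with a minimal sketch (not the paper's exact scheme or parameters): plain gradient descent versus a standard Nesterov-type inertial scheme, x_{k+1} = y_k - s∇F(y_k) with y_k = x_k + (k-1)/(k+α-1)(x_k - x_{k-1}), applied to a simple quadratic F(x) = ½xᵀAx. The choices of A, step size, and α below are illustrative assumptions.

```python
import numpy as np

def grad_descent(grad, x0, step, iters):
    """Plain gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0.copy()
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def nesterov(grad, x0, step, iters, alpha=3.0):
    """Nesterov-type inertial scheme with vanishing momentum (k-1)/(k+alpha-1)."""
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, iters + 1):
        y = x + (k - 1) / (k + alpha - 1) * (x - x_prev)  # extrapolation step
        x_prev, x = x, y - step * grad(y)                  # gradient step at y
    return x

# Illustrative quadratic objective F(x) = 0.5 * x^T A x (assumed example).
A = np.diag([1.0, 10.0])
F = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x0 = np.array([1.0, 1.0])
step = 1.0 / 10.0  # 1/L, with L the largest eigenvalue of A

print("gradient descent:", F(grad_descent(grad, x0, step, 200)))
print("Nesterov scheme: ", F(nesterov(grad, x0, step, 200)))
```

On quadratics like this, gradient descent contracts linearly, which is the regime where the paper finds the inertial scheme asymptotically less efficient.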
- Subjects :
- GENEALOGY
ALGORITHMS
GEOMETRY
HYPOTHESIS
NONSMOOTH optimization
Details
- Language :
- English
- ISSN :
- 0025-5610
- Volume :
- 187
- Issue :
- 1/2
- Database :
- Complementary Index
- Journal :
- Mathematical Programming
- Publication Type :
- Academic Journal
- Accession number :
- 149905468
- Full Text :
- https://doi.org/10.1007/s10107-020-01476-3