
Conditions for linear convergence of the gradient method for non-convex optimization

Authors:
Abbaszadehpeivasti, Hadi
de Klerk, Etienne
Zamani, Moslem
Publication Year:
2022

Abstract

In this paper, we derive a new linear convergence rate for the gradient method with fixed step lengths applied to non-convex smooth optimization problems that satisfy the Polyak-Łojasiewicz (PL) inequality. We establish that the PL inequality is a necessary and sufficient condition for linear convergence to the optimal value in this class of problems. We also list some related classes of functions for which the gradient method may enjoy a linear convergence rate, and investigate their relationship with the PL inequality.
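As a minimal illustrative sketch (not the paper's construction), the Python snippet below runs the gradient method with fixed step length 1/L on f(x) = x^2 + 3 sin^2(x), a standard example of a smooth non-convex function commonly cited as satisfying the PL inequality (1/2)|f'(x)|^2 >= mu (f(x) - f*). Under that inequality, the classical guarantee is f(x_k) - f* <= (1 - mu/L)^k (f(x_0) - f*), i.e. the optimality gap decays geometrically; the function, starting point, and constants here are assumptions for the demo, not taken from the paper.

import numpy as np

# Illustrative non-convex PL example: f(x) = x^2 + 3 sin^2(x).
# f''(x) = 2 + 6 cos(2x) lies in [-4, 8], so f is non-convex and
# L-smooth with L = 8; the global minimum is f* = 0 at x = 0.

def f(x):
    return x**2 + 3 * np.sin(x)**2

def grad_f(x):
    return 2 * x + 3 * np.sin(2 * x)

L = 8.0   # smoothness constant
x = 3.0   # arbitrary starting point (an assumption for the demo)
for k in range(41):
    x -= grad_f(x) / L          # fixed step length 1/L
    if k % 10 == 0:
        print(f"k = {k:2d}   f(x) - f* = {f(x):.3e}")
# The printed gap shrinks by a roughly constant factor per block of
# iterations: linear convergence, as the PL inequality guarantees.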

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2204.00647
Document Type:
Working Paper