
Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation

Authors:
de Klerk, Etienne
Glineur, François
Taylor, Adrien
Publication Year:
2017

Abstract

We provide new tools for worst-case performance analysis of the gradient (or steepest descent) method of Cauchy for smooth strongly convex functions, and Newton's method for self-concordant functions, including the case of inexact search directions. The analysis uses semidefinite programming performance estimation, as pioneered by Drori and Teboulle [Mathematical Programming, 145(1-2):451-482, 2014], and extends recent performance estimation results for the method of Cauchy by the authors [Optimization Letters, 11(7):1185-1199, 2017]. To illustrate the applicability of the tools, we demonstrate a novel complexity analysis of short-step interior point methods using inexact search directions. As an example in this framework, we sketch how to give a rigorous worst-case complexity analysis of a recent interior point method by Abernethy and Hazan [PMLR, 48:2520-2528, 2016].

Comment: 22 pages, 1 figure. Title of earlier version was "Worst-case convergence analysis of gradient and Newton methods through semidefinite programming performance estimation"
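To make the performance-estimation idea concrete, the following is a minimal sketch (my own illustration, not the authors' code) of such a semidefinite program for a single exact gradient step x1 = x0 - gamma*grad f(x0) on an L-smooth, mu-strongly convex function, written with the cvxpy modelling package. All numerical values (L, mu, gamma, R) and the solver choice are illustrative assumptions; the inexact-direction and Newton analyses in the paper would add further constraints to the same template.

import numpy as np
import cvxpy as cp

# Assumed problem data (illustrative only): smoothness L, strong convexity mu < L,
# step size gamma, initial distance bound R.
L, mu, gamma, R = 1.0, 0.1, 1.0, 1.0

# Gram-matrix formulation: basis vectors are [x0 - x*, g0, g1].
G = cp.Variable((3, 3), PSD=True)   # Gram matrix of the basis vectors
F = cp.Variable(2)                  # function values [f(x0) - f*, f(x1) - f*]

# Coordinates of each point (x, g, f) in the chosen basis; x* is the origin,
# g* = 0 and f* = 0 by optimality.
pts = {
    "star": (np.zeros(3), np.zeros(3), 0.0),
    "x0":   (np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]), F[0]),
    "x1":   (np.array([1.0, -gamma, 0.0]), np.array([0.0, 0.0, 1.0]), F[1]),
}

def ip(a, b):
    # Inner product <a, b> expressed as a linear function of the Gram matrix G.
    return cp.sum(cp.multiply(np.outer(a, b), G))

# Interpolation conditions for L-smooth, mu-strongly convex functions
# (Taylor-Hendrickx-Glineur), imposed for every ordered pair of points.
constraints = []
for i, (xi, gi, fi) in pts.items():
    for j, (xj, gj, fj) in pts.items():
        if i == j:
            continue
        constraints.append(
            fi - fj - ip(gj, xi - xj)
            >= 1.0 / (2.0 * (1.0 - mu / L)) * (
                (1.0 / L) * ip(gi - gj, gi - gj)
                + mu * ip(xi - xj, xi - xj)
                - 2.0 * (mu / L) * ip(gi - gj, xi - xj)
            )
        )

# Initial condition ||x0 - x*||^2 <= R^2.
constraints.append(ip(pts["x0"][0], pts["x0"][0]) <= R ** 2)

# Maximizing f(x1) - f* over all admissible Gram matrices and function values
# gives the worst case of the method over the whole function class.
prob = cp.Problem(cp.Maximize(F[1]), constraints)
prob.solve()
print("worst-case f(x1) - f*:", prob.value)

Solving this small SDP returns an upper bound on f(x1) - f* that holds for every L-smooth, mu-strongly convex function started within distance R of its minimizer; modelling inexact search directions, as analysed in the paper, amounts to adding constraints that relate the actual step to the exact gradient rather than fixing x1 = x0 - gamma*g0.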

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1709.05191
Document Type:
Working Paper