
Gradient regularization of Newton method with Bregman distances.

Authors :
Doikov, Nikita
Nesterov, Yurii
Source :
Mathematical Programming. Mar 2024, Vol. 204, Issue 1/2, p1-25. 25p.
Publication Year :
2024

Abstract

In this paper, we propose the first second-order scheme based on arbitrary non-Euclidean norms, incorporated by Bregman distances. These are introduced directly into the Newton iterate, with a regularization parameter proportional to the square root of the norm of the current gradient. For the basic scheme, as applied to the composite convex optimization problem, we establish a global convergence rate of the order O(k^{-2}) both in terms of the functional residual and in the norm of subgradients. Our main assumption on the smooth part of the objective is Lipschitz continuity of its Hessian. For uniformly convex functions of degree three, we justify a global linear rate, and for strongly convex functions we prove a local superlinear rate of convergence. Our approach can be seen as a relaxation of the Cubic Regularization of the Newton method (Nesterov and Polyak in Math Program 108(1):177–205, 2006) for convex minimization problems. This relaxation preserves the convergence properties and global complexities of the Cubic Newton method in the convex case, while the auxiliary subproblem at each iteration is simpler. We equip our method with an adaptive search procedure for choosing the regularization parameter. We also propose an accelerated scheme with convergence rate O(k^{-3}), where k is the iteration counter. [ABSTRACT FROM AUTHOR]
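To illustrate the core idea of the abstract, here is a minimal sketch of a gradient-regularized Newton step in the simplest Euclidean special case, where the Bregman distance reduces to (1/2)||.||_2^2 and the regularizer becomes a multiple of the identity. The function name grad_reg_newton, the constant H (standing in for the Hessian Lipschitz constant), and the toy objective are illustrative assumptions; the paper's full method handles general Bregman distances, a composite term, and an adaptive choice of the regularization parameter, all of which are omitted here.

```python
import numpy as np

def grad_reg_newton(f_grad, f_hess, x0, H=1.0, max_iter=500, tol=1e-8):
    """Sketch of a gradient-regularized Newton method (Euclidean case only).

    At each step the Newton system is regularized by lambda_k * I, with
    lambda_k proportional to the square root of the current gradient norm,
    as described in the abstract. Not the authors' full Bregman scheme.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        g = f_grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        lam = np.sqrt(H * gnorm)              # regularization ~ sqrt(||grad||)
        A = f_hess(x) + lam * np.eye(x.size)  # regularized Hessian, positive definite
        x = x - np.linalg.solve(A, g)         # regularized Newton step
    return x

# Toy run: minimize f(x) = sum(x_i^4) / 4, a convex function whose Hessian
# is Lipschitz continuous on bounded sets.
x_star = grad_reg_newton(
    f_grad=lambda x: x**3,
    f_hess=lambda x: np.diag(3.0 * x**2),
    x0=np.array([2.0, -1.5]),
)
print(x_star)  # approaches the minimizer at the origin
```

Compared with Cubic Regularization, whose subproblem requires minimizing a model with a cubic term, each iteration here only solves one linear system with a shifted Hessian, which is the simpler auxiliary subproblem the abstract refers to.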

Details

Language :
English
ISSN :
0025-5610
Volume :
204
Issue :
1/2
Database :
Academic Search Index
Journal :
Mathematical Programming
Publication Type :
Academic Journal
Accession number :
175451824
Full Text :
https://doi.org/10.1007/s10107-023-01943-7