
Fast Augmented Lagrangian Method in the convex regime with convergence guarantees for the iterates.

Authors :
Boţ, Radu Ioan
Csetnek, Ernö Robert
Nguyen, Dang-Khoa
Source :
Mathematical Programming. Jun 2023, Vol. 200, Issue 1, p. 147-197. 51 p.
Publication Year :
2023

Abstract

This work aims to minimize a continuously differentiable convex function with Lipschitz continuous gradient under linear equality constraints. The proposed inertial algorithm results from the discretization of the second-order primal-dual dynamical system with asymptotically vanishing damping term addressed by Boţ and Nguyen (J. Differential Equations 303:369–406, 2021), and it is formulated in terms of the Augmented Lagrangian associated with the minimization problem. The general setting we consider for the inertial parameters covers the three classical rules by Nesterov, Chambolle–Dossal and Attouch–Cabot used in the literature to formulate fast gradient methods. For these rules, we obtain in the convex regime convergence rates of order O(1/k^2) for the primal-dual gap, the feasibility measure, and the objective function value. In addition, we prove that the generated sequence of primal-dual iterates converges to a primal-dual solution in a general setting that covers the latter two rules. This is the first result which provides the convergence of the sequence of iterates generated by a fast algorithm for linearly constrained convex optimization problems without additional assumptions such as strong convexity. We also emphasize that all convergence results of this paper are compatible with the ones obtained in Boţ and Nguyen (J. Differential Equations 303:369–406, 2021) in the continuous setting. [ABSTRACT FROM AUTHOR]
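For orientation, the problem class and the Augmented Lagrangian mentioned in the abstract can be illustrated with a small sketch. The Python snippet below implements a generic inertial (Nesterov-type) augmented-Lagrangian iteration for minimizing f(x) subject to Ax = b; it is only an illustrative sketch of this family of methods, with assumed step-size and inertial-parameter choices (the function name, parameters, and the rule gamma_k = (k-1)/(k+alpha-1) are illustrative), and it does not reproduce the exact primal-dual updates or parameter conditions analyzed in the paper.

```python
import numpy as np

def inertial_alm(grad_f, A, b, x0, lam0, rho=1.0, step=1e-2, alpha=3.0, iters=500):
    """Illustrative inertial augmented-Lagrangian sketch for
        min f(x)  subject to  Ax = b,
    with L_rho(x, lam) = f(x) + <lam, Ax - b> + (rho/2)*||Ax - b||^2.
    Generic scheme for this problem class, NOT the paper's exact algorithm;
    `step` is assumed small enough for the Lipschitz constant of the
    gradient of x -> L_rho(x, lam)."""
    x, x_prev = x0.copy(), x0.copy()
    lam, lam_prev = lam0.copy(), lam0.copy()
    for k in range(1, iters + 1):
        gamma = (k - 1.0) / (k + alpha - 1.0)           # vanishing-damping inertial factor
        y = x + gamma * (x - x_prev)                    # primal extrapolation
        mu = lam + gamma * (lam - lam_prev)             # dual extrapolation
        g = grad_f(y) + A.T @ (mu + rho * (A @ y - b))  # grad_x of L_rho(., mu) at y
        x_prev, x = x, y - step * g                     # primal gradient step
        lam_prev, lam = lam, mu + rho * (A @ x - b)     # dual ascent step
    return x, lam

# Toy usage: minimize 0.5*||x||^2 subject to x1 + 2*x2 = 3
A = np.array([[1.0, 2.0]])
b = np.array([3.0])
x_star, lam_star = inertial_alm(lambda x: x, A, b, np.zeros(2), np.zeros(1))
# x_star should be close to [0.6, 1.2], the projection of the origin onto {Ax = b}
```

The inertial factor (k - 1)/(k + alpha - 1) with alpha = 3 roughly mirrors Nesterov's classical rule, while alpha > 3 corresponds to Chambolle–Dossal-type choices; the paper studies a general parameter setting covering these rules and establishes the stated O(1/k^2) rates and iterate convergence for its own primal-dual scheme.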

Details

Language :
English
ISSN :
0025-5610
Volume :
200
Issue :
1
Database :
Academic Search Index
Journal :
Mathematical Programming
Publication Type :
Academic Journal
Accession number :
163797947
Full Text :
https://doi.org/10.1007/s10107-022-01879-4