
NEW PRIMAL-DUAL ALGORITHMS FOR A CLASS OF NONSMOOTH AND NONLINEAR CONVEX-CONCAVE MINIMAX PROBLEMS.

Authors :
YUZIXUAN ZHU
DEYI LIU
QUOC TRAN-DINH
Source :
SIAM Journal on Optimization. 2022, Vol. 32 Issue 4, p2580-2611. 32p.
Publication Year :
2022

Abstract

In this paper, we develop new primal-dual algorithms to solve a class of nonsmooth and nonlinear convex-concave minimax problems, which covers many existing and brand-new models as special cases. Our approach relies on a combination of a generalized augmented Lagrangian function, Nesterov's accelerated scheme, and adaptive parameter-updating strategies. Our algorithmic framework is single-loop and unifies two important settings: the general convex-concave and the convex-linear cases. Under mild assumptions, our algorithms achieve O(1/k) convergence rates via three different criteria: the primal-dual gap, the primal objective residual, and the dual objective residual, where k is the iteration counter. Our rates are both ergodic (i.e., on a weighted averaging sequence) and nonergodic (i.e., on the last-iterate sequence). These convergence rates can be boosted up to O(1/k²) if just one objective term is strongly convex (or, equivalently, its conjugate is L-smooth). To the best of our knowledge, this is the first algorithm achieving optimal rates on the primal last-iterate sequence for convex-linear minimax problems. As a byproduct, we specialize our algorithms to solve a general convex cone-constrained program with both ergodic and nonergodic rate guarantees. We test our algorithms and compare them with two recent methods on two numerical examples.
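As an illustration (this template and the symbols f, g, F, and K below are assumptions for exposition, not necessarily the paper's exact formulation), the class of convex-concave minimax problems described above is commonly written as

\min_{x} \max_{y} \; \mathcal{L}(x, y) := f(x) + \langle F(x), y \rangle - g(y),

where f and g are proper, closed, and convex (possibly nonsmooth), and F is a coupling map chosen so that \mathcal{L}(x, \cdot) remains concave; the convex-linear setting corresponds to F(x) = Kx for a linear operator K. The O(1/k) rates can then be stated, for example, on the primal-dual gap

\mathcal{G}(\hat{x}, \hat{y}) := \max_{y} \mathcal{L}(\hat{x}, y) - \min_{x} \mathcal{L}(x, \hat{y}),

evaluated either at a weighted average of the iterates (ergodic) or at the last iterate (nonergodic).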

Details

Language :
English
ISSN :
1052-6234
Volume :
32
Issue :
4
Database :
Academic Search Index
Journal :
SIAM Journal on Optimization
Publication Type :
Academic Journal
Accession number :
161593487
Full Text :
https://doi.org/10.1137/21M1408683