1. A NEWTON-CG BASED AUGMENTED LAGRANGIAN METHOD FOR FINDING A SECOND-ORDER STATIONARY POINT OF NONCONVEX EQUALITY CONSTRAINED OPTIMIZATION WITH COMPLEXITY GUARANTEES.
- Author
- CHUAN HE, ZHAOSONG LU, and TING KEI PONG
- Subjects
- *CONJUGATE gradient methods, *QUASI-Newton methods, *MATHEMATICS, *PROBABILITY theory, *ALGORITHMS
- Abstract
- In this paper we consider finding a second-order stationary point (SOSP) of nonconvex equality constrained optimization when a nearly feasible point is known. In particular, we first propose a new Newton-conjugate gradient (Newton-CG) method for finding an approximate SOSP of unconstrained optimization and show that it enjoys a substantially better complexity than the Newton-CG method in [C. W. Royer, M. O'Neill, and S. J. Wright, Math. Program., 180 (2020), pp. 451-488]. We then propose a Newton-CG based augmented Lagrangian (AL) method for finding an approximate SOSP of nonconvex equality constrained optimization, in which the proposed Newton-CG method is used as a subproblem solver. We show that under a generalized linear independence constraint qualification (GLICQ), our AL method enjoys a total inner iteration complexity of O(ε^{-7/2}) and an operation complexity of O(ε^{-7/2} min{n, ε^{-3/4}}) for finding an (ε, √ε)-SOSP of nonconvex equality constrained optimization with high probability, which are significantly better than the ones achieved by the proximal AL method in [Y. Xie and S. J. Wright, J. Sci. Comput., 86 (2021), pp. 1-30]. In addition, we show that it has a total inner iteration complexity of O(ε^{-11/2}) and an operation complexity of O(ε^{-11/2} min{n, ε^{-5/4}}) when the GLICQ does not hold. To the best of our knowledge, all the complexity results obtained in this paper are new for finding an approximate SOSP of nonconvex equality constrained optimization with high probability. Preliminary numerical results also demonstrate the superiority of our proposed methods over the other competing algorithms. [ABSTRACT FROM AUTHOR]
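To illustrate the overall structure described in the abstract (an augmented Lagrangian outer loop whose subproblems are solved by a Newton-CG-type method), the following Python sketch shows a generic AL loop for min f(x) subject to c(x) = 0. It is not the authors' algorithm: the example problem, tolerances, penalty update rule, and the use of SciPy's Newton-CG solver as a stand-in for the paper's Newton-CG subproblem solver are all illustrative assumptions.

    # Illustrative sketch only: a basic augmented Lagrangian (AL) outer loop.
    # Each subproblem  min_x  f(x) + lam @ c(x) + (rho/2) * ||c(x)||^2
    # is solved approximately with SciPy's Newton-CG as a placeholder for the
    # paper's own Newton-CG subproblem solver.
    import numpy as np
    from scipy.optimize import minimize

    def f(x):             # hypothetical nonconvex objective
        return x[0]**4 - x[0]**2 + x[1]**2

    def grad_f(x):
        return np.array([4*x[0]**3 - 2*x[0], 2*x[1]])

    def c(x):             # single equality constraint c(x) = 0
        return np.array([x[0] + x[1] - 1.0])

    def jac_c(x):
        return np.array([[1.0, 1.0]])

    def augmented_lagrangian(lam, rho):
        """Return the AL function and its gradient for fixed (lam, rho)."""
        def val(x):
            cx = c(x)
            return f(x) + lam @ cx + 0.5 * rho * (cx @ cx)
        def grad(x):
            cx = c(x)
            return grad_f(x) + jac_c(x).T @ (lam + rho * cx)
        return val, grad

    def al_method(x0, tol=1e-6, rho=10.0, max_outer=50):
        x = np.asarray(x0, dtype=float)
        lam = np.zeros(c(x).shape)
        for _ in range(max_outer):
            val, grad = augmented_lagrangian(lam, rho)
            # Approximate subproblem solve; the paper instead uses its proposed
            # Newton-CG method, which returns an approximate SOSP of the subproblem.
            res = minimize(val, x, jac=grad, method='Newton-CG',
                           options={'xtol': tol})
            x = res.x
            cx = c(x)
            if np.linalg.norm(cx) <= tol and np.linalg.norm(grad(x)) <= tol:
                break
            lam = lam + rho * cx      # first-order multiplier update
            rho *= 2.0                # simple penalty increase rule (illustrative)
        return x, lam

    if __name__ == "__main__":
        x_star, lam_star = al_method([2.0, 2.0])
        print("x* ~", x_star, " constraint residual:", c(x_star))

The sketch omits the features that drive the paper's complexity guarantees, such as the negative-curvature handling inside the Newton-CG subproblem solver and the specific tolerance and penalty schedules, and is meant only to clarify how an AL outer loop delegates each subproblem to an unconstrained solver.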
- Published
- 2023