Keywords: conjugate gradient methods; approximation theory; constrained optimization; mathematical optimization; linear differential equations; linear systems; numerical analysis; stochastic convergence; algorithms
Abstract
This paper proposes a line search technique that satisfies a relaxed form of the strong Wolfe conditions, so as to guarantee the descent condition at each iteration of the Polak-Ribière-Polyak conjugate gradient algorithm. It is proved that this line search preserves the usual convergence properties of any descent algorithm; in particular, the Zoutendijk condition holds under mild assumptions. The resulting conjugate gradient algorithm is proved convergent under a strong convexity assumption, and a globally convergent modification is proposed for the nonconvex case. Numerical tests are presented.
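To make the ingredients of the abstract concrete, here is a minimal sketch of the Polak-Ribière-Polyak (PRP) update on a strongly convex quadratic. The paper's relaxed strong Wolfe line search is replaced, purely for brevity, by Armijo backtracking plus a restart whenever the PRP direction fails to be a descent direction; that restart is an assumed simplification standing in for exactly the descent guarantee the paper's line search provides, not the authors' method.

```python
# Sketch of PRP conjugate gradient on f(x) = 0.5 x^T A x - b^T x,
# A = [[3, 1], [1, 2]] (SPD, so f is strongly convex), b = (1, 1).
# Armijo backtracking + restart-on-ascent is an assumed stand-in for
# the paper's relaxed strong Wolfe line search.

def f(x):
    return 0.5 * (3*x[0]**2 + 2*x[0]*x[1] + 2*x[1]**2) - x[0] - x[1]

def grad(x):
    return [3*x[0] + x[1] - 1, x[0] + 2*x[1] - 1]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def prp_cg(x0, tol=1e-10, max_iter=200):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                      # start along steepest descent
    for _ in range(max_iter):
        if dot(g, g) < tol:
            break
        # Armijo backtracking: halve alpha until sufficient decrease holds
        alpha, c1, gTd = 1.0, 1e-4, dot(g, d)
        while f([xi + alpha*di for xi, di in zip(x, d)]) > f(x) + c1*alpha*gTd:
            alpha *= 0.5
        x = [xi + alpha*di for xi, di in zip(x, d)]
        g_new = grad(x)
        # PRP formula: beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = dot(g_new, [a - b for a, b in zip(g_new, g)]) / dot(g, g)
        d = [-gn + beta*di for gn, di in zip(g_new, d)]
        if dot(g_new, d) >= 0:                 # descent lost: restart along -grad
            d = [-gn for gn in g_new]          # (the paper's line search avoids this)
        g = g_new
    return x

x_min = prp_cg([0.0, 0.0])
print(x_min)  # close to the minimizer (0.2, 0.4)
```

With plain backtracking the PRP direction does occasionally lose the descent property and triggers the restart; that failure mode is precisely what motivates a line search enforcing (a relaxed form of) the strong Wolfe conditions.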
Keywords: conjugate gradient methods; Lanczos method; approximation theory; numerical solutions to equations; constrained optimization; mathematical optimization; linear differential equations; linear systems; algorithms
Abstract
This paper extends some theoretical properties of the conjugate gradient-type method FLR (Ref. 1) for iteratively solving indefinite linear systems of equations. The FLR algorithm generalizes the conjugate gradient method of Hestenes and Stiefel (CG, Ref. 2). We develop a complete relationship between the FLR algorithm and the Lanczos process in the case of indefinite and possibly singular matrices. We then derive simple theoretical results for the FLR algorithm in order to construct an approximation of the Moore-Penrose pseudoinverse of an indefinite matrix. Our approach supplies the theoretical framework for applications within unconstrained optimization.
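The FLR algorithm itself is defined in Ref. 1 and is not reproduced here; for orientation, the following is a sketch of the classical Hestenes-Stiefel CG iteration that FLR generalizes, applied to a small symmetric positive definite system (the matrix and right-hand side are illustrative choices, not from the paper).

```python
# Classical Hestenes-Stiefel conjugate gradient for A x = b with A SPD.
# FLR (Ref. 1) extends this scheme to indefinite, possibly singular A.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, v):
    return [sum(a * vi for a, vi in zip(row, v)) for row in A]

def cg(A, b, tol=1e-12, max_iter=50):
    n = len(b)
    x = [0.0] * n
    r = [bi - ri for bi, ri in zip(b, matvec(A, x))]   # residual r = b - A x
    p = list(r)                                        # first search direction
    rs = dot(r, r)
    for _ in range(max_iter):
        if rs < tol:
            break
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)                        # exact minimizer along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        beta = rs_new / rs                             # conjugacy coefficient
        p = [ri + beta * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# A 2x2 SPD example: exact arithmetic terminates in two iterations.
print(cg([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0]))  # close to [1/11, 7/11]
```

In exact arithmetic CG terminates in at most n iterations; the step lengths alpha and conjugacy coefficients beta generated here are the quantities that, in the paper's setting, tie the iteration to the Lanczos process.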