
Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions.

Authors :
Yuan, Gonglin
Wang, Xiaoliang
Sheng, Zhou
Source :
Numerical Algorithms. Jul2020, Vol. 84 Issue 3, p935-956. 22p.
Publication Year :
2020

Abstract

It is well known that conjugate gradient algorithms are widely applied in many practical fields, for instance engineering problems and finance models, because they are straightforward, have a simple structure, and require low storage. However, challenging problems remain, such as the convergence of PRP algorithms for nonconvex functions under an inexact line search, establishing the sufficient descent property for all conjugate gradient methods, and other theoretical properties concerning global convergence and the trust region feature for nonconvex functions. This paper studies a family of conjugate gradient formulas based on the six classical formulas PRP, HS, CD, FR, LS, and DY, where the family algorithms have better theoretical properties than the original formulas themselves. Furthermore, the presented technique can be extended to any two-term conjugate gradient formula. This paper designs family conjugate gradient algorithms for nonconvex functions that have the following features without further conditions: (i) the sufficient descent property holds, (ii) the trust region feature is true, and (iii) global convergence holds under standard assumptions. Numerical results show that the given conjugate gradient algorithms are competitive with standard methods. [ABSTRACT FROM AUTHOR]
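The six classical two-term formulas named in the abstract differ only in how the scalar beta_k in the update d_{k+1} = -g_{k+1} + beta_k * d_k is computed. As a minimal illustration of this structure (a standard PRP+ sketch with Armijo backtracking, not the authors' modified family — the test function, parameters, and safeguards are this sketch's own assumptions):

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of a PRP+ conjugate gradient method with Armijo line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking (inexact line search)
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * g.dot(d):
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP beta: g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new.dot(g_new - g) / g.dot(g)
        beta = max(beta, 0.0)  # PRP+ safeguard (restart when beta < 0)
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:
            d = -g_new  # restart if d is not a descent direction
        x, g = x_new, g_new
    return x

# Usage on a simple convex quadratic with minimizer at (1, 1, 1)
f = lambda x: np.sum((x - 1.0) ** 2)
grad = lambda x: 2.0 * (x - 1.0)
xstar = prp_cg(f, grad, np.zeros(3))
```

Swapping the `beta` line (e.g., FR uses `g_new.dot(g_new) / g.dot(g)`) yields the other classical variants; the paper's contribution is a family of such formulas that guarantees sufficient descent and the trust region property unconditionally.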

Details

Language :
English
ISSN :
1017-1398
Volume :
84
Issue :
3
Database :
Academic Search Index
Journal :
Numerical Algorithms
Publication Type :
Academic Journal
Accession number :
143699633
Full Text :
https://doi.org/10.1007/s11075-019-00787-7