
A class of conjugate gradient methods for convex constrained monotone equations.

Authors :
Ding, Yanyun
Xiao, Yunhai
Li, Jianwei
Source :
Optimization. 2017, Vol. 66 Issue 12, p2309-2328. 20p.
Publication Year :
2017

Abstract

The recently designed non-linear conjugate gradient method of Dai and Kou [SIAM J Optim. 2013;23:296–320] is currently very efficient for solving large-scale unconstrained minimization problems, owing to its simple iterative form, low storage requirement, and its closeness to the scaled memoryless BFGS method. Because of these attractive properties, the method has been successfully extended in recent years to solve high-dimensional symmetric non-linear equations. Nevertheless, its numerical performance on convex constrained monotone equations has never been explored. In this paper, by combining it with the projection method of Solodov and Svaiter, we develop a family of non-linear conjugate gradient methods for convex constrained monotone equations. The proposed methods require no Jacobian information and store no matrices at any iteration, which makes them well suited to large-scale non-smooth problems. We prove the global convergence of the proposed class of methods and establish its R-linear convergence rate under some reasonable conditions. Finally, numerical experiments show that the proposed methods are efficient and promising. [ABSTRACT FROM AUTHOR]
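To make the abstract's framework concrete, the sketch below implements a generic derivative-free projection method for constrained monotone equations in the Solodov–Svaiter style: a search direction, a derivative-free line search, and a hyperplane-projection step onto the feasible set. This is an illustrative sketch only; the simple PRP-type beta, the parameter names (`sigma`, `rho`), and the test function are assumptions, not the Dai–Kou-based family actually proposed in the paper.

```python
import numpy as np

def solve_monotone(F, x0, proj, tol=1e-8, max_iter=1000, sigma=1e-4, rho=0.5):
    """Illustrative CG-projection sketch for F(x) = 0 over a convex set C.

    F    : monotone mapping R^n -> R^n (no Jacobian needed)
    proj : projection operator onto the convex feasible set C
    Parameters sigma, rho are hypothetical line-search constants.
    """
    x = x0.astype(float).copy()
    Fx = F(x)
    d = -Fx                      # initial direction: steepest-descent-like
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        # Derivative-free line search: shrink t until
        #   -F(x + t d)^T d >= sigma * t * ||d||^2
        t = 1.0
        while True:
            z = x + t * d
            Fz = F(z)
            if -Fz @ d >= sigma * t * (d @ d) or t < 1e-12:
                break
            t *= rho
        # Hyperplane projection step (Solodov–Svaiter):
        # project x - alpha_k F(z_k) back onto C
        denom = Fz @ Fz
        if denom == 0.0:
            x, Fx = z, Fz        # z already solves F(z) = 0
            continue
        alpha = (Fz @ (x - z)) / denom
        x_new = proj(x - alpha * Fz)
        Fx_new = F(x_new)
        # Simple PRP-type CG update (an assumption; the paper's family
        # uses Dai–Kou-type directions instead)
        beta = max(0.0, Fx_new @ (Fx_new - Fx) / (Fx @ Fx))
        d = -Fx_new + beta * d
        x, Fx = x_new, Fx_new
    return x
```

A usage example with the monotone mapping F(x) = x + sin(x) (solution x = 0) and C the nonnegative orthant, whose projection is a componentwise clip: `solve_monotone(lambda x: x + np.sin(x), np.full(3, 2.0), lambda v: np.clip(v, 0.0, None))`. Note that no matrix is ever formed or stored, matching the storage claim in the abstract.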

Details

Language :
English
ISSN :
0233-1934
Volume :
66
Issue :
12
Database :
Academic Search Index
Journal :
Optimization
Publication Type :
Academic Journal
Accession number :
125881225
Full Text :
https://doi.org/10.1080/02331934.2017.1372438