
Acceleration of conjugate gradient algorithms for unconstrained optimization

Authors :
Andrei, Neculai
Source :
Applied Mathematics & Computation. Jul 2009, Vol. 213 Issue 2, p361-369. 9p.
Publication Year :
2009

Abstract

Conjugate gradient methods are important for large-scale unconstrained optimization. This paper proposes an acceleration of these methods based on a modification of the steplength. The idea is to modify, in a multiplicative manner, the steplength α_k computed by the Wolfe line search conditions, by means of a positive parameter η_k, in such a way as to improve the behavior of the classical conjugate gradient algorithms. It is shown that for uniformly convex functions the convergence of the accelerated algorithm remains linear, but the reduction in function values is significantly improved. Numerical comparisons with some conjugate gradient algorithms using a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that the accelerated computational scheme outperforms the corresponding conjugate gradient algorithms. [Copyright © Elsevier]
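The scheme the abstract describes, scaling the Wolfe steplength α_k by a positive factor η_k before taking the step, can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the abstract does not give the formula for η_k, so the choice below (a quadratic-interpolation estimate of the one-dimensional minimizer along the search direction) is a hypothetical stand-in, and the Polak-Ribiere+ direction update and the use of scipy.optimize.line_search are likewise assumptions.

```python
import numpy as np
from scipy.optimize import line_search

def accelerated_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Conjugate gradient with a multiplicatively modified steplength.

    Sketch only: eta below is a quadratic-interpolation guess, not the
    formula from the paper; beta is the Polak-Ribiere+ choice.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # alpha_k from a line search satisfying the Wolfe conditions.
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:
            break
        # Multiplicative acceleration: estimate the 1-D minimizer along d
        # from the slopes at x and at the trial point z = x + alpha*d.
        z = x + alpha * d
        gd, gzd = g.dot(d), grad(z).dot(d)
        # eta > 0 whenever curvature along d is positive; eta = 1 falls
        # back to the plain CG step.
        eta = gd / (gd - gzd) if gd - gzd < 0 else 1.0
        x_new = x + eta * alpha * d
        g_new = grad(x_new)
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)  # PR+ parameter
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

Fixing eta to 1 reduces the loop to a standard PR+ conjugate gradient iteration, which makes the multiplicative modification easy to compare against the unaccelerated scheme on a given problem.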

Details

Language :
English
ISSN :
0096-3003
Volume :
213
Issue :
2
Database :
Academic Search Index
Journal :
Applied Mathematics & Computation
Publication Type :
Academic Journal
Accession number :
40112469
Full Text :
https://doi.org/10.1016/j.amc.2009.03.020