
An Efficient Barzilai-Borwein Conjugate Gradient Method for Unconstrained Optimization.

Authors :
Liu, Hongwei
Liu, Zexian
Source :
Journal of Optimization Theory & Applications. Mar 2019, Vol. 180, Issue 3, p. 879-906. 28 p.
Publication Year :
2019

Abstract

The Barzilai-Borwein conjugate gradient methods, first proposed by Dai and Kou (Sci China Math 59(8):1511-1524, 2016), are very interesting and highly efficient for strictly convex quadratic minimization. In this paper, we present an efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization. Motivated by the Barzilai-Borwein method and the linear conjugate gradient method, we derive a new search direction that satisfies the sufficient descent condition, based on a quadratic model in a two-dimensional subspace, and design a new strategy for choosing the initial stepsize. A generalized Wolfe line search is also proposed; it is nonmonotone and avoids a numerical drawback of the original Wolfe line search. Under mild conditions, we establish the global convergence and the R-linear convergence of the proposed method. In particular, we also analyze the convergence for convex functions. Numerical results on the CUTEr library and the test problem collection given by Andrei show that the proposed method is superior to two well-known conjugate gradient methods, proposed by Dai and Kou (SIAM J Optim 23(1):296-320, 2013) and by Hager and Zhang (SIAM J Optim 16(1):170-192, 2005), respectively. [ABSTRACT FROM AUTHOR]
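For background on the Barzilai-Borwein stepsize that the abstract builds on, the following minimal Python sketch runs the classical Barzilai-Borwein gradient iteration (BB1 stepsize) on a strictly convex quadratic. It is not the authors' Barzilai-Borwein conjugate gradient method or their line search; the test quadratic, the denominator safeguard, and the tolerance are illustrative assumptions.

import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=500):
    # Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    # using the classical BB1 stepsize alpha_k = (s^T s) / (s^T y), where
    # s = x_k - x_{k-1} and y = g_k - g_{k-1}.
    x = x0.copy()
    g = A @ x - b                      # gradient of the quadratic
    alpha = (g @ g) / (g @ (A @ g))    # exact (Cauchy) stepsize for the first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g          # gradient step with the current BB stepsize
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g    # differences used by the BB formula
        sty = s @ y
        alpha = (s @ s) / sty if sty > 1e-12 else 1.0   # BB1, with a simple safeguard
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((20, 20))
    A = M.T @ M + 20 * np.eye(20)      # symmetric positive definite test matrix
    b = rng.standard_normal(20)
    x_star = bb_gradient(A, b, np.zeros(20))
    print(np.linalg.norm(A @ x_star - b))   # residual of A x = b; should be near tol

The sketch shows why the stepsize is "quasi-Newton like": after the first iteration, alpha always lies between the reciprocals of the largest and smallest eigenvalues of A, which is what makes BB-type steps attractive as building blocks for conjugate gradient directions.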

Details

Language :
English
ISSN :
0022-3239
Volume :
180
Issue :
3
Database :
Academic Search Index
Journal :
Journal of Optimization Theory & Applications
Publication Type :
Academic Journal
Accession number :
134716723
Full Text :
https://doi.org/10.1007/s10957-018-1393-3