An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem.
- Source: Optimization, Mar 2018, Vol. 67, Issue 3, p427-440 (14 pages)
- Publication Year: 2018
Abstract
- In this paper, a new type of stepsize, the approximate optimal stepsize, is introduced for the gradient method to interpret the Barzilai–Borwein (BB) method, and an efficient gradient method with an approximate optimal stepsize for the strictly convex quadratic minimization problem is presented. Based on a multi-step quasi-Newton condition, we construct a new quadratic approximation model to generate an approximate optimal stepsize. We then truncate it with the two well-known BB stepsizes to improve numerical performance, and use the resulting approximate optimal stepsize as the new stepsize for the gradient method. We establish the global convergence and R-linear convergence of the proposed method. Numerical results show that the proposed method outperforms some well-known gradient methods.
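For context, the classical BB method that the abstract builds on can be sketched in a few lines. This is a minimal illustration of the plain BB1 stepsize on a strictly convex quadratic, not the paper's truncated approximate-optimal-stepsize method; the matrix `A` and vector `b` below are arbitrary illustrative choices.

```python
# Barzilai-Borwein (BB1) gradient method for the strictly convex quadratic
# f(x) = 0.5 * x^T A x - b^T x, whose gradient is g(x) = A x - b.
# Illustrative sketch only; the paper's method replaces the BB stepsize with
# an approximate optimal stepsize truncated by the two BB stepsizes.

def matvec(A, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def bb_gradient(A, b, x0, tol=1e-10, max_iter=500):
    x = list(x0)
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]
    alpha = 1.0  # initial stepsize; any positive value works for this sketch
    for _ in range(max_iter):
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        x_new = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_new = [gi - bi for gi, bi in zip(matvec(A, x_new), b)]
        # BB1 stepsize: alpha = s^T s / s^T y with s = x_{k+1} - x_k,
        # y = g_{k+1} - g_k; for a quadratic, y = A s, so s^T y > 0 when A
        # is symmetric positive definite.
        s = [xn - xi for xn, xi in zip(x_new, x)]
        y = [gn - gi for gn, gi in zip(g_new, g)]
        sy = sum(si * yi for si, yi in zip(s, y))
        ss = sum(si * si for si in s)
        if sy > 0:
            alpha = ss / sy
        x, g = x_new, g_new
    return x

# Illustrative 2x2 symmetric positive definite problem.
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x_star = bb_gradient(A, b, [0.0, 0.0])  # converges to the solution of A x = b
```

For this 2x2 example the exact minimizer solves A x = b, i.e. x = (1/11, 7/11); BB iterations are non-monotone in f but converge for strictly convex quadratics.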
Details
- Language: English
- ISSN: 0233-1934
- Volume: 67
- Issue: 3
- Database: Academic Search Index
- Journal: Optimization
- Publication Type: Academic Journal
- Accession number: 127587128
- Full Text: https://doi.org/10.1080/02331934.2017.1399392