
Limited-memory BFGS with displacement aggregation.

Authors :
Berahas, Albert S.
Curtis, Frank E.
Zhou, Baoyu
Source :
Mathematical Programming. Jul 2022, Vol. 194, Issue 1/2, p. 121-157. 37p.
Publication Year :
2022

Abstract

A displacement aggregation strategy is proposed for the curvature pairs stored in a limited-memory BFGS (a.k.a. L-BFGS) method such that the resulting (inverse) Hessian approximations are equal to those that would be derived from a full-memory BFGS method. This means that, if a sufficiently large number of pairs are stored, then an optimization algorithm employing the limited-memory method can achieve the same theoretical convergence properties as when full-memory (inverse) Hessian approximations are stored and employed, such as a local superlinear rate of convergence under assumptions that are common for attaining such guarantees. To the best of our knowledge, this is the first work in which a local superlinear convergence rate guarantee is offered by a quasi-Newton scheme that neither stores all curvature pairs throughout the entire run of the optimization algorithm nor stores an explicit (inverse) Hessian approximation. Numerical results are presented to show that displacement aggregation within an adaptive L-BFGS scheme can lead to better performance than standard L-BFGS. [ABSTRACT FROM AUTHOR]
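For context on the curvature pairs (s, y) mentioned in the abstract, the following is a minimal Python sketch of the standard L-BFGS two-loop recursion, which computes a search direction from stored pairs without forming an explicit inverse Hessian. The function name and variables are illustrative; this shows only the generic scheme, not the displacement-aggregation variant proposed in the paper.

import numpy as np

def lbfgs_direction(grad, pairs):
    """Generic L-BFGS two-loop recursion: compute the search direction
    -H_k * grad from stored curvature pairs (s_i, y_i), where
    s_i = x_{i+1} - x_i and y_i = grad f(x_{i+1}) - grad f(x_i).
    Pairs are ordered oldest first (newest is pairs[-1]).
    Illustrative sketch of standard L-BFGS, not the paper's method."""
    q = grad.copy()
    alphas = []
    # First loop: traverse pairs from newest to oldest
    for s, y in reversed(pairs):
        rho = 1.0 / np.dot(y, s)
        alpha = rho * np.dot(s, q)
        q -= alpha * y
        alphas.append((rho, alpha))
    # Initial inverse Hessian H_k^0 = gamma * I, scaled by the newest pair
    if pairs:
        s_new, y_new = pairs[-1]
        gamma = np.dot(s_new, y_new) / np.dot(y_new, y_new)
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: traverse pairs from oldest to newest
    for (s, y), (rho, alpha) in zip(pairs, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r  # search direction approximating -H_k * grad

With m stored pairs in n variables, this recursion costs O(mn) work per iteration, versus O(n^2) to maintain and apply a dense inverse Hessian approximation as full-memory BFGS does. That cost gap is what makes the paper's result notable: the aggregation strategy aims to recover full-memory BFGS behavior, and hence its convergence guarantees, while still storing only a limited set of pairs.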

Details

Language :
English
ISSN :
0025-5610
Volume :
194
Issue :
1/2
Database :
Academic Search Index
Journal :
Mathematical Programming
Publication Type :
Academic Journal
Accession Number :
157667593
Full Text :
https://doi.org/10.1007/s10107-021-01621-6