An Accelerated Linearly Convergent Stochastic L-BFGS Algorithm.
- Source :
- IEEE Transactions on Neural Networks & Learning Systems. Nov 2019, Vol. 30, Issue 11, p3338-3346. 9p.
- Publication Year :
- 2019
Abstract
- The limited memory version of the Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) algorithm is the most popular quasi-Newton algorithm in machine learning and optimization. Recently, it was shown that the stochastic L-BFGS (sL-BFGS) algorithm with a variance-reduced stochastic gradient converges linearly. In this paper, we propose a new sL-BFGS algorithm by incorporating a suitable momentum term. We prove an accelerated linear convergence rate under mild conditions. Experimental results on different data sets also verify this acceleration advantage. [ABSTRACT FROM AUTHOR]
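The abstract combines three ingredients: the L-BFGS two-loop recursion, an SVRG-style variance-reduced stochastic gradient, and a momentum term. The sketch below illustrates how these pieces typically fit together on a least-squares problem; it is not the paper's exact algorithm, and all names, step sizes (`lr`, `beta`), and the memory size `mem` are illustrative assumptions.

```python
import numpy as np

def two_loop(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns an approximation of
    H^{-1} @ grad built from stored curvature pairs (s, y)."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest to oldest
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if s_list:                                             # initial Hessian scaling
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest to newest
        b = (y @ q) / (y @ s)
        q += (a - b) * s
    return q

def momentum_svrg_lbfgs(A, b, lr=0.5, beta=0.2, epochs=15, batch=5, mem=5, seed=0):
    """Sketch: SVRG-style variance-reduced gradients for (1/2n)||Aw - b||^2,
    preconditioned by the L-BFGS two-loop direction, plus heavy-ball momentum.
    Hyperparameters are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)
    v = np.zeros(d)                      # momentum buffer
    s_list, y_list = [], []
    for _ in range(epochs):
        w_snap = w.copy()                # SVRG snapshot
        full_grad = A.T @ (A @ w_snap - b) / n
        for _ in range(n // batch):
            idx = rng.choice(n, batch, replace=False)
            Ai, bi = A[idx], b[idx]
            g = Ai.T @ (Ai @ w - bi) / batch
            g_snap = Ai.T @ (Ai @ w_snap - bi) / batch
            vr_grad = g - g_snap + full_grad        # variance-reduced gradient
            direction = two_loop(vr_grad, s_list, y_list)
            step = -lr * direction + beta * v       # quasi-Newton step + momentum
            # Curvature pair; exact Hessian-vector product for this quadratic.
            y = A.T @ (A @ step) / n
            if step @ y > 1e-10:                    # keep only safe pairs
                s_list.append(step); y_list.append(y)
                if len(s_list) > mem:
                    s_list.pop(0); y_list.pop(0)
            v = step
            w = w + step
    return w
```

On an interpolating least-squares problem (b = A @ w_true) the per-sample gradients all vanish at the optimum, so the variance-reduced gradient goes to zero and the iterates settle at the exact solution, mirroring the linear-convergence setting the abstract describes.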
- Subjects :
- *ALGORITHMS
*MACHINE learning
*INTERIOR-point methods
Details
- Language :
- English
- ISSN :
- 2162-237X
- Volume :
- 30
- Issue :
- 11
- Database :
- Academic Search Index
- Journal :
- IEEE Transactions on Neural Networks & Learning Systems
- Publication Type :
- Periodical
- Accession number :
- 139436786
- Full Text :
- https://doi.org/10.1109/TNNLS.2019.2891088