Accelerated Proximal Subsampled Newton Method.
- Source :
- IEEE Transactions on Neural Networks & Learning Systems. Oct. 2021, Vol. 32, Issue 10, p4374-4388. 15p.
- Publication Year :
- 2021
Abstract
- The composite function optimization problem arises often in machine learning, where it is known as regularized empirical risk minimization. We introduce an acceleration technique into the Newton-type proximal method and propose a novel algorithm called the accelerated proximal subsampled Newton method (APSSN). APSSN subsamples only a small subset of samples to construct an approximate Hessian, which makes it computationally efficient while still retaining a fast convergence rate. Furthermore, we obtain the scaled proximal mapping by solving its dual problem with the semismooth Newton method instead of resorting to first-order methods. Owing to our sampling strategy and the fast convergence of the semismooth Newton method, the scaled proximal mapping can be computed efficiently. Both our theoretical analysis and our empirical study show that APSSN is an effective and computationally efficient algorithm for composite function optimization problems. [ABSTRACT FROM AUTHOR]
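The abstract names two computational ingredients: a Hessian built from a subsampled batch, and a scaled proximal mapping solved at each iteration. The Python sketch below is a rough illustration only, applied to L1-regularized logistic regression as an assumed example problem; the function apssn_sketch, the batch size, and all parameter values are hypothetical, the acceleration step is omitted, and the scaled proximal mapping is approximated here by inner proximal-gradient iterations rather than the semismooth Newton solve on the dual that the paper actually proposes.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def apssn_sketch(A, b, lam, n_iters=50, batch=256, inner=30, seed=0):
    """Hypothetical sketch of one family of proximal subsampled Newton
    iterations for L1-regularized logistic regression:
        min_x (1/n) * sum_i log(1 + exp(-b_i * a_i^T x)) + lam * ||x||_1,
    with labels b_i in {-1, +1}. The subsampled Hessian mirrors the
    sampling idea in the abstract; the inner solve is a simplification,
    not the paper's semismooth Newton method."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(n_iters):
        # Full gradient of the smooth part; p = sigmoid(-b * (A @ x)).
        p = 1.0 / (1.0 + np.exp(b * (A @ x)))
        grad = -(A.T @ (b * p)) / n
        # Subsampled Hessian: only |S| = batch rows contribute, plus a
        # small ridge term to keep it positive definite.
        S = rng.choice(n, size=min(batch, n), replace=False)
        w = p[S] * (1.0 - p[S])          # logistic curvature weights
        H = (A[S].T * w) @ A[S] / len(S) + 1e-6 * np.eye(d)
        # Scaled proximal mapping:
        #   x+ = argmin_y lam*||y||_1 + grad^T (y - x)
        #                 + 0.5 * (y - x)^T H (y - x),
        # approximated by proximal-gradient steps with step size 1/L.
        L = np.linalg.norm(H, 2)
        y = x.copy()
        for _ in range(inner):
            y = soft_threshold(y - (H @ (y - x) + grad) / L, lam / L)
        x = y
    return x
```

Swapping the inner proximal-gradient loop for a semismooth Newton solve on the dual of the scaled proximal subproblem, as the abstract describes, is what would distinguish APSSN from this plain subsampled-Newton baseline.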
Details
- Language :
- English
- ISSN :
- 2162-237X
- Volume :
- 32
- Issue :
- 10
- Database :
- Academic Search Index
- Journal :
- IEEE Transactions on Neural Networks & Learning Systems
- Publication Type :
- Periodical
- Accession number :
- 153789418
- Full Text :
- https://doi.org/10.1109/TNNLS.2020.3017555