
Subsampled Hessian Newton Methods for Supervised Learning

Authors:
Chien-Chih Wang
Chun-Heng Huang
Chih-Jen Lin
Source:
Neural Computation, 27:1766-1795
Publication Year:
2015
Publisher:
MIT Press, 2015.

Abstract

Newton methods can be applied in many supervised learning approaches. However, for large-scale data, using the whole Hessian matrix can be time-consuming. Recently, subsampled Newton methods have been proposed to reduce the computational cost by using only a subset of the data to compute an approximation of the Hessian matrix. Unfortunately, we find that in some situations the method runs more slowly than the standard Newton method, because cheaper but less accurate search directions are used. In this work, we propose novel techniques to improve the existing subsampled Hessian Newton method. The main idea is to solve a two-dimensional subproblem per iteration that adjusts the search direction to better minimize the second-order approximation of the function value. We prove the theoretical convergence of the proposed method. Experiments on logistic regression, linear SVM, maximum entropy, and deep networks indicate that our techniques significantly reduce the running time of the subsampled Hessian Newton method. The resulting algorithm becomes a compelling alternative to the standard Newton method for large-scale data classification.

Details

ISSN:
0899-7667 (print) and 1530-888X (online)
Volume:
27
Database:
OpenAIRE
Journal:
Neural Computation
Accession number:
edsair.doi.dedup.....91f2b0ce902f84d5214bf19cfb490c91