
Subsampled Hessian Newton Methods for Supervised Learning.

Authors :
Chien-Chih Wang
Chun-Heng Huang
Chih-Jen Lin
Source :
Neural Computation. 2015, Vol. 27 Issue 8, p1766-1795. 30p. 1 Diagram, 5 Charts, 6 Graphs.
Publication Year :
2015

Abstract

Newton methods can be applied in many supervised learning approaches. However, for large-scale data, the use of the whole Hessian matrix can be time-consuming. Recently, subsampled Newton methods have been proposed to reduce the computational time by using only a subset of data for calculating an approximation of the Hessian matrix. Unfortunately, we find that in some situations, the running time is worse than that of the standard Newton method because cheaper but less accurate search directions are used. In this work, we propose some novel techniques to improve the existing subsampled Hessian Newton method. The main idea is to solve a two-dimensional subproblem per iteration to adjust the search direction so that it better minimizes the second-order approximation of the function value. We prove the theoretical convergence of the proposed method. Experiments on logistic regression, linear SVM, maximum entropy, and deep networks indicate that our techniques significantly reduce the running time of the subsampled Hessian Newton method. The resulting algorithm becomes a compelling alternative to the standard Newton method for large-scale data classification. [ABSTRACT FROM AUTHOR]
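
To make the idea concrete, the following is a minimal NumPy sketch of one subsampled Hessian Newton step for L2-regularized logistic regression, followed by a small two-dimensional subproblem that recombines the current and previous directions under the quadratic model. The objective, the uniform sampling scheme, the use of the previous direction as the second dimension, and all names (fun, grad, hess_vec, cg, subsampled_newton_step, sample_frac) are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fun(w, X, y, C):
    # L2-regularized logistic loss: 0.5*||w||^2 + C * sum_i log(1 + exp(-y_i * x_i.w))
    m = y * (X @ w)
    return 0.5 * (w @ w) + C * np.sum(np.logaddexp(0.0, -m))

def grad(w, X, y, C):
    m = y * (X @ w)
    return w + C * (X.T @ (-y * sigmoid(-m)))

def hess_vec(v, w, X, y, C, scale=1.0):
    # (I + scale * C * X^T D X) v, with D built from the given (sub)sample X, y
    m = y * (X @ w)
    d = sigmoid(m) * (1.0 - sigmoid(m))
    return v + scale * C * (X.T @ (d * (X @ v)))

def cg(hv, b, iters=50, tol=1e-4):
    # Plain conjugate gradient for solving hv(x) = b
    x = np.zeros_like(b)
    r = b - hv(x)
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Hp = hv(p)
        alpha = rs / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol * np.linalg.norm(b):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def subsampled_newton_step(w, d_prev, X, y, C, sample_frac=0.1, rng=None):
    # One illustrative iteration: subsampled Newton direction via CG,
    # then a 2D direction adjustment and a backtracking line search.
    rng = rng or np.random.default_rng(0)
    n = X.shape[0]
    idx = rng.choice(n, max(1, int(sample_frac * n)), replace=False)
    Xs, ys = X[idx], y[idx]
    scale = n / len(idx)                      # rescale subsample to the full data size
    g = grad(w, X, y, C)
    hv_sub = lambda v: hess_vec(v, w, Xs, ys, C, scale)
    d = cg(hv_sub, -g)                        # subsampled Newton direction
    # 2D subproblem over span{d, d_prev}: minimize g^T p + 0.5 * p^T H_S p
    dirs = [d] + ([d_prev] if d_prev is not None else [])
    Hdirs = [hv_sub(di) for di in dirs]
    A = np.array([[di @ Hd for Hd in Hdirs] for di in dirs])
    b = -np.array([g @ di for di in dirs])
    beta = np.linalg.solve(A, b)
    p = sum(bi * di for bi, di in zip(beta, dirs))
    # Backtracking (Armijo) line search on the true objective
    step, f0 = 1.0, fun(w, X, y, C)
    while fun(w + step * p, X, y, C) > f0 + 1e-4 * step * (g @ p):
        step *= 0.5
    return w + step * p, p
```

In use, one would iterate subsampled_newton_step starting from w = 0 and d_prev = None, carrying the returned direction forward as d_prev for the next iteration; the 2D adjustment here costs only a few extra subsampled Hessian-vector products and a 2-by-2 solve per step.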

Details

Language :
English
ISSN :
0899-7667
Volume :
27
Issue :
8
Database :
Academic Search Index
Journal :
Neural Computation
Publication Type :
Academic Journal
Accession number :
108474660
Full Text :
https://doi.org/10.1162/NECO_a_00751