
Tracking Sparse Linear Classifiers.

Authors :
Zhai, Tingting
Koriche, Frederic
Wang, Hao
Gao, Yang
Source :
IEEE Transactions on Neural Networks & Learning Systems. Jul 2019, Vol. 30 Issue 7, p2079-2092. 14p.
Publication Year :
2019

Abstract

In this paper, we investigate the problem of sparse online linear classification in changing environments. We first analyze the tracking performance of standard online linear classifiers, which use gradient descent to minimize the regularized hinge loss. The derived shifting bounds highlight the importance of choosing appropriate step sizes in the presence of concept drifts. Notably, we show that better adaptability to concept drifts can be achieved with constant step sizes than with the state-of-the-art decreasing step sizes. Based on these observations, we then propose a novel sparse online linear classifier, sparse approximated linear classification (SALC), which uses a constant step size. In essence, SALC simply rounds small weights to zero to achieve sparsity, and controls the truncation error in a principled way to achieve low tracking regret. The degree of sparsity obtained by SALC is continuous and can be controlled by a parameter that captures the tradeoff between the sparsity of the model and the regret performance of the algorithm. Experiments on nine stationary data sets show that SALC is superior to state-of-the-art sparse online learning algorithms, especially when the solution is required to be sparse; on seven groups of nonstationary data sets with various total shifting amounts, SALC also demonstrates a good ability to track drifts. When wrapped with a drift detector, SALC achieves remarkable tracking performance regardless of the total shifting amount. [ABSTRACT FROM AUTHOR]
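The following is a minimal sketch of the kind of update the abstract describes: a constant-step-size gradient step on the L2-regularized hinge loss, followed by rounding small weights to zero. The function name `salc_step` and the parameters `eta`, `lam`, and `theta` are illustrative assumptions; the paper's principled truncation-error control is not reproduced here.

```python
import numpy as np

def salc_step(w, x, y, eta=0.1, lam=0.01, theta=0.05):
    """One illustrative online update: a gradient step on the L2-regularized
    hinge loss with a constant step size, then rounding small weights to zero.
    Names and default values are assumptions, not the authors' code."""
    # Subgradient of the hinge loss: -y*x when the margin y*<w, x> < 1, else 0.
    margin = y * np.dot(w, x)
    grad = lam * w
    if margin < 1:
        grad = grad - y * x
    # Constant step size, per the abstract's observation on drift adaptability.
    w = w - eta * grad
    # Sparsification: truncate weights with magnitude below theta to zero.
    w = np.where(np.abs(w) < theta, 0.0, w)
    return w

# Toy usage on a synthetic stream with a mid-stream concept drift.
rng = np.random.default_rng(0)
w = np.zeros(10)
for t in range(200):
    x = rng.normal(size=10)
    concept = x[0] if t < 100 else -x[0]   # the target concept shifts at t = 100
    y = 1.0 if concept > 0 else -1.0
    w = salc_step(w, x, y)
print(w)  # most coordinates end up exactly zero
```

Because the step size is constant, the update's reaction to new gradients never decays, which is what allows the weight vector to keep chasing a drifting concept rather than freezing on the old one.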

Details

Language :
English
ISSN :
2162-237X
Volume :
30
Issue :
7
Database :
Academic Search Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
137117532
Full Text :
https://doi.org/10.1109/TNNLS.2018.2877433