Scaling Up Kernel SVM on Limited Resources: A Low-Rank Linearization Approach.
- Source :
- IEEE Transactions on Neural Networks & Learning Systems; Feb 2019, Vol. 30 Issue 2, p369-378, 10p
- Publication Year :
- 2019
-
Abstract
- Kernel support vector machines (SVMs) deliver state-of-the-art results in many real-world nonlinear classification problems, but the computational cost of maintaining a large number of support vectors can be quite demanding. Linear SVM, on the other hand, is highly scalable to large data but only suited for linearly separable problems. In this paper, we propose a novel approach called low-rank linearized SVM to scale up kernel SVM on limited resources. Our approach transforms a nonlinear SVM to a linear one via an approximate empirical kernel map computed from efficient kernel low-rank decompositions. We theoretically analyze the gap between the solutions of the approximate and optimal rank-$k$ kernel map, which in turn provides guidance on the sampling scheme of the Nyström approximation. Furthermore, we extend it to a semisupervised metric learning scenario in which partially labeled samples can be exploited to further improve the quality of the low-rank embedding. Our approach inherits the rich representability of kernel SVM and the high efficiency of linear SVM. Experimental results demonstrate that our approach is more robust and achieves a better tradeoff between model representability and scalability against state-of-the-art algorithms for large-scale SVMs. [ABSTRACT FROM AUTHOR]
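The core idea in the abstract — linearizing a kernel SVM via a low-rank empirical kernel map — can be illustrated with a Nyström approximation. The sketch below is an assumption-laden illustration, not the authors' implementation: it uses an RBF kernel, uniform landmark sampling, and eigendecomposition of the landmark kernel block to build a rank-$k$ feature map whose inner products approximate the full kernel matrix; the resulting features could then be fed to any linear SVM solver.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    # Pairwise RBF kernel between rows of A and rows of B.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystroem_features(X, landmarks, gamma=0.1, k=None):
    # Nystroem rank-k empirical kernel map:
    #   phi(x) = K(x, landmarks) @ U_k @ diag(s_k^{-1/2}),
    # so that phi(X) @ phi(X).T approximates the full kernel matrix.
    W = rbf_kernel(landmarks, landmarks, gamma)   # m x m landmark block
    C = rbf_kernel(X, landmarks, gamma)           # n x m cross-kernel
    s, U = np.linalg.eigh(W)                      # eigenvalues ascending
    if k is not None:
        s, U = s[-k:], U[:, -k:]                  # keep top-k components
    s = np.maximum(s, 1e-12)                      # guard tiny eigenvalues
    return C @ U / np.sqrt(s)                     # n x k linearized features

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
idx = rng.choice(200, 30, replace=False)          # uniform landmark sampling
Z = nystroem_features(X, X[idx], gamma=0.1, k=20)

# Check approximation quality against the exact kernel matrix.
K_exact = rbf_kernel(X, X, gamma=0.1)
K_approx = Z @ Z.T
rel_err = np.linalg.norm(K_exact - K_approx) / np.linalg.norm(K_exact)
```

With the features `Z` in hand, a standard linear SVM (e.g. scikit-learn's `LinearSVC`) trained on `Z` plays the role of the linearized kernel SVM; the paper's contribution concerns the theoretical gap of this approximation and how it guides the Nyström sampling scheme, which the uniform sampling above does not capture.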
- Subjects :
- SUPPORT vector machines
- DECOMPOSITION method
- APPROXIMATION theory
Details
- Language :
- English
- ISSN :
- 2162-237X
- Volume :
- 30
- Issue :
- 2
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Neural Networks & Learning Systems
- Publication Type :
- Periodical
- Accession number :
- 134278823
- Full Text :
- https://doi.org/10.1109/TNNLS.2018.2838140