
Density-Dependent Quantized Least Squares Support Vector Machine for Large Data Sets.

Authors :
Nan, Shengyu
Sun, Lei
Chen, Badong
Lin, Zhiping
Toh, Kar-Ann
Source :
IEEE Transactions on Neural Networks & Learning Systems; Jan 2017, Vol. 28, Issue 1, p94-106, 13p
Publication Year :
2017

Abstract

Based on the knowledge that the input data distribution is important for learning, a data density-dependent quantization scheme (DQS) is proposed for sparse input data representation. The usefulness of the representation scheme is demonstrated by using it as a data preprocessing unit attached to the well-known least squares support vector machine (LS-SVM) for application to large data sets. Essentially, the proposed DQS adopts a single shrinkage threshold to obtain a simple quantization scheme that adapts its outputs to the input data density. With this quantization scheme, a large data set is quantized to a small subset, generally yielding a considerable reduction in sample size. In particular, this reduction saves significant computational cost when the quantized subset is used for feature approximation via the Nyström method. Based on the quantized subset, the approximated features are incorporated into the LS-SVM to develop a data density-dependent quantized LS-SVM (DQLS-SVM), for which an analytic solution is obtained in the primal solution space. The developed DQLS-SVM is evaluated on synthetic and benchmark data, with particular emphasis on large data sets. Extensive experimental results show that the learning machine incorporating DQS attains not only high computational efficiency but also good generalization performance. [ABSTRACT FROM PUBLISHER]
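To make the pipeline concrete, below is a minimal Python sketch of the three stages the abstract describes: threshold-based quantization of the training set, Nyström feature approximation from the quantized subset, and an analytic LS-SVM solve in the primal space. This is an illustration under simplifying assumptions, not the paper's exact algorithm: it uses a fixed quantization threshold eps in place of the paper's density-adaptive shrinkage rule, assumes an RBF kernel, and omits the LS-SVM bias term; all function names and the parameters eps, gamma, and C are hypothetical.

```python
import numpy as np


def quantize(X, eps):
    """Threshold-based quantization: keep a sample as a new codeword
    only if it lies farther than eps from every codeword kept so far."""
    codebook = [X[0]]
    for x in X[1:]:
        dists = np.linalg.norm(np.asarray(codebook) - x, axis=1)
        if dists.min() > eps:
            codebook.append(x)
    return np.asarray(codebook)


def rbf(A, B, gamma):
    """RBF kernel matrix between the rows of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)


def nystrom_features(X, landmarks, gamma):
    """Approximate RBF kernel features from a landmark subset (Nystrom)."""
    W = rbf(landmarks, landmarks, gamma)        # m x m landmark kernel
    vals, vecs = np.linalg.eigh(W)
    keep = vals > 1e-10                         # drop near-zero eigenmodes
    M = vecs[:, keep] / np.sqrt(vals[keep])     # m x r feature mapping
    return rbf(X, landmarks, gamma) @ M         # n x r feature matrix


def lssvm_primal_fit(Phi, y, C):
    """Analytic primal LS-SVM solve (ridge form, bias term omitted):
    minimize ||w||^2 / 2 + (C / 2) * ||y - Phi w||^2."""
    r = Phi.shape[1]
    A = Phi.T @ Phi + np.eye(r) / C
    return np.linalg.solve(A, Phi.T @ y)


# Toy usage on synthetic 2-D data.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 2))
y = np.sign(X[:, 0] * X[:, 1])                  # toy binary labels

Z = quantize(X, eps=0.5)                        # quantized subset as landmarks
Phi = nystrom_features(X, Z, gamma=1.0)
w = lssvm_primal_fit(Phi, y, C=10.0)
pred = np.sign(Phi @ w)
```

Under these assumptions, the quantized subset Z is far smaller than X, so the landmark eigendecomposition costs on the order of m^3 for m = |Z| rather than n^3 for the full set, which is the computational saving the abstract attributes to pairing DQS with the Nyström method.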

Details

Language :
English
ISSN :
2162-237X
Volume :
28
Issue :
1
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
120458976
Full Text :
https://doi.org/10.1109/TNNLS.2015.2504382