Scalable kernel-based learning via low-rank approximation of lifted data
- Source: Allerton
- Publication Year: 2017
- Publisher: IEEE, 2017.
Abstract
- Despite their well-documented ability to model nonlinear functions, kernel methods fall short in large-scale learning tasks because of their excessive memory and computational requirements. The present work introduces a novel kernel approximation approach from a dimensionality-reduction point of view on virtually lifted data. The proposed framework accommodates feature extraction under limited storage and computational budgets, and subsequently approximates the kernel by a linear inner product over the extracted features. Probabilistic guarantees on the generalization of the proposed task are provided, and efficient solvers with provable convergence guarantees are developed. By introducing a sampling step that precedes the dimensionality-reduction task, the framework is further broadened to accommodate learning over large datasets. The connection between the novel method and the Nyström kernel approximation algorithm, along with its modifications, is also presented. Empirical tests validate the effectiveness of the proposed approach.
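The abstract's core idea, approximating a kernel by linear inner products over low-dimensional features extracted from sampled data, is closely related to the Nyström method it cites. The paper itself gives no code; the following is a minimal NumPy sketch of that related Nyström construction (the function names, the RBF kernel choice, and all parameter values are illustrative, not the authors' implementation): sample m landmark points, then build a feature map Z such that Z Zᵀ ≈ K.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, m, gamma=1.0, rng=None):
    """Map X to m-dimensional features whose linear inner products
    approximate the full kernel matrix (Nystrom low-rank scheme)."""
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(X), size=m, replace=False)  # sampling step
    L = X[idx]                                       # landmark points
    W = rbf_kernel(L, L, gamma)                      # m x m landmark kernel
    C = rbf_kernel(X, L, gamma)                      # n x m cross-kernel
    # Low-rank map: K ~ C W^{-1} C^T = Z Z^T with Z = C W^{-1/2}.
    U, s, _ = np.linalg.svd(W)
    W_inv_sqrt = U @ np.diag(1.0 / np.sqrt(np.maximum(s, 1e-12))) @ U.T
    return C @ W_inv_sqrt

# Small demonstration on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Z = nystrom_features(X, m=80, gamma=0.5, rng=1)
K_true = rbf_kernel(X, X, gamma=0.5)
err = np.linalg.norm(K_true - Z @ Z.T) / np.linalg.norm(K_true)
```

Downstream, any linear learner trained on Z (e.g. ridge regression) then behaves like its kernelized counterpart at a fraction of the memory cost, which is the "linear inner product over extracted features" viewpoint the abstract describes.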
- Subjects:
- Computer science
Dimensionality reduction
Feature extraction
Probabilistic logic
020206 networking & telecommunications
Low-rank approximation
02 engineering and technology
010501 environmental sciences
01 natural sciences
Kernel method
Kernel (statistics)
Scalability
Convergence (routing)
0202 electrical engineering, electronic engineering, information engineering
Algorithm
0105 earth and related environmental sciences
Details
- Database: OpenAIRE
- Journal: 2017 55th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
- Accession number: edsair.doi...........0914ae7e7af2badfc97d4def47339dc5
- Full Text: https://doi.org/10.1109/allerton.2017.8262791