Escaping the curse of dimensionality in similarity learning: Efficient Frank-Wolfe algorithm and generalization bounds
- Source :
- Neurocomputing, Elsevier, 2019, 333, pp. 185-199. ⟨10.1016/j.neucom.2018.12.060⟩
- Publication Year :
- 2019
- Publisher :
- Elsevier BV
-
Abstract
- Similarity and metric learning provides a principled approach to construct a task-specific similarity from weakly supervised data. However, these methods are subject to the curse of dimensionality: as the number of features grows large, poor generalization is to be expected and training becomes intractable due to high computational and memory costs. In this paper, we propose a similarity learning method that can efficiently deal with high-dimensional sparse data. This is achieved through a parameterization of similarity functions by convex combinations of sparse rank-one matrices, together with the use of a greedy approximate Frank-Wolfe algorithm which provides an efficient way to control the number of active features. We show that the convergence rate of the algorithm, as well as its time and memory complexity, are independent of the data dimension. We further provide a theoretical justification of our modeling choices through an analysis of the generalization error, which depends logarithmically on the sparsity of the solution rather than on the number of features. Our experiments on datasets with up to one million features demonstrate the ability of our approach to generalize well despite the high dimensionality as well as its superiority compared to several competing methods.
- Comment: Long version of arXiv:1411.2374 (AISTATS 2015), to appear in Neurocomputing. Matlab code: https://github.com/bellet/HDSL
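The abstract describes two key ingredients: a similarity matrix parameterized as a convex combination of sparse rank-one matrices, and a Frank-Wolfe loop whose linear oracle activates only a few features per iteration. The following minimal NumPy sketch is for intuition only; the function name `frank_wolfe_similarity`, the triplet hinge loss, and the simplified two-entry symmetric basis are assumptions made for illustration and do not reproduce the paper's exact formulation or its Matlab implementation (HDSL).

```python
import numpy as np

def frank_wolfe_similarity(triplets, X, n_iter=50, scale=1.0):
    """Hypothetical sketch: learn a bilinear similarity x^T M y where M is a
    convex combination of sparse rank-one bases, built by Frank-Wolfe steps.

    triplets: list of (a, p, n) index triples; we want X[a]^T M X[p] > X[a]^T M X[n].
    X: (n_samples, d) data matrix (dense here for simplicity).
    """
    d = X.shape[1]
    M = np.zeros((d, d))                       # current iterate
    for t in range(n_iter):
        # Subgradient of a hinge loss over margin-violating triplets.
        G = np.zeros((d, d))
        for a, p, n in triplets:
            margin = X[a] @ M @ (X[p] - X[n])
            if margin < 1.0:                   # triplet violates the margin
                G -= np.outer(X[a], X[p] - X[n])
        # Linear minimization oracle over vertices of the form
        # B = +/- scale * (e_i e_j^T + e_j e_i^T) / 2 (a simplified basis,
        # not the paper's): pick the vertex most aligned with -G, which
        # reduces to the largest-magnitude entry of the symmetrized gradient.
        S = (G + G.T) / 2
        i, j = np.unravel_index(np.argmax(np.abs(S)), S.shape)
        sign = -np.sign(S[i, j])
        B = np.zeros((d, d))
        B[i, j] += sign * scale / 2
        B[j, i] += sign * scale / 2
        # Standard Frank-Wolfe step size; M remains a convex combination of
        # the selected sparse bases (the zero start lies in their hull).
        gamma = 2.0 / (t + 2.0)
        M = (1 - gamma) * M + gamma * B
    return M
```

In this sketch the oracle is just an argmax over the entries of the (sparse, in practice) gradient, so each iteration touches only a handful of coordinates and adds at most a few nonzero entries to M; this mirrors, in simplified form, the dimension-independent per-iteration cost and feature-sparsity control highlighted in the abstract.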
- Subjects :
- Computer Science - Machine Learning
- 0209 industrial biotechnology
- Theoretical computer science
- Computer Science - Artificial Intelligence
- Computer science
- Generalization
- Metric learning
- Cognitive Neuroscience
- 02 engineering and technology
- Matrix (mathematics)
- 020901 industrial engineering & automation
- Frank-Wolfe algorithm
- [INFO.INFO-LG]Computer Science [cs]/Machine Learning [cs.LG]
- [STAT.ML]Statistics [stat]/Machine Learning [stat.ML]
- Similarity (network science)
- Statistics - Machine Learning
- Artificial Intelligence
- 0202 electrical engineering, electronic engineering, information engineering
- Sparse matrix
- Generalization error
- Computer Science Applications
- Rate of convergence
- Generalization bounds
- Metric (mathematics)
- 020201 artificial intelligence & image processing
- Similarity learning
- Curse of dimensionality
Details
- ISSN :
- 0925-2312
- Volume :
- 333
- Database :
- OpenAIRE
- Journal :
- Neurocomputing
- Accession number :
- edsair.doi.dedup.....a759532b9b59571c0f74428f1c5fcf5a
- Full Text :
- https://doi.org/10.1016/j.neucom.2018.12.060