
Iterative Nearest Neighbors.

Authors :
Timofte, Radu
Van Gool, Luc
Source :
Pattern Recognition. Jan 2015, Vol. 48, Issue 1, p60-72. 13p.
Publication Year :
2015

Abstract

Representing data as a linear combination of a set of selected known samples is of interest for various machine learning applications such as dimensionality reduction or classification. k-Nearest Neighbors (kNN) and its variants are still among the best-known and most often used techniques. Some popular richer representations are Sparse Representation (SR), based on solving an l1-regularized least squares formulation; Collaborative Representation (CR), based on l2-regularized least squares; and Locally Linear Embedding (LLE), based on an l1-constrained least squares problem. We propose a novel sparse representation, the Iterative Nearest Neighbors (INN). It combines the power of SR and LLE with the computational simplicity of kNN. We empirically validate our representation in terms of sparse support signal recovery and compare with Matching Pursuit (MP) and Orthogonal Matching Pursuit (OMP), two other iterative methods. We also test our method in terms of dimensionality reduction and classification, using standard benchmarks for faces (AR), traffic signs (GTSRB), and objects (PASCAL VOC 2007). INN compares favorably to NN, MP, and OMP, and performs on par with CR and SR, while being orders of magnitude faster than the latter. On the downside, INN does not scale well with higher dimensionalities of the data. [ABSTRACT FROM AUTHOR]
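As a point of reference for the baselines the abstract names, the Collaborative Representation (CR) it describes is an l2-regularized least squares fit, which admits a closed-form solution via the normal equations. The sketch below is a minimal NumPy illustration of that generic formulation, not the paper's INN method; the dictionary `D`, query `y`, and regularization weight `lam` are illustrative assumptions.

```python
import numpy as np

def collaborative_representation(D, y, lam=0.1):
    """l2-regularized least squares (ridge) coding of query y
    over dictionary D of known samples.

    D: (d, n) matrix whose columns are the n known samples.
    y: (d,) query vector.
    Solves w = argmin ||y - D w||^2 + lam ||w||^2, i.e.
    w = (D^T D + lam I)^{-1} D^T y (closed form; illustrative sketch).
    """
    n = D.shape[1]
    return np.linalg.solve(D.T @ D + lam * np.eye(n), D.T @ y)

# Toy check: a query built as a known combination of columns is
# recovered almost exactly when lam is small.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 5))
w_true = np.array([1.0, 0.0, -0.5, 0.0, 0.0])
y = D @ w_true
w = collaborative_representation(D, y, lam=1e-6)
print(np.round(w, 3))
```

The closed form is what makes CR cheap relative to the l1-based SR, which needs an iterative solver; the abstract's claim is that INN reaches comparable quality at kNN-like cost.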

Details

Language :
English
ISSN :
0031-3203
Volume :
48
Issue :
1
Database :
Academic Search Index
Journal :
Pattern Recognition
Publication Type :
Academic Journal
Accession number :
98809429
Full Text :
https://doi.org/10.1016/j.patcog.2014.07.011