Low-Rank Kernel Learning with Bregman Matrix Divergences.

Authors :
Kulis, Brian
Sustik, Mátyás A.
Dhillon, Inderjit S.
Source :
Journal of Machine Learning Research. 2/1/2009, Vol. 10 Issue 2, p341-376. 36p. 3 Diagrams, 6 Graphs.
Publication Year :
2009

Abstract

In this paper, we study low-rank matrix nearness problems, with a focus on learning low-rank positive semidefinite (kernel) matrices for machine learning applications. We propose efficient algorithms that scale linearly in the number of data points and quadratically in the rank of the input matrix. Existing algorithms for learning kernel matrices often scale poorly, with running times that are cubic in the number of data points. We employ Bregman matrix divergences as the measures of nearness; these divergences are natural for learning low-rank kernels since they preserve rank as well as positive semidefiniteness. Special cases of our framework yield faster algorithms for various existing learning problems, and experimental results demonstrate that our algorithms can effectively learn both low-rank and full-rank kernel matrices.
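For a concrete sense of the divergences the abstract refers to, below is a minimal Python sketch of two standard Bregman matrix divergences studied in this line of work: the von Neumann divergence, D_vN(X, Y) = tr(X log X - X log Y - X + Y), and the LogDet divergence, D_ld(X, Y) = tr(X Y^{-1}) - log det(X Y^{-1}) - n. This is an illustrative dense-matrix implementation, not the paper's algorithm (which exploits low-rank factored forms to achieve running times linear in the number of data points); the function names and test setup are assumptions for the example.

```python
import numpy as np
from scipy.linalg import logm

def von_neumann_divergence(X, Y):
    # D_vN(X, Y) = tr(X log X - X log Y - X + Y).
    # Assumes X and Y are symmetric positive definite so the
    # matrix logarithms are well defined and (essentially) real.
    return float(np.trace(X @ logm(X) - X @ logm(Y) - X + Y).real)

def logdet_divergence(X, Y):
    # D_ld(X, Y) = tr(X Y^{-1}) - log det(X Y^{-1}) - n.
    # log det(X Y^{-1}) is computed as log det X - log det Y for
    # numerical stability.
    n = X.shape[0]
    t = np.trace(X @ np.linalg.inv(Y))
    logdet = np.linalg.slogdet(X)[1] - np.linalg.slogdet(Y)[1]
    return float(t - logdet - n)

# Usage: both divergences are nonnegative and zero iff X == Y.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
X = A @ A.T + np.eye(4)   # random symmetric positive definite matrices
Y = B @ B.T + np.eye(4)
print(von_neumann_divergence(X, Y))   # > 0
print(logdet_divergence(X, Y))        # > 0
print(von_neumann_divergence(X, X))   # ~ 0
```

Both divergences blow up unless the range space of X is compatible with that of Y, which is one way to see the rank- and positive-semidefiniteness-preservation property the abstract claims.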

Details

Language :
English
ISSN :
1532-4435
Volume :
10
Issue :
2
Database :
Academic Search Index
Journal :
Journal of Machine Learning Research
Publication Type :
Academic Journal
Accession number :
58617873