
Fixed-rank matrix factorizations and Riemannian low-rank optimization.

Authors :
Mishra, Bamdev
Meyer, Gilles
Bonnabel, Silvère
Sepulchre, Rodolphe
Source :
Computational Statistics. Jun 2014, Vol. 29, Issue 3/4, p591-621. 31p.
Publication Year :
2014

Abstract

Motivated by the problem of learning a linear regression model whose parameter is a large fixed-rank non-symmetric matrix, we consider the optimization of a smooth cost function defined on the set of fixed-rank matrices. We adopt the geometric framework of optimization on Riemannian quotient manifolds. We study the underlying geometries of several well-known fixed-rank matrix factorizations and then exploit the Riemannian quotient geometry of the search space in the design of a class of gradient descent and trust-region algorithms. The proposed algorithms generalize our previous results on fixed-rank symmetric positive semidefinite matrices, apply to a broad range of applications, scale to high-dimensional problems, and confer a geometric basis to recent contributions on the learning of fixed-rank non-symmetric matrices. We make connections with existing algorithms in the context of low-rank matrix completion and discuss the usefulness of the proposed framework. Numerical experiments suggest that the proposed algorithms compete with state-of-the-art algorithms and that manifold optimization offers an effective and versatile framework for the design of machine learning algorithms that learn a fixed-rank matrix.
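The abstract mentions gradient-descent algorithms over fixed-rank factorizations, with low-rank matrix completion as a running application. As a minimal sketch of that problem class only (a plain Euclidean gradient descent on a factorized parameterization, not the paper's Riemannian quotient-manifold algorithms; all sizes and variable names below are illustrative assumptions):

```python
import numpy as np

# Illustrative sketch only: low-rank matrix completion by plain (Euclidean)
# gradient descent on the factorization X = G @ H.T, which fixes the rank r
# by construction. The paper's algorithms instead exploit the Riemannian
# quotient geometry of this search space; this sketch shares only the cost
# function and the fixed-rank parameterization.
rng = np.random.default_rng(0)
m, n, r = 30, 20, 2

# Ground-truth rank-r matrix and a random mask of observed entries (Omega).
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.5

# Small random initialization of the two factors.
G = 0.1 * rng.standard_normal((m, r))
H = 0.1 * rng.standard_normal((n, r))

step = 0.01
for _ in range(3000):
    R = mask * (G @ H.T - A)                     # residual on observed entries
    # Joint gradient step on f(G, H) = 0.5 * ||mask * (G H^T - A)||_F^2:
    # df/dG = R H, df/dH = R^T G.
    G, H = G - step * (R @ H), H - step * (R.T @ G)

# Root-mean-square error on the observed entries.
rmse = float(np.sqrt(np.mean((G @ H.T - A)[mask] ** 2)))
```

Note that the factorization is invariant under G → GM, H → HM⁻ᵀ for any invertible M, so distinct factor pairs represent the same matrix; this invariance is exactly the quotient structure that the paper's Riemannian framework is built to handle.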

Details

Language :
English
ISSN :
0943-4062
Volume :
29
Issue :
3/4
Database :
Academic Search Index
Journal :
Computational Statistics
Publication Type :
Academic Journal
Accession Number :
96286695
Full Text :
https://doi.org/10.1007/s00180-013-0464-z