
Near-optimal sample complexity for convex tensor completion.

Authors :
Ghadermarzy, Navid
Plan, Yaniv
Yilmaz, Özgür
Source :
Information & Inference: A Journal of the IMA. Sep 2019, Vol. 8, Issue 3, p577-619. 43p.
Publication Year :
2019

Abstract

We study the problem of estimating a low-rank tensor from noisy observations of a subset of its entries. A rank-$r$, order-$d$, $N \times N \times \cdots \times N$ tensor, where $r=O(1)$, has $O(dN)$ free variables. On the other hand, prior to our work, the best sample complexity achieved in the literature was $O\big(N^{\frac{d}{2}}\big)$, obtained by solving a tensor nuclear-norm minimization problem. In this paper, we consider the 'M-norm', an atomic norm whose atoms are rank-1 sign tensors. We also consider a generalization of the matrix max-norm to tensors, which results in a quasi-norm that we call 'max-qnorm'. We prove that solving an M-norm constrained least squares (LS) problem yields nearly optimal sample complexity for low-rank tensor completion (TC); a similar result holds for the max-qnorm. Furthermore, we show that these bounds are nearly minimax rate-optimal. We also provide promising numerical results for max-qnorm constrained TC, showing improved recovery compared to matricization and alternating LS.
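To make the problem setup concrete, the sketch below generates a low-rank order-3 tensor, observes a random subset of its entries with noise, and fits a factorized least-squares model by gradient descent. The row-norm clipping step is only a crude, hypothetical proxy for a max-qnorm-style constraint on the factors (the bound value and all sizes are illustrative assumptions, not the paper's algorithm or guarantees):

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: a rank-r, order-3 tensor of size N x N x N (illustrative sizes).
N, r = 20, 2
U0, V0, W0 = (rng.standard_normal((N, r)) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', U0, V0, W0)

# Observe a random subset of entries, corrupted by small Gaussian noise.
m = 2400  # number of observed entries (about 30% of N^3 here)
mask = np.zeros(T.size, dtype=bool)
mask[rng.choice(T.size, size=m, replace=False)] = True
mask = mask.reshape(T.shape)
Y = T + 0.01 * rng.standard_normal(T.shape)

# Factorized least squares on the observed entries, with row-norm clipping
# as a crude stand-in for a max-qnorm-style constraint (bound is hypothetical).
U, V, W = (0.5 * rng.standard_normal((N, r)) for _ in range(3))
bound, lr = 5.0, 0.5
losses = []
for _ in range(3000):
    X = np.einsum('ir,jr,kr->ijk', U, V, W)
    R = np.where(mask, X - Y, 0.0)          # residual on observed entries only
    losses.append(0.5 * np.sum(R**2) / m)
    gU = np.einsum('ijk,jr,kr->ir', R, V, W) / m
    gV = np.einsum('ijk,ir,kr->jr', R, U, W) / m
    gW = np.einsum('ijk,ir,jr->kr', R, U, V) / m
    U -= lr * gU
    V -= lr * gV
    W -= lr * gW
    for F in (U, V, W):                     # clip factor rows to the norm bound
        norms = np.linalg.norm(F, axis=1, keepdims=True)
        np.divide(F, np.maximum(norms / bound, 1.0), out=F)

X = np.einsum('ir,jr,kr->ijk', U, V, W)
rel_err = np.linalg.norm(X - T) / np.linalg.norm(T)
```

The factorized parameterization has only $3Nr$ unknowns, which is why sample complexities near $O(dN)$ are plausible here, in contrast to the $O(N^{d/2})$ observations needed by nuclear-norm matricization approaches.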

Details

Language :
English
ISSN :
2049-8764
Volume :
8
Issue :
3
Database :
Academic Search Index
Journal :
Information & Inference: A Journal of the IMA
Publication Type :
Academic Journal
Accession number :
138621845
Full Text :
https://doi.org/10.1093/imaiai/iay019