Flat minima generalize for low-rank matrix recovery.
- Author
- Ding, Lijun, Drusvyatskiy, Dmitriy, Fazel, Maryam, and Harchaoui, Zaid
- Subjects
- Low-rank matrices, Covariance matrices, Principal components analysis
- Abstract
- Empirical evidence suggests that for a variety of overparameterized nonlinear models, most notably in neural network training, the growth of the loss around a minimizer strongly impacts its performance. Flat minima, those around which the loss grows slowly, appear to generalize well. This work takes a step towards understanding this phenomenon by focusing on the simplest class of overparameterized nonlinear models: those arising in low-rank matrix recovery. We analyse overparameterized matrix and bilinear sensing, robust principal component analysis, covariance matrix estimation, and single-hidden-layer neural networks with quadratic activation functions. In all cases, we show that flat minima, measured by the trace of the Hessian, exactly recover the ground truth under standard statistical assumptions. For matrix completion, we establish weak recovery, although empirical evidence suggests exact recovery holds here as well. We complete the paper with synthetic experiments that illustrate our findings.
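The setting in the abstract admits a small self-contained illustration. The sketch below sets up overparameterized symmetric matrix sensing in JAX and evaluates the flatness measure named in the abstract, the trace of the Hessian of the loss, at two global minimizers: one found by gradient descent and the zero-padded ground-truth factor. All dimensions, the Gaussian measurement model, and the optimizer hyperparameters are illustrative assumptions, not taken from the paper.

```python
import jax
import jax.numpy as jnp

# Illustrative sketch: overparameterized symmetric matrix sensing.
# Dimensions, measurement model, and hyperparameters are assumptions.
key = jax.random.PRNGKey(0)
d, r, m = 8, 2, 120                      # ambient dim, true rank, #measurements

k1, k2, k3 = jax.random.split(key, 3)
U_star = jax.random.normal(k1, (d, r))
X_star = U_star @ U_star.T               # rank-r ground truth

A = jax.random.normal(k2, (m, d, d))     # Gaussian sensing matrices A_i
y = jnp.einsum('mij,ij->m', A, X_star)   # noiseless measurements y_i = <A_i, X*>

def loss(U):
    # Overparameterized factorization: U is d x d, so the model can
    # represent any PSD matrix, not only rank-r ones.
    preds = jnp.einsum('mij,ij->m', A, U @ U.T)
    return 0.5 / m * jnp.sum((preds - y) ** 2)

def flatness(U):
    # Flatness measure from the abstract: trace of the Hessian of the
    # loss at U, computed by brute force via autodiff.
    H = jax.hessian(loss)(U).reshape(U.size, U.size)
    return jnp.trace(H)

# Plain gradient descent from a small random init, just to find *a*
# global minimizer; step size and iteration count may need tuning.
U = 0.1 * jax.random.normal(k3, (d, d))
step = jax.jit(lambda U: U - 0.01 * jax.grad(loss)(U))
for _ in range(5000):
    U = step(U)

# Compare flatness at the recovered minimizer and at the zero-padded
# ground-truth factor, which is also a global minimizer.
U_pad = jnp.concatenate([U_star, jnp.zeros((d, d - r))], axis=1)
err = jnp.linalg.norm(U @ U.T - X_star) / jnp.linalg.norm(X_star)
print(f"relative recovery error: {err:.2e}")
print(f"tr Hessian at found minimizer: {flatness(U):.3f}")
print(f"tr Hessian at ground truth:    {flatness(U_pad):.3f}")
```

Under the paper's result one would expect the Hessian trace among global minimizers to be smallest at factorizations of the ground truth, which the two printed values are meant to illustrate on this toy instance.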
- Published
- 2024