
Average-Case Matrix Discrepancy: Asymptotics and Online Algorithms

Authors: Kunisky, Dmitriy; Zhang, Peiyuan
Publication Year: 2023

Abstract

We study the operator norm discrepancy of i.i.d. random matrices, initiating the matrix-valued analog of a long line of work on the $\ell^{\infty}$ norm discrepancy of i.i.d. random vectors. First, using repurposed results on vector discrepancy and new first moment method calculations, we give upper and lower bounds on the discrepancy of random matrices. We treat i.i.d. matrices drawn from the Gaussian orthogonal ensemble (GOE) and low-rank Gaussian Wishart distributions. In both cases, for what turns out to be the "critical" number of $\Theta(n^2)$ matrices of dimension $n \times n$, we identify the discrepancy up to constant factors. Second, we give a new analysis of the matrix hyperbolic cosine algorithm of Zouzias (2011), a matrix version of an online vector discrepancy algorithm of Spencer (1977) studied for average-case inputs by Bansal and Spencer (2020), for the case of i.i.d. random matrix inputs. We give both a general analysis and concrete bounds on the discrepancy achieved by this algorithm for matrices with independent entries (including GOE matrices) and for Gaussian Wishart matrices.

Comment: 34 pages. Some asymptotic results have been strengthened to include upper bounds (hence the change in title); the asymptotic result for general distributions (formerly Theorem 1.14, now Theorem 1.10) and its application to Wishart matrices (formerly Corollary 1.15, now Corollary 1.13) have been reformulated with a substantially different proof after an error was pointed out by Peng Zhang.
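To make the online setting concrete: the algorithm sees matrices one at a time and must immediately assign each a sign $\pm 1$ so that the operator norm of the running signed sum stays small. The sketch below is a minimal toy illustration in the spirit of a hyperbolic-cosine potential rule, not the paper's algorithm or analysis; the scaling parameter `gamma`, the GOE normalization, and the choice of $m = \Theta(n^2)$ inputs are assumptions made only for this example.

```python
# Toy sketch: online signing of i.i.d. GOE matrices by greedily minimizing a
# hyperbolic-cosine potential Tr cosh(gamma * S) of the running signed sum S.
# This is an illustration of the problem setup, not the paper's algorithm.

import numpy as np


def goe(n, rng):
    """Sample an n x n GOE matrix (symmetric, Gaussian entries)."""
    g = rng.standard_normal((n, n))
    return (g + g.T) / np.sqrt(2)


def cosh_potential(s, gamma):
    """Tr cosh(gamma * S), computed from the eigenvalues of the symmetric S."""
    return np.sum(np.cosh(gamma * np.linalg.eigvalsh(s)))


def online_sign_sequence(matrices, gamma=0.1):
    """Choose signs eps_i in {+1, -1} online, each minimizing the potential of
    the running signed sum; return the signs and the final operator norm."""
    n = matrices[0].shape[0]
    s = np.zeros((n, n))
    signs = []
    for a in matrices:
        if cosh_potential(s + a, gamma) <= cosh_potential(s - a, gamma):
            eps = 1
        else:
            eps = -1
        s += eps * a
        signs.append(eps)
    return signs, np.linalg.norm(s, 2)  # spectral (operator) norm


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m = 20, 400  # m = Theta(n^2) matrices: the "critical" regime above
    mats = [goe(n, rng) for _ in range(m)]
    _, disc = online_sign_sequence(mats)
    print("operator norm of signed sum:   ", disc)
    print("operator norm of all-(+1) sum: ", np.linalg.norm(sum(mats), 2))
```

Running the sketch compares the signed sum against the trivial all-$(+1)$ signing; the gap between the two is the kind of discrepancy improvement the paper quantifies asymptotically.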

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2307.10055
Document Type: Working Paper