
Tail Bounds for All Eigenvalues of a Sum of Random Matrices

Authors :
Gittens, Alex A.
Tropp, Joel A.
Publication Year :
2024
Publisher :
California Institute of Technology, 2024.

Abstract

This work introduces the minimax Laplace transform method, a modification of the cumulant-based matrix Laplace transform method developed in [Tro11c] that yields both upper and lower bounds on each eigenvalue of a sum of random self-adjoint matrices. This machinery is used to derive eigenvalue analogs of the classical Chernoff, Bennett, and Bernstein bounds. Two examples demonstrate the efficacy of the minimax Laplace transform. The first concerns the effects of column sparsification on the spectrum of a matrix with orthonormal rows. Here, the behavior of the singular values can be described in terms of coherence-like quantities. The second example addresses the question of relative accuracy in the estimation of eigenvalues of the covariance matrix of a random process. Standard results on the convergence of sample covariance matrices provide bounds on the number of samples needed to obtain relative accuracy in the spectral norm, but these results only guarantee relative accuracy in the estimate of the maximum eigenvalue. The minimax Laplace transform argument establishes that if the lowest eigenvalues decay sufficiently fast, Ω(ε^(-2) κ_ℓ^2 ℓ log p) samples, where κ_ℓ = λ_1(C)/λ_ℓ(C), are sufficient to ensure that the dominant ℓ eigenvalues of the covariance matrix of a N(0,C) random vector are estimated to within a factor of 1 ± ε with high probability.
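The sample-complexity statement above can be illustrated numerically. The following is a minimal sketch, not the authors' method: it fixes an assumed covariance C with a fast-decaying lower spectrum, draws n ≈ ε^(-2) κ_ℓ^2 ℓ log p Gaussian samples, and checks the relative error of the dominant ℓ eigenvalues of the sample covariance. All dimensions, spectra, and constants here are illustrative choices, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

p = 200      # ambient dimension (assumed for illustration)
ell = 5      # number of dominant eigenvalues to track
eps = 0.1    # target relative accuracy

# Covariance with fast-decaying lower eigenvalues (an assumption for this sketch).
eigs = np.concatenate([np.linspace(10.0, 5.0, ell),
                       1e-3 / np.arange(1, p - ell + 1)])
C = np.diag(eigs)

kappa_ell = eigs[0] / eigs[ell - 1]                          # κ_ℓ = λ_1(C) / λ_ℓ(C)
n = int(np.ceil(eps**-2 * kappa_ell**2 * ell * np.log(p)))   # ~ ε^(-2) κ_ℓ^2 ℓ log p samples

# Draw n samples from N(0, C) and form the sample covariance.
X = rng.multivariate_normal(np.zeros(p), C, size=n)
C_hat = X.T @ X / n

# Compare the top ℓ sample eigenvalues against the true ones.
true_top = eigs[:ell]
est_top = np.sort(np.linalg.eigvalsh(C_hat))[::-1][:ell]
rel_err = np.abs(est_top - true_top) / true_top

print("samples n =", n)
print("max relative error on dominant", ell, "eigenvalues:", rel_err.max())
```

With these illustrative parameters the observed maximum relative error should typically fall near or below ε, consistent with the relative-accuracy guarantee described in the abstract.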

Details

Language :
English
Database :
OpenAIRE
Accession number :
edsair.doi...........c45c71bf587a4474f47babee24aeff9a
Full Text :
https://doi.org/10.7907/tz8n-h623