
Preconditioning without a preconditioner: faster ridge-regression and Gaussian sampling with randomized block Krylov subspace methods

Authors:
Chen, Tyler
Huber, Caroline
Lin, Ethan
Zaid, Hajar
Publication Year: 2025

Abstract

We describe a randomized variant of the block conjugate gradient method for solving a single positive-definite linear system of equations. Our method provably outperforms preconditioned conjugate gradient with a broad class of Nyström-based preconditioners, without ever explicitly constructing a preconditioner. In analyzing our algorithm, we derive theoretical guarantees for new variants of Nyström preconditioned conjugate gradient, which may be of independent interest. We also show how our approach yields state-of-the-art algorithms for key data-science tasks such as computing the entire ridge-regression regularization path and generating multiple independent samples from a high-dimensional Gaussian distribution.
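For context, the Nyström-preconditioned conjugate gradient baseline that the abstract refers to can be sketched as follows. This is a minimal NumPy illustration of the standard approach (build a randomized Nyström approximation, then use it as a preconditioner for a ridge-shifted system), not the paper's preconditioner-free algorithm; the helper names `nystrom_approx` and `pcg`, the toy problem, and all parameter choices are assumptions for illustration.

```python
import numpy as np

def nystrom_approx(A, rank, rng):
    """Randomized Nystrom approximation A ~ U diag(lam) U^T (illustrative sketch)."""
    n = A.shape[0]
    Omega, _ = np.linalg.qr(rng.standard_normal((n, rank)))  # test matrix
    Y = A @ Omega
    nu = np.finfo(float).eps * np.linalg.norm(Y)             # tiny shift for stability
    Yshift = Y + nu * Omega
    C = np.linalg.cholesky(Omega.T @ Yshift)                 # core matrix factor
    B = np.linalg.solve(C, Yshift.T).T                       # B = Yshift @ inv(C).T
    U, s, _ = np.linalg.svd(B, full_matrices=False)
    lam = np.maximum(s**2 - nu, 0.0)                         # eigenvalue estimates
    return U, lam

def pcg(matvec, b, Minv, tol=1e-10, maxit=500):
    """Preconditioned CG for matvec(x) = b, given a preconditioner solve Minv."""
    x = np.zeros_like(b)
    r = b.copy()
    z = Minv(r)
    p = z.copy()
    rz = r @ z
    for it in range(1, maxit + 1):
        Ap = matvec(p)
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        z = Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, it

# Toy ridge-type system (A + mu*I) x = b with fast spectral decay (assumed setup).
rng = np.random.default_rng(0)
n, rank, mu = 200, 40, 1e-3
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = (Q * (1.0 / np.arange(1, n + 1) ** 2)) @ Q.T   # eigenvalues 1/k^2
b = rng.standard_normal(n)

U, lam = nystrom_approx(A, rank, rng)

def Minv(v):
    # Nystrom preconditioner solve for A + mu*I: exact on the captured
    # invariant subspace, identity (up to scaling) on its complement.
    Utv = U.T @ v
    return (lam[-1] + mu) * (U @ (Utv / (lam + mu))) + (v - U @ Utv)

x, iters = pcg(lambda v: A @ v + mu * v, b, Minv)
residual = np.linalg.norm(A @ x + mu * x - b) / np.linalg.norm(b)
```

Because the Nyström factor captures the dominant eigenspace, the preconditioned system has a much smaller effective condition number and CG converges in far fewer iterations than on the raw shifted matrix.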

Subjects

Mathematics - Numerical Analysis

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2501.18717
Document Type: Working Paper