
From Fixed-X to Random-X Regression: Bias-Variance Decompositions, Covariance Penalties, and Prediction Error Estimation.

Authors :
Rosset, Saharon
Tibshirani, Ryan J.
Source :
Journal of the American Statistical Association. Mar 2020, Vol. 115 Issue 529, p138-151. 14p.
Publication Year :
2020

Abstract

In statistical prediction, classical approaches for model selection and model evaluation based on covariance penalties are still widely used. Most of the literature on this topic is based on what we call the "Fixed-X" assumption, where covariate values are assumed to be nonrandom. By contrast, it is often more reasonable to take a "Random-X" view, where the covariate values are independently drawn for both training and prediction. To study the applicability of covariance penalties in this setting, we propose a decomposition of Random-X prediction error in which the randomness in the covariates contributes to both the bias and variance components. This decomposition is general, but we concentrate on the fundamental case of ordinary least-squares (OLS) regression. We prove that in this setting the move from Fixed-X to Random-X prediction results in an increase in both bias and variance. When the covariates are normally distributed and the linear model is unbiased, all terms in this decomposition are explicitly computable, which yields an extension of Mallows' Cp that we call RCp. RCp also holds asymptotically for certain classes of nonnormal covariates. When the noise variance is unknown, plugging in the usual unbiased estimate leads to an approach that we call RCp-hat, which is closely related to Sp and to generalized cross-validation (GCV). For excess bias, we propose an estimate based on the "shortcut formula" for ordinary cross-validation (OCV), resulting in an approach we call RCp+. Theoretical arguments and numerical simulations suggest that RCp+ is typically superior to OCV, though the difference is small. We further examine the Random-X error of other popular estimators. The surprising result we get for ridge regression is that, in the heavily regularized regime, Random-X variance is smaller than Fixed-X variance, which can lead to smaller overall Random-X error. Supplementary materials for this article are available online. [ABSTRACT FROM AUTHOR]
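
The quantities named in the abstract are easy to illustrate for OLS. The sketch below is a hypothetical illustration, not code from the paper or its supplement: it computes the training error, Mallows' Cp with the usual plug-in noise-variance estimate, GCV, and OCV via the standard leverage-based shortcut formula, and compares them with the classical Gaussian-design Random-X risk sigma^2 * (1 + p / (n - p - 1)) for an unbiased OLS fit, which is the benchmark motivating RCp. The exact definitions of RCp, RCp-hat, and RCp+ are given in the article itself.

    # Minimal sketch (assumed setup, not the authors' code): Fixed-X vs.
    # Random-X error estimates for OLS. Cp, GCV, and the LOOCV shortcut are
    # standard formulas; the Gaussian-design risk is the classical Random-X
    # benchmark for an unbiased linear model.
    import numpy as np

    rng = np.random.default_rng(0)
    n, p, sigma = 200, 10, 1.0

    X = rng.standard_normal((n, p))      # Random-X: rows drawn i.i.d. N(0, I)
    beta = rng.standard_normal(p)
    y = X @ beta + sigma * rng.standard_normal(n)

    # OLS fit; the diagonal of the hat matrix H = X (X'X)^{-1} X' gives leverages
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ beta_hat
    h = np.einsum("ij,ji->i", X, np.linalg.solve(X.T @ X, X.T))

    rss = np.sum((y - y_hat) ** 2)
    err_train = rss / n
    sigma2_hat = rss / (n - p)           # usual unbiased noise-variance estimate

    # Fixed-X covariance penalty: Mallows' Cp with the plug-in variance estimate
    cp = err_train + 2 * sigma2_hat * p / n

    # Generalized cross-validation: training error inflated by (1 - tr(H)/n)^2,
    # and tr(H) = p for OLS
    gcv = err_train / (1 - p / n) ** 2

    # Ordinary (leave-one-out) cross-validation via the shortcut formula:
    # leverage-corrected residuals, no refitting required
    ocv = np.mean(((y - y_hat) / (1 - h)) ** 2)

    # Classical Random-X risk for OLS under Gaussian covariates (unbiased model,
    # n > p + 1): E[(y0 - x0' beta_hat)^2] = sigma^2 * (1 + p / (n - p - 1))
    random_x_risk = sigma ** 2 * (1 + p / (n - p - 1))

    print(f"train err {err_train:.3f}  Cp {cp:.3f}  GCV {gcv:.3f}  "
          f"OCV {ocv:.3f}  Gaussian-X risk {random_x_risk:.3f}")

With n much larger than p, all four estimates land close to the Gaussian-design risk; shrinking n toward p + 1 widens the gap between the Fixed-X penalty (Cp) and the Random-X benchmark, which is the regime where the paper's RCp-style corrections matter.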

Details

Language :
English
ISSN :
0162-1459
Volume :
115
Issue :
529
Database :
Academic Search Index
Journal :
Journal of the American Statistical Association
Publication Type :
Academic Journal
Accession Number :
142372899
Full Text :
https://doi.org/10.1080/01621459.2018.1424632