1. On Variational Bayes Estimation and Variational Information Criteria for Linear Regression Models
- Author
Samuel Müller, John T. Ormerod, and Chong You
- Subjects
Statistics and Probability, Deviance information criterion, Mathematical optimization, Frequentist inference, Bayesian information criterion, Prior probability, Statistical inference, Statistics, Probability and Uncertainty, Akaike information criterion, Bayesian linear regression, Bayesian inference, Mathematics
- Abstract
Summary Variational Bayes (VB) estimation is a fast alternative to Markov chain Monte Carlo for performing approximate Bayesian inference. This procedure can be an efficient and effective means of analyzing large datasets. However, VB estimation is often criticised, typically on empirical grounds, for being unable to produce valid statistical inferences. In this article we refute this criticism for one of the simplest models where Bayesian inference is not analytically tractable, namely the Bayesian linear model (for a particular choice of priors). We prove that, under mild regularity conditions, VB-based estimators enjoy desirable frequentist properties such as consistency, and can be used to obtain asymptotically valid standard errors. In addition to these results we introduce two VB information criteria: the variational Akaike information criterion and the variational Bayesian information criterion. We show that the variational Akaike information criterion is asymptotically equivalent to the frequentist Akaike information criterion and that the variational Bayesian information criterion is first-order equivalent to the Bayesian information criterion in linear regression. These results motivate the potential use of the variational information criteria for more complex models. We support our theoretical results with numerical examples.
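To make the setting concrete, the following is a minimal sketch of mean-field VB (coordinate-ascent) for a Bayesian linear model with a Gaussian prior on the coefficients and an inverse-gamma prior on the noise variance. The hyperparameter names (`sigma_beta2`, `A`, `B0`) and the exact prior specification are illustrative assumptions, not necessarily the priors analysed in the paper.

```python
import numpy as np

def vb_linear_regression(X, y, sigma_beta2=100.0, A=0.01, B0=0.01,
                         max_iter=200, tol=1e-10):
    """Mean-field VB for y ~ N(X beta, sigma2 I),
    beta ~ N(0, sigma_beta2 I), sigma2 ~ Inverse-Gamma(A, B0).
    Alternates closed-form updates of q(beta) = N(mu, Sigma)
    and q(sigma2) = Inverse-Gamma(A_q, B_q)."""
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    A_q = A + 0.5 * n          # shape parameter of q(sigma2); fixed
    E_inv_sigma2 = 1.0         # initial guess for E_q[1/sigma2]
    for _ in range(max_iter):
        # Update q(beta): Gaussian with moments below.
        Sigma = np.linalg.inv(E_inv_sigma2 * XtX + np.eye(p) / sigma_beta2)
        mu = E_inv_sigma2 * Sigma @ Xty
        # Update q(sigma2): rate uses E_q[||y - X beta||^2].
        resid = y - X @ mu
        B_q = B0 + 0.5 * (resid @ resid + np.trace(XtX @ Sigma))
        E_new = A_q / B_q
        if abs(E_new - E_inv_sigma2) < tol:
            E_inv_sigma2 = E_new
            break
        E_inv_sigma2 = E_new
    return mu, Sigma, A_q, B_q

# Usage: with a diffuse prior, the variational posterior mean of beta
# is close to the ordinary least-squares estimate, consistent with the
# frequentist properties discussed in the abstract.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.5 * rng.standard_normal(200)
mu, Sigma, A_q, B_q = vb_linear_regression(X, y)
```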
- Published
- 2014