Posterior Averaging Information Criterion.
- Source :
- Entropy, Mar 2023, Vol. 25, Issue 3, p468. 18p.
- Publication Year :
- 2023
Abstract
- We propose a new model selection method, named the posterior averaging information criterion, for Bayesian model assessment to minimize the risk of predicting independent future observations. The theoretical foundation is built on the Kullback–Leibler divergence to quantify the similarity between the proposed candidate model and the underlying true model. From a Bayesian perspective, our method evaluates the candidate models over the entire posterior distribution in terms of predicting a future independent observation. Without assuming that the true distribution is contained in the candidate models, the new criterion is developed by correcting the asymptotic bias of the posterior mean of the in-sample log-likelihood against the out-of-sample log-likelihood, and it can be applied generally, even for Bayesian models with degenerate non-informative priors. Simulations in both normal and binomial settings demonstrate superior small-sample performance. [ABSTRACT FROM AUTHOR]
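The core construction described in the abstract — averaging the in-sample log-likelihood over the posterior, then subtracting an estimate of its asymptotic bias relative to out-of-sample prediction — can be illustrated with a DIC-style sketch. This is an analogue in the same spirit, not the authors' exact PAIC formula; the normal model, flat prior, and all numbers below are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data from N(0.5, 1); candidate model: N(mu, 1), known variance,
# with a flat (degenerate non-informative) prior on mu.
n = 50
y = rng.normal(0.5, 1.0, size=n)

# Under the flat prior, the posterior for mu is N(ybar, 1/n).
ybar = y.mean()
mu_post = rng.normal(ybar, np.sqrt(1.0 / n), size=4000)  # posterior draws

def loglik(mu):
    """Total in-sample log-likelihood of y under N(mu, 1), vectorized over draws."""
    return (-0.5 * n * np.log(2 * np.pi)
            - 0.5 * np.sum((y[None, :] - mu[:, None]) ** 2, axis=1))

ll = loglik(mu_post)

# Posterior mean of the in-sample log-likelihood.
ll_bar = ll.mean()

# DIC-style effective number of parameters, used here as a stand-in for the
# asymptotic bias correction; for this one-parameter model it is close to 1.
p_eff = 2 * (loglik(np.array([mu_post.mean()]))[0] - ll_bar)

# Bias-corrected predictive criterion: higher is better.
criterion = ll_bar - p_eff
print(round(p_eff, 2), round(criterion, 2))
```

The point of the sketch is the shape of the computation: the whole posterior contributes through `ll_bar` (rather than a single plug-in estimate), and the correction term accounts for in-sample optimism when judging predictive fit.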
- Subjects :
- *RISK assessment
*PREDICTION models
Details
- Language :
- English
- ISSN :
- 1099-4300
- Volume :
- 25
- Issue :
- 3
- Database :
- Academic Search Index
- Journal :
- Entropy
- Publication Type :
- Academic Journal
- Accession number :
- 162812626
- Full Text :
- https://doi.org/10.3390/e25030468