A New Statistic for Bayesian Hypothesis Testing
- Author
- Stephen G. Walker and Su Chen
- Subjects
Statistics and Probability, Economics and Econometrics, Bayes factor, Statistics::Computation, Frequentist inference, Statistics, Prior probability, Test statistic, Statistics::Methodology, Statistics, Probability and Uncertainty, Divergence (statistics), Statistic, Type I and type II errors, Mathematics, Statistical hypothesis testing
- Abstract
A new Bayesian-inspired statistic for hypothesis testing is proposed which compares two posterior distributions: the observed posterior and the expected posterior under the null model. The Kullback–Leibler divergence between the two posterior distributions yields a test statistic which can be interpreted as a penalized log-Bayes factor, with the penalty term converging to a constant as the sample size increases. Hence, asymptotically, the statistic behaves as a Bayes factor. Viewed as a penalized Bayes factor, this approach resolves the long-standing issue of using improper priors with the Bayes factor, since only posterior summaries are needed for the new statistic. A further motivation is that the new statistic is a minimal departure from the Bayes factor: it requires neither tuning nor a split of the data into training and inference sets, and it can accommodate improper priors. Critical regions for the test can be assessed using frequentist notions of Type I error.
- Published
- 2023