A general divergence criterion for prior selection.
- Source :
- Annals of the Institute of Statistical Mathematics; Feb 2011, Vol. 63, Issue 1, p43-58, 16p
- Publication Year :
- 2011
Abstract
- The paper revisits the problem of prior selection for a regular one-parameter family of distributions. The goal is to find an 'objective' or 'default' prior by approximate maximization of the distance between the prior and the posterior under a general divergence criterion, as introduced by Amari (Ann Stat 10:357-387, 1982) and Cressie and Read (J R Stat Soc Ser B 46:440-464, 1984). The maximization is based on an asymptotic expansion of this distance. The Kullback-Leibler, Bhattacharyya-Hellinger and Chi-square divergences are special cases of this general divergence criterion. It is shown that, with the exception of one particular case, namely the Chi-square divergence, the general divergence criterion yields Jeffreys' prior. For the Chi-square divergence, we obtain a prior different from Jeffreys' and also from that of Clarke and Sun (Sankhya Ser A 59:215-231, 1997). [ABSTRACT FROM AUTHOR]
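The general divergence criterion referenced in the abstract is the Cressie-Read power divergence family, in which the Kullback-Leibler, Bhattacharyya-Hellinger, and Chi-square divergences arise as particular parameter values. The following is a minimal numerical sketch of that family for discrete distributions, together with Jeffreys' prior for the Bernoulli model as an example of the prior the criterion typically selects; the function names are illustrative and this is not the paper's asymptotic derivation.

```python
import math

def cressie_read(p, q, lam):
    """Cressie-Read power divergence between discrete distributions p and q:

        I_lam(p, q) = 1 / (lam * (lam + 1)) * sum_i p_i * ((p_i / q_i)**lam - 1)

    lam = 1 gives half the Pearson Chi-square divergence, lam = -1/2 gives
    four times the squared Hellinger distance, and the limits lam -> 0 and
    lam -> -1 give the two directed Kullback-Leibler divergences.
    """
    if abs(lam) < 1e-12:        # KL(p || q), taken by continuity
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    if abs(lam + 1) < 1e-12:    # KL(q || p), taken by continuity
        return sum(qi * math.log(qi / pi) for pi, qi in zip(p, q))
    c = 1.0 / (lam * (lam + 1))
    return c * sum(pi * ((pi / qi) ** lam - 1) for pi, qi in zip(p, q))

def jeffreys_bernoulli(theta):
    """Jeffreys' prior (unnormalized) for Bernoulli(theta).

    The Fisher information is I(theta) = 1 / (theta * (1 - theta)), so the
    prior sqrt(I(theta)) is the Beta(1/2, 1/2) kernel.
    """
    return 1.0 / math.sqrt(theta * (1 - theta))
```

For instance, with `p = [0.2, 0.3, 0.5]` and `q = [0.25, 0.25, 0.5]`, `cressie_read(p, q, 1.0)` equals half the Pearson Chi-square statistic and `cressie_read(p, q, -0.5)` equals four times the squared Hellinger distance, illustrating how the single family covers the special cases named in the abstract.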
Details
- Language :
- English
- ISSN :
- 0020-3157
- Volume :
- 63
- Issue :
- 1
- Database :
- Complementary Index
- Journal :
- Annals of the Institute of Statistical Mathematics
- Publication Type :
- Academic Journal
- Accession number :
- 56794847
- Full Text :
- https://doi.org/10.1007/s10463-009-0226-4