
Mixture Models, Bayes Fisher Information, and Divergence Measures.

Authors :
Asadi, Majid
Kharazmi, Omid
Ebrahimi, Nader
Soofi, Ehsan S.
Source :
IEEE Transactions on Information Theory. Apr 2019, Vol. 65, Issue 4, pp. 2316-2321. 6p.
Publication Year :
2019

Abstract

This paper presents Bayes Fisher information measures, defined as the expected Fisher information under a distribution for the parameter, for the arithmetic, geometric, and generalized mixtures of two probability density functions. The Fisher information of the arithmetic mixture about the mixing parameter is related to the chi-square divergence, Shannon entropy, and the Jensen–Shannon divergence. The Bayes Fisher measures of the three mixture models are related to the Kullback–Leibler, Jeffreys, Jensen–Shannon, Rényi, and Tsallis divergences. These measures indicate that the farther apart the two components are, the more informative the data are about the mixing parameter. We also unify three different relative-entropy derivations of the geometric mixture that are scattered across the statistics and physics literatures. Extending two of these formulations to the minimization of the Tsallis divergence yields the generalized mixture as the solution.
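To make the arithmetic-mixture result concrete: writing f_alpha = alpha*f1 + (1-alpha)*f2, the score with respect to the mixing parameter is (f1 - f2)/f_alpha, so the Fisher information about alpha is I(alpha) = ∫ (f1 - f2)^2 / f_alpha dx; at the endpoints this reduces to the chi-square divergences, I(0) = χ²(f1 || f2) and I(1) = χ²(f2 || f1). The sketch below is not taken from the paper: the two-Gaussian components, the uniform prior on alpha, and all function names are illustrative assumptions, used only to evaluate I(alpha) numerically and average it over the prior, which is one instance of a Bayes Fisher information measure.

```python
# Sketch (assumptions, not the paper's code): Fisher information of an
# arithmetic mixture f_alpha = alpha*f1 + (1-alpha)*f2 about the mixing
# parameter alpha, I(alpha) = integral (f1 - f2)^2 / f_alpha dx, and its
# average under a uniform prior on alpha (a Bayes Fisher information).
import numpy as np
from scipy.stats import norm

def fisher_info_mixing(f1, f2, alpha, x):
    """I(alpha) for the arithmetic mixture, by numerical quadrature on x."""
    p1, p2 = f1(x), f2(x)
    f_alpha = alpha * p1 + (1 - alpha) * p2
    return np.trapz((p1 - p2) ** 2 / f_alpha, x)

def bayes_fisher_uniform(f1, f2, x, n_alpha=99):
    """Expected Fisher information under a uniform prior on alpha."""
    alphas = np.linspace(0.01, 0.99, n_alpha)
    return np.trapz([fisher_info_mixing(f1, f2, a, x) for a in alphas], alphas)

x = np.linspace(-12, 12, 4001)
for delta in (0.5, 1.0, 2.0):  # separation between the component means
    f1 = lambda t, d=delta: norm.pdf(t, loc=-d / 2)
    f2 = lambda t, d=delta: norm.pdf(t, loc=+d / 2)
    print(f"separation {delta}: Bayes Fisher info ≈ "
          f"{bayes_fisher_uniform(f1, f2, x):.4f}")
# The printed values increase with the separation, matching the abstract's
# qualitative claim: the farther apart the components, the more informative
# the data are about the mixing parameter.
```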

Details

Language :
English
ISSN :
0018-9448
Volume :
65
Issue :
4
Database :
Academic Search Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
135443223
Full Text :
https://doi.org/10.1109/TIT.2018.2877608