Accelerated Monte Carlo for Kullback-Leibler divergence between Gaussian mixture models
- Source: ICASSP
- Publication Year: 2008
- Publisher: IEEE, 2008
Abstract
- Kullback-Leibler (KL) divergence is widely used as a measure of dissimilarity between two probability distributions; however, the required integral is not tractable for Gaussian mixture models (GMMs), and naive Monte Carlo sampling methods can be expensive. Our work aims to improve sampling-based estimation of the KL divergence for GMMs. We show how to accelerate Monte Carlo sampling using variational approximations of the KL divergence. To this end we employ two different methodologies: control variates and importance sampling. With control variates, we use sampling to estimate the difference between the variational approximation and the unknown KL divergence. With importance sampling, we estimate the KL divergence directly, using a sampling distribution derived from the variational approximation. We show that with these techniques we can achieve improvements in accuracy equivalent to using a factor of 30 times more samples.
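
The sketch below is a minimal NumPy illustration of the two estimators the abstract describes, for hypothetical 1-D GMMs f and g; it is not the authors' code. Where the paper uses its variational approximation of the KL divergence, this sketch substitutes simpler stand-ins: the control variate is the log-ratio of moment-matched single Gaussians (whose expectation under f is available in closed form), and the importance-sampling proposal is a single variance-inflated Gaussian.

```python
# Illustrative sketch: naive Monte Carlo, control-variate, and importance-sampling
# estimators of KL(f || g) for 1-D Gaussian mixture models. All parameters below
# are made-up examples; the control variate and proposal are simple stand-ins for
# the variational approximation used in the paper.
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(0)

def gmm_logpdf(x, w, mu, sigma):
    """Log-density of a 1-D GMM with weights w, means mu, std devs sigma."""
    x = np.asarray(x)[:, None]
    comp = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))
    return logsumexp(comp + np.log(w), axis=1)

def gmm_sample(n, w, mu, sigma):
    idx = rng.choice(len(w), size=n, p=w)
    return rng.normal(mu[idx], sigma[idx])

def gmm_moments(w, mu, sigma):
    """Closed-form mean and variance of a 1-D GMM."""
    m = np.sum(w * mu)
    v = np.sum(w * (sigma ** 2 + mu ** 2)) - m ** 2
    return m, v

def gauss_logpdf(x, m, v):
    return -0.5 * (x - m) ** 2 / v - 0.5 * np.log(2 * np.pi * v)

# Two example GMMs, f and g.
wf, muf, sf = np.array([0.4, 0.6]), np.array([-1.0, 2.0]), np.array([0.7, 1.2])
wg, mug, sg = np.array([0.5, 0.5]), np.array([0.0, 3.0]), np.array([1.0, 0.8])

n = 20000
x = gmm_sample(n, wf, muf, sf)
ratio = gmm_logpdf(x, wf, muf, sf) - gmm_logpdf(x, wg, mug, sg)
kl_naive = ratio.mean()                       # naive Monte Carlo estimate

# Control variate: h(x) = log fhat(x) - log ghat(x), with fhat, ghat the
# moment-matched single Gaussians. h is quadratic in x, so E_f[h] follows in
# closed form from the first two moments of f.
mf, vf = gmm_moments(wf, muf, sf)
mg, vg = gmm_moments(wg, mug, sg)
h = gauss_logpdf(x, mf, vf) - gauss_logpdf(x, mg, vg)
Ex, Ex2 = mf, vf + mf ** 2                    # E_f[x], E_f[x^2]
Eh = (-0.5 * (Ex2 - 2 * mf * Ex + mf ** 2) / vf - 0.5 * np.log(2 * np.pi * vf)) \
   - (-0.5 * (Ex2 - 2 * mg * Ex + mg ** 2) / vg - 0.5 * np.log(2 * np.pi * vg))
beta = np.cov(ratio, h)[0, 1] / h.var(ddof=1) # variance-optimal coefficient
kl_cv = kl_naive - beta * (h.mean() - Eh)     # control-variate estimate

# Importance sampling: draw from a variance-inflated moment-matched Gaussian
# proposal q and reweight, using KL(f||g) = E_q[(f/q) * log(f/g)].
vq = 4.0 * vf
xq = rng.normal(mf, np.sqrt(vq), size=n)
logw = gmm_logpdf(xq, wf, muf, sf) - gauss_logpdf(xq, mf, vq)
kl_is = np.mean(np.exp(logw) *
                (gmm_logpdf(xq, wf, muf, sf) - gmm_logpdf(xq, wg, mug, sg)))

print(f"naive MC:            {kl_naive:.4f}")
print(f"control variate:     {kl_cv:.4f}")
print(f"importance sampling: {kl_is:.4f}")
```

Both variance-reduction estimators remain unbiased in expectation while cutting the Monte Carlo error, which is the sense in which the abstract's "factor of 30 times more samples" comparison is made: a better estimator at the same sample count matches a naive estimator run with many more samples.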
Details
- ISSN: 1520-6149
- Database: OpenAIRE
- Journal: 2008 IEEE International Conference on Acoustics, Speech and Signal Processing
- Accession number: edsair.doi...........6d7d619a09a8076c87d48aaf18786a41
- Full Text: https://doi.org/10.1109/icassp.2008.4518669