A Neural Difference-of-Entropies Estimator for Mutual Information
- Publication Year :
- 2025
Abstract
- Estimating Mutual Information (MI), a key measure of the dependence between random quantities that requires no specific modelling assumptions, is a challenging problem in high dimensions. We propose a novel mutual information estimator based on parametrizing conditional densities using normalizing flows, a deep generative model that has gained popularity in recent years. This estimator leverages a block autoregressive structure to achieve improved bias-variance trade-offs on standard benchmark tasks.
- Comment: 23 pages, 17 figures
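The abstract rests on the difference-of-entropies decomposition MI(X; Y) = H(Y) - H(Y | X). The minimal sketch below is not the paper's estimator: it illustrates the decomposition on a bivariate Gaussian where both entropies are available in closed form and are approximated by Monte Carlo averages of negative log-densities; the paper replaces these known densities with conditional densities learned by block autoregressive normalizing flows.

```python
import numpy as np
from scipy.stats import norm

# Illustration only (not the paper's method): for a bivariate Gaussian with
# correlation rho, MI(X; Y) = H(Y) - H(Y | X).  Each entropy is estimated as
# the Monte Carlo average of the negative log-density over samples.

rho = 0.8
cov = np.array([[1.0, rho], [rho, 1.0]])
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=100_000)
x, y = samples[:, 0], samples[:, 1]

# H(Y): the marginal of Y is N(0, 1)
h_y = -np.mean(norm.logpdf(y, loc=0.0, scale=1.0))

# H(Y | X): the conditional of Y given X = x is N(rho * x, 1 - rho**2)
h_y_given_x = -np.mean(norm.logpdf(y, loc=rho * x, scale=np.sqrt(1.0 - rho**2)))

mi_estimate = h_y - h_y_given_x
mi_exact = -0.5 * np.log(1.0 - rho**2)  # closed form for the Gaussian case
print(f"MC estimate: {mi_estimate:.4f}, exact: {mi_exact:.4f}")
```

In the high-dimensional setting targeted by the paper, the closed-form Gaussian log-densities above would be unavailable and would instead be modelled by learned flows, which is where the bias-variance behaviour discussed in the abstract comes into play.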
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession Number :
- edsarx.2502.13085
- Document Type :
- Working Paper