An intuition for physicists: information gain from experiments
- Publication Year :
- 2022
- Publisher :
- arXiv, 2022.
Abstract
- How much one has learned from an experiment is quantifiable by the information gain, also known as the Kullback-Leibler divergence. The narrowing of the posterior parameter distribution $P(\theta|D)$ compared with the prior parameter distribution $\pi(\theta)$ is quantified, in units of bits, as: $ D_{\mathrm{KL}}(P\,\|\,\pi)=\int\log_{2}\left(\frac{P(\theta|D)}{\pi(\theta)}\right)\,P(\theta|D)\,d\theta $. This research note gives an intuition for what one bit of information gain means: it corresponds to a Gaussian shrinking its standard deviation by a factor of three.
- Comment: Accepted to RNAAS; Corrected typos (Thanks to Tariq Yasin and Torsten Enßlin)
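The stated equivalence can be checked numerically; a minimal sketch in Python (the helper name is illustrative, not from the paper), using the standard closed-form KL divergence between two one-dimensional Gaussians:

```python
import numpy as np

def kl_gaussians_bits(mu_p, sig_p, mu_q, sig_q):
    """D_KL(P || Q) in bits for 1-D Gaussians P and Q (closed form)."""
    nats = (np.log(sig_q / sig_p)
            + (sig_p**2 + (mu_p - mu_q)**2) / (2.0 * sig_q**2)
            - 0.5)
    return nats / np.log(2.0)

# Posterior three times narrower than the prior, same mean:
# information gain comes out close to one bit (~0.94 bits).
print(kl_gaussians_bits(0.0, 1.0 / 3.0, 0.0, 1.0))
```

A posterior identical to the prior gives zero bits, as expected, since nothing was learned.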
- Subjects :
- Condensed Matter - Statistical Mechanics (cond-mat.stat-mech)
- Astrophysics - Instrumentation and Methods for Astrophysics (astro-ph.IM)
- Physics - Data Analysis, Statistics and Probability (physics.data-an)
- FOS: Physical sciences
- General Medicine
Details
- Database :
- OpenAIRE
- Accession number :
- edsair.doi.dedup.....e1e34b14a604f1de6a0d8a15a8943c45
- Full Text :
- https://doi.org/10.48550/arxiv.2205.00009