Exact Partial Information Decompositions for Gaussian Systems Based on Dependency Constraints
- Author
- Kay, Jim W. and Ince, Robin A.A.
- Subjects
- Information Theory (cs.IT); Machine Learning (stat.ML); Quantitative Methods (q-bio.QM); Statistical Mechanics (cond-mat.stat-mech); Data Analysis, Statistics and Probability (physics.data-an)
- Keywords: partial information decomposition; mutual information; unique information; maximum entropy; dependency constraints; Gaussian graphical models; multivariate normal distribution
- Abstract
The Partial Information Decomposition (PID) [arXiv:1004.2515] provides a theoretical framework to characterize and quantify the structure of multivariate information sharing. A new method (Idep) has recently been proposed for computing a two-predictor PID over discrete spaces [arXiv:1709.06653]. A lattice of maximum entropy probability models is constructed based on marginal dependency constraints, and the unique information that a particular predictor has about the target is defined as the minimum increase in joint predictor-target mutual information when that particular predictor-target marginal dependency is constrained. Here, we apply the Idep approach to Gaussian systems, for which the marginally constrained maximum entropy models are Gaussian graphical models. Closed form solutions for the Idep PID are derived for both univariate and multivariate Gaussian systems. Numerical and graphical illustrations are provided, together with practical and theoretical comparisons of the Idep PID with the minimum mutual information PID (Immi) [arXiv:1411.2832]. In particular, it is proved that the Immi method generally produces larger estimates of redundancy and synergy than does the Idep method. In discussion of the practical examples, the PIDs are complemented by the use of deviance tests for the comparison of Gaussian graphical models.
- Comment: 39 pages, 9 figures, 9 tables
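The mutual information quantities that such a Gaussian PID is built from have a simple closed form in terms of covariance determinants: for jointly Gaussian blocks A and B, I(A;B) = ½ log(det Σ_A · det Σ_B / det Σ_AB). A minimal sketch of these ingredient terms for one target and two predictors, using a hypothetical covariance matrix (not one of the paper's examples):

```python
import numpy as np

def gaussian_mi(cov, idx_a, idx_b):
    """Mutual information (in nats) between two blocks of a joint Gaussian:
    I(A;B) = 0.5 * log( det(S_A) * det(S_B) / det(S_AB) )."""
    a = np.ix_(idx_a, idx_a)
    b = np.ix_(idx_b, idx_b)
    ab = np.ix_(idx_a + idx_b, idx_a + idx_b)
    return 0.5 * np.log(np.linalg.det(cov[a]) * np.linalg.det(cov[b])
                        / np.linalg.det(cov[ab]))

# Hypothetical example: target T (index 0), predictors X1 and X2.
cov = np.array([[1.0, 0.5, 0.4],
                [0.5, 1.0, 0.3],
                [0.4, 0.3, 1.0]])

i_t_x1  = gaussian_mi(cov, [0], [1])      # I(T; X1)
i_t_x2  = gaussian_mi(cov, [0], [2])      # I(T; X2)
i_t_x12 = gaussian_mi(cov, [0], [1, 2])   # joint I(T; X1, X2)
```

The joint information I(T; X1, X2) is never smaller than either marginal term; the Idep decomposition apportions it into unique, redundant, and synergistic components via the constrained maximum entropy lattice described above.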
- Published
- 2018