Understanding Interdependency Through Complex Information Sharing.

Authors :
Rosas, Fernando
Ntranos, Vasilis
Ellison, Christopher J.
Pollin, Sofie
Verhelst, Marian
Source :
Entropy. 2016, Vol. 18, Issue 2, p38. 27p.
Publication Year :
2016

Abstract

The interactions between three or more random variables are often nontrivial and poorly understood, yet they are paramount for future advances in fields such as network information theory, neuroscience and genetics. In this work, we analyze these interactions as different modes of information sharing. To this end, and in contrast to most of the literature, which focuses on analyzing the mutual information, we introduce an axiomatic framework for decomposing the joint entropy that characterizes the various ways in which random variables can share information. Our framework distinguishes between redundant interdependencies, where the same information is shared by several variables, and synergistic interdependencies, where the sharing structure exists in the whole but not between the parts. The key contribution of our approach is its focus on symmetric properties of this sharing, which do not depend on a specific point of view for differentiating roles among the components. We show that our axioms determine unique formulas for all of the terms of the proposed decomposition for systems of three variables in several cases of interest. Moreover, we show how these results can be applied to several network information theory problems, providing a more intuitive understanding of their fundamental limits.
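The full axiomatic decomposition is developed in the paper itself; as a minimal illustrative sketch of the redundancy/synergy distinction the abstract describes, the Python snippet below works through the classic XOR example, where X and Y are independent fair bits and Z = X XOR Y. The function and variable names are ours, not the paper's; the computation uses only standard Shannon quantities.

```python
# Illustrative sketch (not the paper's decomposition): with Z = X XOR Y,
# every pair of variables is independent, yet any two jointly determine
# the third, so the sharing structure exists only in the whole system.
from collections import Counter
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, idxs):
    """Marginal distribution over the variables at positions `idxs`."""
    out = Counter()
    for outcome, p in joint.items():
        out[tuple(outcome[i] for i in idxs)] += p
    return out

# Joint distribution of (X, Y, Z) with X, Y uniform bits and Z = X XOR Y.
joint = {(x, y, x ^ y): 0.25 for x, y in product([0, 1], repeat=2)}

H_xyz = entropy(joint)                   # H(X,Y,Z) = 2 bits
H_xy  = entropy(marginal(joint, (0, 1))) # H(X,Y)   = 2 bits
H_z   = entropy(marginal(joint, (2,)))   # H(Z)     = 1 bit
H_x   = entropy(marginal(joint, (0,)))   # H(X)     = 1 bit
H_xz  = entropy(marginal(joint, (0, 2))) # H(X,Z)   = 2 bits

# Pairwise: I(X;Z) = H(X) + H(Z) - H(X,Z) = 0, so no single variable
# tells us anything about Z on its own.
print("I(X;Z)   =", H_x + H_z - H_xz)    # 0.0
# Jointly: I(X,Y;Z) = H(X,Y) + H(Z) - H(X,Y,Z) = 1, so X and Y together
# determine Z completely: a purely synergistic mode of sharing.
print("I(X,Y;Z) =", H_xy + H_z - H_xyz)  # 1.0
```

Running the sketch prints I(X;Z) = 0.0 and I(X,Y;Z) = 1.0, the signature of information that is shared synergistically by the whole but absent from every part.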

Details

Language :
English
ISSN :
1099-4300
Volume :
18
Issue :
2
Database :
Academic Search Index
Journal :
Entropy
Publication Type :
Academic Journal
Accession Number :
113300309
Full Text :
https://doi.org/10.3390/e18020038