
Analysis of Remaining Uncertainties and Exponents Under Various Conditional Rényi Entropies.

Authors :
Tan, Vincent Y. F.
Hayashi, Masahito
Source :
IEEE Transactions on Information Theory; May 2018, Vol. 64, Issue 5, p3734-3755, 22p
Publication Year :
2018

Abstract

We analyze the asymptotics of the normalized remaining uncertainty of a source when a compressed or hashed version of it and correlated side information are observed. For this system, commonly known as Slepian–Wolf source coding, we establish the optimal (minimum) rate of compression of the source that ensures the remaining uncertainties vanish. We also study the exponential rate of decay of the remaining uncertainty to zero when the rate is above the optimal rate of compression. In this paper, we consider various classes of random universal hash functions. Instead of measuring remaining uncertainties using traditional Shannon information measures, we do so using two forms of the conditional Rényi entropy. Among other techniques, we employ new one-shot bounds and the moments of the type class enumerator method (see Merhav) for these evaluations. We show that these asymptotic results are generalizations of the strong converse exponent and the error exponent of the Slepian–Wolf problem under maximum a posteriori decoding. [ABSTRACT FROM PUBLISHER]
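For readers unfamiliar with conditional Rényi entropies, the following is a minimal sketch of one widely used form, Arimoto's conditional Rényi entropy H_α(X|Y) = (α/(1−α)) · log Σ_y [Σ_x P(x,y)^α]^(1/α). This is an illustrative assumption on our part; the paper itself studies two specific variants, and this sketch is not taken from it.

```python
import numpy as np

def arimoto_conditional_renyi(p_xy, alpha):
    """Arimoto's conditional Renyi entropy H_alpha(X|Y), in bits.

    p_xy : 2-D array whose (x, y) entry is the joint probability P(X=x, Y=y).
    alpha: Renyi order; alpha = 1 is treated as the Shannon limit H(X|Y).
    """
    p_xy = np.asarray(p_xy, dtype=float)
    if alpha == 1.0:
        # Limiting case: Shannon conditional entropy H(X|Y).
        p_y = p_xy.sum(axis=0)
        with np.errstate(divide="ignore", invalid="ignore"):
            cond = p_xy / p_y                      # P(x|y)
            terms = np.where(p_xy > 0, -p_xy * np.log2(cond), 0.0)
        return terms.sum()
    # Inner sum over x for each y, raised to 1/alpha, then summed over y.
    inner = (p_xy ** alpha).sum(axis=0) ** (1.0 / alpha)
    return (alpha / (1.0 - alpha)) * np.log2(inner.sum())

# Sanity check: if X is uniform on {0,1} and independent of Y,
# H_alpha(X|Y) = 1 bit for every order alpha.
p = np.full((2, 2), 0.25)
print(arimoto_conditional_renyi(p, 2.0))   # 1.0
```

For independent uniform X the value is order-independent, which makes this a convenient correctness check when experimenting with other joint distributions.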

Details

Language :
English
ISSN :
0018-9448
Volume :
64
Issue :
5
Database :
Complementary Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
129266217
Full Text :
https://doi.org/10.1109/TIT.2018.2792495