A limit theorem for the entropy density of nonhomogeneous Markov information source
- Author
- Yang Weiguo and Liu Wen
- Subjects
- Statistics and Probability, Kullback–Leibler divergence, Markov process, Markov information source, Combinatorics, Entropy density, Entropy (information theory), Probability distribution, Almost everywhere, Statistics, Probability and Uncertainty, Alphabet, Mathematics
- Abstract
Let $\{X_n, n \geq 0\}$ be a sequence of successive letters produced by a nonhomogeneous Markov information source with alphabet $S = \{1, 2, \dots, m\}$ and probability distribution $p(x_0)\prod_{k=1}^{n} p_k(x_{k-1}, x_k)$, where $p_k(i,j)$ is the transition probability $P(X_k = j \mid X_{k-1} = i)$. Let $f_n(\omega)$ be the relative entropy density of $\{X_k, 0 \leq k \leq n\}$. In this paper we prove that for an arbitrary nonhomogeneous Markov information source, $f_n(\omega)$ and $(1/n)\sum_{k=1}^{n} H[p_k(X_{k-1}, 1), \dots, p_k(X_{k-1}, m)]$ are asymptotically equal almost everywhere as $n \to \infty$, where $H(p_1, \dots, p_m)$ is the entropy of the distribution $(p_1, \dots, p_m)$.
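Below is a minimal numerical sketch of the result, not taken from the paper: it assumes the standard definition of the relative entropy density, $f_n(\omega) = -\frac{1}{n}\log\big[p(X_0)\prod_{k=1}^{n} p_k(X_{k-1}, X_k)\big]$, together with an arbitrary, hypothetical time-varying transition family `P_k` chosen purely for illustration. It simulates one sample path of a nonhomogeneous chain and compares $f_n(\omega)$ against $(1/n)\sum_{k=1}^{n} H[p_k(X_{k-1}, 1), \dots, p_k(X_{k-1}, m)]$.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 4          # alphabet size: S = {1, ..., m} (0-indexed below)
n = 100_000    # number of steps

def entropy(p):
    """Shannon entropy H(p_1, ..., p_m) in nats, with 0*log(0) = 0."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Hypothetical nonhomogeneous transition matrices: a slowly varying
# mixture of two fixed stochastic matrices (an arbitrary choice).
A = rng.dirichlet(np.ones(m), size=m)
B = rng.dirichlet(np.ones(m), size=m)

def P_k(k):
    t = 0.5 * (1 + np.sin(k / 50.0))  # depends on k, so the chain is nonhomogeneous
    return t * A + (1 - t) * B

# Simulate the chain, accumulating both quantities in the theorem.
x = rng.integers(m)            # X_0 ~ uniform, so p(X_0) = 1/m (an assumption)
log_prob = np.log(1.0 / m)     # log p(X_0)
avg_H = 0.0
for k in range(1, n + 1):
    row = P_k(k)[x]                  # p_k(X_{k-1}, .)
    avg_H += entropy(row)            # H[p_k(X_{k-1}, 1), ..., p_k(X_{k-1}, m)]
    x = rng.choice(m, p=row)         # draw X_k
    log_prob += np.log(row[x])       # log p_k(X_{k-1}, X_k)

f_n = -log_prob / n                  # relative entropy density f_n(w)
print(f"f_n                  = {f_n:.5f}")
print(f"(1/n) sum_k H[p_k(.)] = {avg_H / n:.5f}")
```

Per the theorem, the two printed values should agree to within a small difference for large $n$ on almost every sample path; rerunning with different seeds or larger $n$ shrinks the gap.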
- Published
- 1995