In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in the process. For stochastic processes with a countable index, the entropy rate $H(X)$ is the limit of the joint entropy of $n$ members of the process $X_k$ divided by $n$, as $n$ tends to infinity:

$$H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n)$$

when the limit exists. An alternative, related quantity is the limit of the conditional entropy of the last member given all preceding ones:

$$H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \ldots, X_1)$$

For strongly stationary stochastic processes, $H(X) = H'(X)$. The entropy rate can be thought of as a general property of stochastic sources; this is the asymptotic equipartition property.
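For example, if the $X_k$ are independent and identically distributed, the joint entropy decomposes as $H(X_1, \ldots, X_n) = n\,H(X_1)$, so the entropy rate is simply the entropy of a single member:

$$H(X) = \lim_{n \to \infty} \frac{n\,H(X_1)}{n} = H(X_1)$$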
Since a stochastic process defined by a Markov chain that is irreducible, aperiodic and positive recurrent has a stationary distribution, the entropy rate is independent of the initial distribution.
For example, for such a Markov chain $Y_k$ defined on a countable number of states, given the transition matrix $P_{ij}$, $H(Y)$ is given by:

$$H(Y) = -\sum_{ij} \mu_i P_{ij} \log P_{ij}$$

where $\mu_i$ is the stationary distribution of the chain.
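As a minimal numerical sketch of this formula (assuming NumPy; the two-state chain and its transition probabilities are hypothetical, chosen only for illustration), the stationary distribution $\mu$ can be computed as the left eigenvector of $P$ with eigenvalue 1, after which the sum above gives the entropy rate directly:

```python
import numpy as np

# Hypothetical two-state Markov chain; each row of P sums to 1.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution mu: left eigenvector of P for eigenvalue 1,
# normalized to sum to 1 (it exists and is unique for an irreducible,
# aperiodic, positive-recurrent chain).
eigvals, eigvecs = np.linalg.eig(P.T)
mu = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
mu /= mu.sum()

# H(Y) = -sum_{ij} mu_i P_ij log2 P_ij, with the convention 0 log 0 = 0.
logP = np.zeros_like(P)
logP[P > 0] = np.log2(P[P > 0])
H = -np.sum(mu[:, None] * P * logP)

print(mu)  # approximately [0.8, 0.2]
print(H)   # approximately 0.57 bits per step
```

The base-2 logarithm reports the rate in bits per step; entries with $P_{ij} = 0$ are skipped, following the convention $0 \log 0 = 0$.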