Information source (mathematics)

In mathematics, an information source is a sequence of random variables taking values in a finite alphabet Γ, whose joint distribution is stationary (invariant under shifts of the index).

The uncertainty, or entropy rate, of an information source is defined as

H\{X\} = \lim_{n\to\infty} H(X_n \mid X_0, X_1, \ldots, X_{n-1})

where

X_0, X_1, \ldots, X_n

is the sequence of random variables defining the information source, and

H(X_n \mid X_0, X_1, \ldots, X_{n-1})

is the conditional information entropy of X_n given all the preceding variables of the sequence. Equivalently, one has

H\{X\} = \lim_{n\to\infty} \frac{H(X_0, X_1, \ldots, X_{n-1}, X_n)}{n+1}.
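As an illustration, the limit above can be computed in closed form for a stationary Markov source: the conditional entropy H(X_n | X_0, ..., X_{n-1}) reduces to H(X_n | X_{n-1}), so the entropy rate is the stationary-distribution-weighted entropy of the transition rows. The sketch below uses a hypothetical two-state transition matrix chosen for illustration (not taken from the article):

```python
import numpy as np

# Hypothetical two-state Markov source over the alphabet {0, 1};
# the transition matrix P is an illustrative choice, not from the article.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# The stationary distribution pi solves pi P = pi, i.e. it is the
# left eigenvector of P for eigenvalue 1, normalized to sum to 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# For a stationary Markov chain the entropy rate reduces to
#   H = -sum_i pi_i sum_j P_ij log2 P_ij   (bits per symbol),
# because H(X_n | X_0, ..., X_{n-1}) = H(X_n | X_{n-1}) for every n >= 1,
# so the limit in the definition is attained immediately.
H = -np.sum(pi[:, None] * P * np.log2(P))
print(pi, H)
```

For this matrix the stationary distribution is (0.8, 0.2), and the entropy rate is 0.8·H(0.1) + 0.2·H(0.4) ≈ 0.569 bits per symbol, where H(p) denotes the binary entropy function.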
