
Entropy Rate

🌱

Definition
InfoTheory

The entropy rate of a source $\{X_i\}_{i=1}^{\infty}$ with alphabet $\mathcal{X}$ is denoted by $H(\mathcal{X})$ and defined by
$$H(\mathcal{X}) := \lim_{n\to\infty} \frac{1}{n} H(X^n),$$
provided that the limit exists.

If the source $\{X_i\}_{i=1}^{\infty}$ is a DMS (i.e. iid), then
$$H(\mathcal{X}) = \lim_{n\to\infty} \frac{1}{n} H(X_1, \cdots, X_n) = \lim_{n\to\infty} \frac{1}{n}\bigl(n H(X_1)\bigr) = H(X_1).$$
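A quick numerical sketch of the iid case: for a small DMS we can enumerate the product distribution of $X^n$ directly and check that $\frac{1}{n}H(X^n) = H(X_1)$ for every $n$. The alphabet and probabilities below are arbitrary illustrative choices, not from the text.

```python
import itertools
import math

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Illustrative DMS over a three-letter alphabet (probabilities chosen arbitrarily).
px = [0.5, 0.3, 0.2]

def joint_entropy_iid(px, n):
    """H(X^n) for an iid source: entropy of the n-fold product distribution."""
    joint = [math.prod(probs) for probs in itertools.product(px, repeat=n)]
    return entropy(joint)

for n in (1, 2, 3):
    # (1/n) H(X^n) coincides with H(X_1) at every n, not just in the limit.
    print(n, joint_entropy_iid(px, n) / n)
```

For an iid source the per-symbol entropy is constant in $n$, so the limit defining the entropy rate is trivially attained.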

For a stationary source $\{X_i\}_{i=1}^{\infty}$, the sequence of conditional entropies $\{H(X_i \mid X^{i-1})\}_{i=1}^{\infty}$ is decreasing in $i$ and has a limit, denoted
$$\tilde H(\mathcal{X}) := \lim_{i\to\infty} H(X_i \mid X^{i-1}).$$

For a stationary source $\{X_i\}_{i=1}^{\infty}$, its entropy rate $H(\mathcal{X})$ always exists and is equal to $\tilde H(\mathcal{X})$:
$$H(\mathcal{X}) = \tilde H(\mathcal{X}) = \lim_{n\to\infty} H(X_n \mid X^{n-1}).$$

- Since $\tilde H(\mathcal{X}) = H(\mathcal{X})$ and $H(X_n \mid X^{n-1}) \downarrow \tilde H(\mathcal{X})$ by Lemma 1, we have $H(\mathcal{X}) \le H(X_n \mid X^{n-1}) \ \forall n \ge 1$.
- It can also be shown that $\frac{1}{n} H(X^n)$ is decreasing in $n$, so $H(\mathcal{X}) \le \frac{1}{n} H(X^n) \ \forall n \ge 1$.
