
Shannon Coding Theorem

🌱

Theorem
InfoTheory

## Fixed-Length Codes for a DMS

Given an integer $D\ge2$, consider a DMS $\{X_i\}_{i=1}^\infty$ with alphabet $\mathcal{X}$, pmf $p_X$, and source entropy
$$H_D(X)=-\sum_{a\in\mathcal{X}}p_X(a)\log_D p_X(a)=(\log_D 2)H(X).$$
Then the following results hold:

1. **Forward Part (Achievability):** For any $\epsilon\in(0,1)$ and $0<\delta<\epsilon$, there exists a sequence of $(k,n)$ $D$-ary fixed-length codes $\mathcal{C}_n$ for the source such that
$$H_D(X)<\limsup_{n\to\infty}\frac{k}{n}\le H_D(X)+\delta$$
and $P_e(\mathcal{C}_n)<\epsilon$ for $n$ sufficiently large.
2. **"Strong" Converse Part:** For any $\epsilon\in(0,1)$ and any sequence of $(k,n)$ $D$-ary fixed-length codes $\mathcal{C}_n$ for the source with
$$\limsup_{n\to\infty}\frac{k}{n}<H_D(X),$$
we have $P_e(\mathcal{C}_n)>1-\epsilon$ for $n$ sufficiently large.

## Conclusion

$$H_D(X)=\inf\{R: R\text{ achievable}\},$$
where $R$ is achievable $\iff\ \forall\epsilon>0,\ \exists\,\mathcal{C}_n$ s.t. $\limsup_{n\to\infty}\frac{k}{n}\le R$ and $P_e<\epsilon$ for $n$ sufficiently large.

> [!rmk]
> For a non-empty set $A$ (with finite infimum), to show that $a=\inf A$, one needs to show:
> 1. Forward Part: $\forall\delta>0,\ \exists a_0\in A$ s.t. $a_0<\inf A+\delta$
> 2. Converse Part: $\forall a\in A,\ a\ge\inf A$

## Intuition

The theorem shows that for a DMS, the source entropy $H_D(X)$ is the smallest compression rate for which there exist $D$-ary block codes for the source with asymptotically vanishing error probability (i.e. $\lim_{n\to\infty}P_e=0$).
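The forward part can be seen numerically: for a binary DMS, greedily place the most probable $n$-tuples into the codebook until it covers probability $1-\epsilon$, and check that the resulting rate sits just above $H(X)$ and shrinks toward it as $n$ grows. A minimal sketch; the Bernoulli(0.2) source and the greedy codebook construction are illustrative choices, not from the notes:

```python
import math

def binary_entropy(p):
    """H(X) in bits for a Bernoulli(p) source."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def fixed_length_rate(n, p, eps):
    """Rate log2(M)/n of the smallest binary fixed-length codebook with
    error probability < eps for a Bernoulli(p) DMS, p < 1/2.
    For p < 1/2 a tuple's probability decreases in its number of ones,
    so the greedy codebook adds whole type classes, fewest ones first."""
    mass, count = 0.0, 0
    for k in range(n + 1):                     # k = number of ones
        c = math.comb(n, k)                    # size of the type class
        log_p = k * math.log(p) + (n - k) * math.log(1 - p)
        mass += math.exp(math.log(c) + log_p)  # probability of the class
        count += c
        if mass >= 1 - eps:                    # codebook now covers 1 - eps
            break
    return math.log2(count) / n
```

With $\epsilon$ fixed, the rate needed at blocklength 2000 is visibly smaller than at 200, closing in on $H(X)\approx0.722$ from above, as the theorem predicts.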

## Variable-Length Codes for a DMS

Given an integer $D\ge2$, consider a DMS $\{X_i\}_{i=1}^\infty$ with alphabet $\mathcal{X}$, pmf $p_X$, and source entropy
$$H_D(X)=-\sum_{a\in\mathcal{X}}p_X(a)\log_D p_X(a)=(\log_D 2)H(X).$$
Then the following results hold:

1. **Forward Part (Achievability):** For any $\epsilon\in(0,1)$, there exists a sequence of $D$-ary $n$-th order prefix (hence UD) VLCs $f_n:\mathcal{X}^n\to\{0,1,\cdots,D-1\}^*$ for the source with average code rate satisfying $\bar R_n<H_D(X)+\epsilon$ for $n$ sufficiently large.
2. **Converse Part:** Every $D$-ary $n$-th order UD VLC $f_n:\mathcal{X}^n\to\{0,1,\cdots,D-1\}^*$ for the source has average code rate $\bar R_n\ge H_D(X)$.

For any $n\ge1$, there exists a $D$-ary $n$-th order prefix code $f:\mathcal{X}^n\to\{0,1,\cdots,D-1\}^*$ for a DMS $\{X_i\}$ with alphabet $\mathcal{X}$ whose average code rate satisfies
$$H_D(X)\le\bar R_n<H_D(X)+\frac{1}{n}.$$
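One code achieving this bound assigns each block $x^n$ a codeword of length $\lceil-\log_D p(x^n)\rceil$ (the Shannon code); the Kraft inequality then guarantees a prefix code with those lengths exists. A minimal binary ($D=2$) sketch, with an illustrative non-dyadic pmf:

```python
import math
from itertools import product

def shannon_code_rate(pmf, n):
    """Average code rate (bits per source symbol) of the n-th order binary
    Shannon code, i.e. codeword lengths l(x^n) = ceil(-log2 p(x^n))."""
    avg_len, kraft = 0.0, 0.0
    for block in product(pmf, repeat=n):
        p = math.prod(pmf[s] for s in block)  # p(x^n) for a DMS
        l = math.ceil(-math.log2(p))
        avg_len += p * l
        kraft += 2.0 ** (-l)
    assert kraft <= 1.0 + 1e-12  # Kraft sum <= 1: a prefix code exists
    return avg_len / n

pmf = {'a': 0.6, 'b': 0.3, 'c': 0.1}  # illustrative source pmf
H = -sum(q * math.log2(q) for q in pmf.values())
```

For each $n$, the computed rate lands in $[H, H+\tfrac1n)$, so the per-symbol overhead of the ceiling vanishes as blocks grow.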

## Variable-Length Codes for a Stationary Source

Given an integer $D\ge2$, consider a stationary source $\{X_i\}_{i=1}^\infty$ with alphabet $\mathcal{X}$, pmf $p_X$, and source entropy rate
$$H_D(\mathcal{X})=\lim_{n\to\infty}\frac{1}{n}H_D(X^n).$$
Then the following results hold:

1. **Forward Part (Achievability):** For any $\epsilon\in(0,1)$, there exists a sequence of $D$-ary $n$-th order prefix (hence UD) VLCs $f_n:\mathcal{X}^n\to\{0,1,\cdots,D-1\}^*$ for the source with average code rate satisfying $\bar R_n<H_D(\mathcal{X})+\epsilon$ for $n$ sufficiently large.
2. **Converse Part:** Every $D$-ary $n$-th order UD VLC $f_n:\mathcal{X}^n\to\{0,1,\cdots,D-1\}^*$ for the source has average code rate $\bar R_n\ge H_D(\mathcal{X})$.

For any $n\ge1$, there exists a $D$-ary $n$-th order prefix code $f:\mathcal{X}^n\to\{0,1,\cdots,D-1\}^*$ for a stationary source $\{X_i\}$ with alphabet $\mathcal{X}$ whose average code rate satisfies
$$\frac{1}{n}H_D(X^n)\le\bar R_n<\frac{1}{n}H_D(X^n)+\frac{1}{n}.$$
As $n\to\infty$, $\bar R_n\to H_D(\mathcal{X})$.
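For a concrete stationary source the bracketing quantity $\frac1n H_D(X^n)$ can be computed directly and seen to decrease toward the entropy rate. A sketch for a hypothetical binary Markov chain (the transition probabilities are illustrative, not from the notes):

```python
import math
from itertools import product

# Hypothetical stationary binary Markov chain: P[s] = Pr(X_{t+1}=1 | X_t=s)
P = {0: 0.1, 1: 0.6}
pi1 = P[0] / (P[0] + 1 - P[1])  # stationary probability of state 1
pi = {0: 1 - pi1, 1: pi1}

def h2(q):
    """Binary entropy function in bits."""
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def block_entropy(n):
    """H(X^n) in bits, summing -p log2 p over all 2^n length-n paths,
    with the chain started from its stationary distribution."""
    H = 0.0
    for seq in product((0, 1), repeat=n):
        p = pi[seq[0]]
        for a, b in zip(seq, seq[1:]):
            p *= P[a] if b == 1 else 1 - P[a]
        H -= p * math.log2(p)
    return H

# Entropy rate of a stationary Markov chain: H(X_2 | X_1)
entropy_rate = sum(pi[s] * h2(P[s]) for s in (0, 1))
```

Here $\frac1n H(X^n)$ starts at $H(X_1)=h_2(\pi_1)$ and decreases monotonically toward $H(X_2\mid X_1)$, so the prefix-code rate $\bar R_n$ squeezed between $\frac1n H(X^n)$ and $\frac1n H(X^n)+\frac1n$ converges to the entropy rate.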
