
Shannon's Channel Coding Theorem for the DMC

🌱

Theorem
InfoTheory

For a DMC $(\mathcal{X},\mathcal{Y},Q=[P_{ij}])$ with information capacity $C=\max_{p_X}I(X;Y)$, the operational capacity $C_{op}$ satisfies $C_{op}=C$, i.e. the following two results hold:

1. **Forward Part (Achievability):** For any $0<\epsilon<1$, there exist $\gamma=\gamma(\epsilon)>0$ and a sequence of $(n,M_n)$ fixed-length codes $\mathcal{C}_n$ for the DMC with $$C>\liminf_{n\to\infty}\frac{1}{n}\log_2 M_n\ge C-\gamma$$ and $P_e(\mathcal{C}_n)<\epsilon$ for $n$ sufficiently large.
2. **Converse Part:** For any sequence of $(n,M_n)$ fixed-length codes $\mathcal{C}_n$ for the DMC with $$\liminf_{n\to\infty}\frac{1}{n}\log_2 M_n>C,$$ the error probability $P_e(\mathcal{C}_n)$ cannot asymptotically vanish, i.e. if our rate exceeds the channel capacity, we cannot achieve a low probability of error.
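As a concrete illustration (not part of the theorem itself), the information capacity $C=\max_{p_X}I(X;Y)$ can be computed numerically. The sketch below, with hypothetical helper names `mutual_information` and `capacity_grid`, does a grid search over binary input distributions for a binary symmetric channel and compares against the known closed form $C=1-h(p)$:

```python
import numpy as np

def mutual_information(p_x, Q):
    """I(X;Y) in bits for input distribution p_x and channel matrix Q[i, j] = P(Y=j | X=i)."""
    p_xy = p_x[:, None] * Q          # joint distribution P(X=i, Y=j)
    p_y = p_xy.sum(axis=0)           # output marginal P(Y=j)
    mask = p_xy > 0                  # skip zero-probability terms (0 log 0 := 0)
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y[None, :])[mask]))

def capacity_grid(Q, steps=10001):
    """C = max over p_X of I(X;Y), via grid search over binary input distributions."""
    return max(mutual_information(np.array([a, 1 - a]), Q)
               for a in np.linspace(0.0, 1.0, steps))

# Binary symmetric channel with crossover probability p; known C = 1 - h(p).
p = 0.1
Q = np.array([[1 - p, p], [p, 1 - p]])
h = lambda q: -q * np.log2(q) - (1 - q) * np.log2(1 - q)
print(capacity_grid(Q), 1 - h(p))  # both ≈ 0.531 bits, maximized at uniform p_X
```

For the BSC the maximum is attained at the uniform input distribution, which the grid hits exactly, so the two numbers agree to high precision.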

Given a DMC $(\mathcal{X},\mathcal{Y},Q=[P_{ij}])$ with arbitrary input word $X^n$ resulting in output word $Y^n$, then $$I(X^n;Y^n)\le nC$$ where $C:=\max_{p_X}I(X;Y)$ is the channel's information capacity.
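A quick numerical sanity check of this lemma (my own illustration, with an assumed BSC as the channel): take $n=2$ uses of a binary symmetric channel, draw an arbitrary, possibly correlated input distribution on $\mathcal{X}^2$, build the joint distribution of $(X^2,Y^2)$ using memorylessness, and verify $I(X^2;Y^2)\le 2C$:

```python
import itertools
import numpy as np

p = 0.1                                    # BSC crossover probability (assumed channel)
Q = {(x, y): (1 - p) if x == y else p      # P(Y=y | X=x)
     for x in (0, 1) for y in (0, 1)}

n = 2
rng = np.random.default_rng(0)
inputs = list(itertools.product((0, 1), repeat=n))
p_xn = rng.dirichlet(np.ones(len(inputs)))  # arbitrary (correlated) distribution on X^n

# Joint distribution of (X^n, Y^n): the DMC acts independently on each symbol.
joint = {}
for xs, px in zip(inputs, p_xn):
    for ys in itertools.product((0, 1), repeat=n):
        joint[(xs, ys)] = px * np.prod([Q[(x, y)] for x, y in zip(xs, ys)])

p_yn = {}                                   # output marginal P(Y^n = ys)
for (xs, ys), v in joint.items():
    p_yn[ys] = p_yn.get(ys, 0.0) + v

I = sum(v * np.log2(v / (p_xn[inputs.index(xs)] * p_yn[ys]))
        for (xs, ys), v in joint.items() if v > 0)

C = 1 + p * np.log2(p) + (1 - p) * np.log2(1 - p)  # BSC capacity 1 - h(p)
print(I, n * C)                              # I(X^n;Y^n) ≤ nC holds
```

Equality would require the input symbols to be independent with each marginal capacity-achieving; a correlated $p_{X^n}$ generally gives strict inequality.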
