Shannon's Channel Coding Theorem for AWGN

🌱

Theorem
InfoTheory

For the AWGN channel with input power $P$ and noise power $\sigma^{2}$, and information capacity

$$C(P) = \frac{1}{2}\log_{2}\!\left(1 + \frac{P}{\sigma^{2}}\right),$$

its operational capacity satisfies

$$C_{op}(P) = C(P) = \frac{1}{2}\log_{2}\!\left(1 + \frac{P}{\sigma^{2}}\right).$$

In other words:

**Forward Part (Achievability).** For any $0 < \epsilon < 1$, there exist $0 < \gamma < 2\epsilon$ and a sequence of $(n, M_{n})$ block codes $\mathcal{C}_{n}$ for the channel with

$$\frac{1}{n}\log_{2} M_{n} > C(P) - \gamma,$$

with each codeword $c_{m} = (c_{m1}, \ldots, c_{mn})$ in $\mathcal{C}_{n}$ satisfying the power constraint

$$\frac{1}{n}\sum_{i=1}^{n} c_{mi}^{2} \le P, \tag{$*$}$$

such that $P_{e}(\mathcal{C}_{n}) < \epsilon$ for $n$ sufficiently large.

**Converse.** For any sequence of $(n, M_{n})$ block codes $\mathcal{C}_{n}$ for the channel whose codewords satisfy $(*)$, if

$$\liminf_{n\to\infty} \frac{1}{n}\log_{2} M_{n} > C(P),$$

then

$$\liminf_{n\to\infty} P_{e}(\mathcal{C}_{n}) > 0,$$

i.e. at any rate exceeding channel capacity, the probability of decoding error stays bounded away from zero.
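
To make the statement concrete, here is a minimal Python sketch (the function name `awgn_capacity` is my own, purely illustrative choice) that evaluates $C(P)$ for a sample SNR and classifies a few rates against it, along the achievability/converse split above:

```python
import math

def awgn_capacity(P: float, sigma2: float) -> float:
    """Information capacity of the AWGN channel, in bits per channel use:
    C(P) = (1/2) * log2(1 + P / sigma^2)."""
    return 0.5 * math.log2(1.0 + P / sigma2)

# Example: noise power 1 and input power 15, so SNR = 15
# and C(P) = 0.5 * log2(16) = 2 bits per channel use.
P, sigma2 = 15.0, 1.0
C = awgn_capacity(P, sigma2)
print(f"C(P) = {C:.3f} bits per channel use")

# The theorem splits rates at C(P): any rate strictly below capacity is
# achievable with vanishing error (forward part); any rate strictly above
# forces the error probability to stay bounded away from zero (converse).
for R in (0.5 * C, 0.9 * C, 1.1 * C):
    verdict = "achievable" if R < C else "error bounded away from 0"
    print(f"rate {R:.3f} bits/use: {verdict}")
```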