Block Codes for the Gaussian Channel

🌱

Definition
InfoTheory

Given positive integers $n$ and $M=M_{n}$, a block (fixed-length) code $\mathcal{C}_{n}=(n,M)$ for an AWGN channel with

- Average Power Constraint: $P$
- Codeword Length: $n$
- Number of Codewords: $M$
- Rate: $\frac{1}{n}\log_{2}M$ (bits per channel symbol)

consists of

- Message Set: $\mathcal{M}=\{1,2,\dots,M\}$ of $M$ information messages intended for transmission (note: messages are uniformly distributed over $\mathcal{M}$)
- Encoding Function: $f:\mathcal{M}\to\mathbb{R}^{n}$ yielding codewords $c_{1}=f(1),\ldots,c_{M}=f(M)$ such that each codeword $c_{m}=(c_{m_{1}},\ldots,c_{m_{n}})$ of length $n$ satisfies the average power constraint $P$:
  $$\frac{1}{n}\sum_{i=1}^{n}c_{m_{i}}^{2}\le P, \quad m=1,\ldots,M$$
- Decoding Function: $g:\mathbb{R}^{n}\to\mathcal{M}$
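As a concrete illustration (not part of the definition above), the sketch below builds an $(n, M)$ code from a random Gaussian codebook scaled to meet the power constraint, with a lookup-table encoder $f$ and a minimum-Euclidean-distance decoder $g$ (which is maximum-likelihood for AWGN). The choice of a random codebook and the specific parameters are assumptions made for the example.

```python
import numpy as np

# A minimal sketch (assumed example): an (n, M) block code with a random
# Gaussian codebook, scaled so every codeword meets the power constraint P.
rng = np.random.default_rng(0)

n, M, P = 8, 16, 1.0                      # codeword length, number of codewords, power
rate = np.log2(M) / n                     # rate in bits per channel symbol

# Encoder f: M -> R^n, realised as a lookup table of codewords c_1, ..., c_M.
codebook = rng.standard_normal((M, n))
# Rescale each row so (1/n) * sum_i c_{m,i}^2 <= P holds (here with equality).
codebook *= np.sqrt(n * P) / np.linalg.norm(codebook, axis=1, keepdims=True)

def f(m: int) -> np.ndarray:
    """Encoding function: message m in {1, ..., M} -> codeword c_m."""
    return codebook[m - 1]

def g(y: np.ndarray) -> int:
    """Decoding function g: R^n -> M, here minimum-distance (ML for AWGN)."""
    return int(np.argmin(np.linalg.norm(codebook - y, axis=1))) + 1

# Every codeword satisfies the average power constraint.
assert np.all(np.sum(codebook**2, axis=1) / n <= P + 1e-12)
```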

The average probability of error of $\mathcal{C}_{n}$ is
$$P_{e}(\mathcal{C}_{n})= \frac{1}{M}\sum\limits_{m=1}^{M}\lambda_{m}(\mathcal{C}_{n})$$
where
$$\begin{align*} \lambda_{m}(\mathcal{C}_{n})&=P(g(Y^{n})\neq m\mid X^{n}=c_{m})\\ &=\int_{y^{n}\in\mathbb{R}^{n}:\ g(y^{n})\neq m}f_{Y^n\mid X^{n}}(y^{n}\mid c_{m})\,dy^{n} \end{align*}$$
is the conditional error probability given that message $m\in\mathcal{M}$ is sent over the channel (via codeword $c_{m}=f(m)\in\mathbb{R}^{n}$).
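Continuing the hypothetical sketch above (reusing `codebook`, `f`, `g`, `n`, `M`, `rng`), one can estimate $\lambda_{m}(\mathcal{C}_{n})$ and $P_{e}(\mathcal{C}_{n})$ by Monte Carlo simulation of the AWGN channel; the noise variance and number of trials below are assumed parameters, not part of the definition.

```python
# Monte Carlo estimate of the conditional error probabilities lambda_m and the
# average error probability P_e. sigma2 (noise variance) is an assumed value.
sigma2 = 0.25
trials = 20_000

lam = np.zeros(M)
for m in range(1, M + 1):
    c_m = f(m)
    # Send c_m over the AWGN channel: Y^n = c_m + Z^n with Z_i ~ N(0, sigma2) i.i.d.
    y = c_m + rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))
    decoded = np.array([g(yi) for yi in y])
    lam[m - 1] = np.mean(decoded != m)     # estimate of lambda_m(C_n)

# Messages are uniform over M, so P_e is the plain average of the lambda_m.
P_e = lam.mean()
print(f"estimated P_e(C_n) ~ {P_e:.4f}")
```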
