Lossless Joint Source-Channel Coding Theorem

🌱

Theorem
InfoTheory

Let $\{V_i\}_{i=1}^{\infty}$ be a discrete stationary source with alphabet $\mathcal{V}$ and entropy rate $H(\mathcal{V})$, and let $(\mathcal{X}, \mathcal{Y}, P_{Y|X})$ be a DMC with capacity $C$, where $H(\mathcal{V})$ and $C$ are measured in the same units. Then:

1. Forward part (achievability): For any $0 < \epsilon < 1$, if the source is also ergodic (i.e., stationary ergodic) and $H(\mathcal{V}) < C$, then there exists a sequence of rate-one source-channel codes $\{\mathcal{C}_{m,m}\}_{m=1}^{\infty}$ with error probability satisfying $P_e(\mathcal{C}_{m,m}) < \epsilon$ for $m$ sufficiently large.
2. Converse part: If $H(\mathcal{V}) > C$, then any sequence of rate-one source-channel codes $\{\mathcal{C}_{m,m}\}_{m=1}^{\infty}$ for the source and the channel has error probability satisfying $\liminf_{m\to\infty} P_e(\mathcal{C}_{m,m}) > 0$.
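As a quick numerical sanity check of the rate-one condition $H(\mathcal{V}) < C$, here is a minimal sketch using hypothetical numbers: an i.i.d. Bernoulli(0.1) source (so the entropy rate is just the binary entropy $h_2(0.1)$) transmitted over a BSC with crossover probability 0.05 (capacity $1 - h_2(0.05)$). The specific parameter values are illustrative assumptions, not from the text above.

```python
import math

def h2(p: float) -> float:
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical setup: i.i.d. Bernoulli(0.1) source, BSC(0.05) channel.
H_V = h2(0.1)        # entropy rate of the source, bits per source symbol
C = 1 - h2(0.05)     # BSC capacity, bits per channel use

# By the forward part, reliable rate-one transmission (one channel use
# per source symbol) is possible exactly when H(V) < C.
print(f"H(V) = {H_V:.4f} bits, C = {C:.4f} bits")
print("reliable rate-one transmission possible:", H_V < C)
```

Here $H(\mathcal{V}) \approx 0.469 < C \approx 0.714$, so the forward part applies; swapping in a noisier channel (say BSC(0.2), capacity $\approx 0.278$) would flip the condition and the converse would apply instead.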

More generally, for codes mapping blocks of $m$ source symbols to $n_m$ channel inputs, the same dichotomy holds with $H(\mathcal{V})$ replaced by the rate-scaled quantity $\limsup_{m\to\infty} \frac{m}{n_m} H(\mathcal{V})$:

1. Forward part (achievability): For any $0 < \epsilon < 1$, if the source is also ergodic (i.e., stationary ergodic) and $\limsup_{m\to\infty} \frac{m}{n_m} H(\mathcal{V}) < C$, then there exists a sequence of $m$-to-$n_m$ source-channel codes $\{\mathcal{C}_{m,n_m}\}_{m=1}^{\infty}$ with error probability satisfying $P_e(\mathcal{C}_{m,n_m}) < \epsilon$ for $m$ sufficiently large.
2. Converse part: If $\limsup_{m\to\infty} \frac{m}{n_m} H(\mathcal{V}) > C$, then any sequence of $m$-to-$n_m$ source-channel codes $\{\mathcal{C}_{m,n_m}\}_{m=1}^{\infty}$ for the source and the channel has error probability satisfying $\liminf_{m\to\infty} P_e(\mathcal{C}_{m,n_m}) > 0$.
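The general condition $\limsup_{m\to\infty} \frac{m}{n_m} H(\mathcal{V}) < C$ can be read as a bound on the sustainable ratio of source symbols per channel use: it must stay below $C / H(\mathcal{V})$. A minimal sketch, reusing the same hypothetical Bernoulli(0.1) source and BSC(0.05) channel as assumptions:

```python
import math

def h2(p: float) -> float:
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical setup: i.i.d. Bernoulli(0.1) source, BSC(0.05) channel.
H_V = h2(0.1)        # entropy rate, bits per source symbol
C = 1 - h2(0.05)     # capacity, bits per channel use

# The condition limsup (m/n_m) H(V) < C bounds the achievable ratio of
# source symbols per channel use by C / H(V).
max_ratio = C / H_V
print(f"any m/n_m ratio below {max_ratio:.3f} source symbols "
      f"per channel use is achievable")
```

For these numbers the threshold is about 1.52, so e.g. 3-to-2 codes ($m/n_m = 1.5$) fall on the achievable side, while 2-to-1 codes ($m/n_m = 2$) do not.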