Given a discrete stationary source $\{V_i\}_{i=1}^{\infty}$ with alphabet $\mathcal{V}$ and entropy rate $H(\mathcal{V})$, and a DMC $(\mathcal{X}, \mathcal{Y}, P_{Y|X})$ with capacity $C$, where $H(\mathcal{V})$ and $C$ are measured in the same units, the following hold.

1. Forward Part (Achievability): For any $0 < \epsilon < 1$, if the source is additionally ergodic (i.e., stationary ergodic) and $H(\mathcal{V}) < C$, then there exists a sequence of rate-one source-channel codes $\{\mathcal{C}_{m,m}\}_{m=1}^{\infty}$ whose error probability satisfies $P_e(\mathcal{C}_{m,m}) < \epsilon$ for $m$ sufficiently large.

2. Converse Part: If $H(\mathcal{V}) > C$, then any sequence of rate-one source-channel codes $\{\mathcal{C}_{m,m}\}_{m=1}^{\infty}$ for the source and the channel has error probability satisfying $\liminf_{m \to \infty} P_e(\mathcal{C}_{m,m}) > 0$.
More generally, for $m$-to-$n_m$ codes: given a discrete stationary source $\{V_i\}_{i=1}^{\infty}$ with alphabet $\mathcal{V}$ and entropy rate $H(\mathcal{V})$, and a DMC $(\mathcal{X}, \mathcal{Y}, P_{Y|X})$ with capacity $C$, where $H(\mathcal{V})$ and $C$ are measured in the same units, the following hold.

1. Forward Part (Achievability): For any $0 < \epsilon < 1$, if the source is additionally ergodic (i.e., stationary ergodic) and $\limsup_{m \to \infty} \frac{m}{n_m} H(\mathcal{V}) < C$, then there exists a sequence of $m$-to-$n_m$ source-channel codes $\{\mathcal{C}_{m,n_m}\}_{m=1}^{\infty}$ whose error probability satisfies $P_e(\mathcal{C}_{m,n_m}) < \epsilon$ for $m$ sufficiently large.

2. Converse Part: If $\limsup_{m \to \infty} \frac{m}{n_m} H(\mathcal{V}) > C$, then any sequence of $m$-to-$n_m$ source-channel codes $\{\mathcal{C}_{m,n_m}\}_{m=1}^{\infty}$ for the source and the channel has error probability satisfying $\liminf_{m \to \infty} P_e(\mathcal{C}_{m,n_m}) > 0$.
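As a concrete numerical illustration of these thresholds, the sketch below (an assumption of this example, not part of the theorem: an i.i.d. Bernoulli($p$) source, whose entropy rate is the binary entropy $h(p)$, and a BSC with crossover probability $q$, whose capacity is $C = 1 - h(q)$, both in bits) checks the rate-one condition $H(\mathcal{V}) < C$ and the $m$-to-$n_m$ condition $\frac{m}{n_m} H(\mathcal{V}) < C$ for $n_m = \rho m$:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical parameters for illustration only.
p = 0.1   # source bias: i.i.d. Bernoulli(p), so H(V) = h(p)
q = 0.05  # BSC crossover probability, so C = 1 - h(q)

H_V = binary_entropy(p)        # entropy rate of the source (bits/symbol)
C = 1 - binary_entropy(q)      # channel capacity (bits/channel use)

# Rate-one codes (n_m = m): reliable transmission is possible iff H(V) < C.
print(f"H(V) = {H_V:.4f} bits, C = {C:.4f} bits")
print("rate-one transmission achievable:", H_V < C)

# m-to-n_m codes with n_m = rho * m channel uses per m source symbols:
# the condition (m/n_m) H(V) < C becomes H(V) / rho < C.
rho = 0.5
print(f"with rho = {rho}: achievable:", (H_V / rho) < C)
```

For these numbers $H(\mathcal{V}) \approx 0.469$ bits and $C \approx 0.714$ bits, so rate-one transmission is achievable, while halving the number of channel uses ($\rho = 0.5$) doubles the effective source rate past capacity and the converse applies.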