Information Capacity

Definition (Information Capacity)

Given a DMC $(\mathcal{X},\mathcal{Y},Q=[P_{Y|X}])$, its information capacity, $C$, is defined as
$$C=\max_{p_{X}} I(X;Y),$$
where the maximization is over all possible input distributions $p_{X}$.
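To make the definition concrete, here is a minimal numerical sketch (the function name and the grid search are illustrative, not part of the notes): it evaluates $I(X;Y)$ for a given input distribution and transition matrix, then approximates $C$ for a binary symmetric channel by searching over binary input distributions. For the BSC with crossover $\varepsilon$, the known answer is $C = 1 - H_b(\varepsilon)$, attained at the uniform input.

```python
import numpy as np

def mutual_information(p_x, Q):
    """I(X;Y) in bits, for input distribution p_x and DMC transition
    matrix Q with Q[x, y] = P(y|x)."""
    p_x, Q = np.asarray(p_x, float), np.asarray(Q, float)
    q_y = p_x @ Q                         # output distribution p_Y
    joint = p_x[:, None] * Q              # joint distribution P(x, y)
    prod = p_x[:, None] * q_y[None, :]    # product distribution p_X * p_Y
    mask = joint > 0                      # convention: 0 log 0 = 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / prod[mask])))

# Binary symmetric channel with crossover 0.2; grid search over p_X
Q = np.array([[0.8, 0.2],
              [0.2, 0.8]])
grid = np.linspace(0.01, 0.99, 99)
C_approx = max(mutual_information([a, 1 - a], Q) for a in grid)
```

For this channel $C = 1 - H_b(0.2) \approx 0.278$ bits, which the grid search recovers at $p_X = (1/2, 1/2)$; the bounds from the remark below ($0 \le C \le \log 2$) are also visible.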

Remark

  1. $C$ is well-defined: $I(X;Y)=I(p_{X},P_{Y|X})$ is a concave and continuous function of $p_{X}$ (by concavity of mutual information in the input distribution), and the set of input distributions $p_{X}$ is compact (closed and bounded) in $\mathbb{R}^{|\mathcal{X}|}$; thus $I(X;Y)$ attains a maximum. Also, since $$0\le I(X;Y)\le \min\{\log|\mathcal{X}|,\log|\mathcal{Y}|\},$$ it follows that $$0\le C\le \min\{\log|\mathcal{X}|,\log|\mathcal{Y}|\}.$$
  2. We will see via Shannon’s channel coding theorem for the DMC that $C$ is the DMC’s “operational capacity”.

Problem

Given a DMC $(\mathcal{X},\mathcal{Y},Q=[P_{Y|X}])$, determine the information capacity $C=\max_{p_{X}} I(X;Y)$.

First, note that necessary and sufficient conditions for the maximizing $p_{X}$ can be derived since $I(X;Y)=I(p_{X},p_{Y|X})$ is concave in $p_{X}$; but they can be hard to use, as they require “guessing” the optimal input distribution a priori.

$\therefore$ In general, $C$ does not admit a closed-form expression (in terms of the channel parameters), unless the channel exhibits certain “symmetry” properties.
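In the absence of a closed form, $C$ can be computed numerically; a standard choice is the Blahut–Arimoto algorithm, which exploits exactly the concavity noted above. The sketch below (parameter names are mine) alternately updates the input distribution and the induced output distribution; the lower bound $\log_2 \sum_x p_x c_x$ and the upper bound $\log_2 \max_x c_x$ converge to $C$.

```python
import numpy as np

def blahut_arimoto(Q, tol=1e-10, max_iter=10_000):
    """Approximate C = max_{p_X} I(X;Y) in bits for a DMC with
    transition matrix Q, Q[x, y] = P(y|x).
    Returns (capacity estimate, optimizing input distribution)."""
    Q = np.asarray(Q, dtype=float)
    p = np.full(Q.shape[0], 1.0 / Q.shape[0])    # start from uniform input
    for _ in range(max_iter):
        q = p @ Q                                # induced output distribution
        # d[x] = D(Q(.|x) || q) in bits, with the convention 0 log 0 = 0
        ratio = np.where(Q > 0, Q / np.where(q > 0, q, 1.0), 1.0)
        d = np.sum(Q * np.log2(ratio), axis=1)
        c = 2.0 ** d
        lower, upper = np.log2(p @ c), np.max(d)
        p = p * c / (p @ c)                      # multiplicative update
        if upper - lower < tol:
            break
    return float(lower), p

# BSC with crossover 0.1: the algorithm should recover C = 1 - H_b(0.1)
cap, p_opt = blahut_arimoto([[0.9, 0.1],
                             [0.1, 0.9]])
```

For symmetric channels like the BSC the uniform input is already optimal, so the iteration converges immediately; for asymmetric channels (e.g. the Z-channel) the same routine finds the skewed optimal $p_X$ that no closed-form guess would give.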
