
Information Capacity

🌱

Definition
InfoTheory

Given a DMC $(\mathcal{X},\mathcal{Y},Q=[P_{Y|X}])$, its Information Capacity, $C$, is defined as $$C=\max_{p_{X}}I(X;Y)$$ where the maximization is over all possible input distributions $p_{X}$.
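As a minimal numerical sketch of the objective being maximized (the function name and array layout are my own; rows of the channel matrix are indexed by $x$, columns by $y$):

```python
import numpy as np

def mutual_information(p_x, P_y_given_x):
    """I(X;Y) in bits, for input distribution p_x and channel matrix [P_{Y|X}]."""
    p_x = np.asarray(p_x, dtype=float)
    P = np.asarray(P_y_given_x, dtype=float)
    p_xy = p_x[:, None] * P            # joint distribution P_{XY}
    p_y = p_xy.sum(axis=0)             # output marginal p_Y
    mask = p_xy > 0                    # skip 0·log 0 terms
    ratio = p_xy / (p_x[:, None] * p_y[None, :])
    return float((p_xy[mask] * np.log2(ratio[mask])).sum())
```

For a BSC with crossover probability $0.1$ and uniform input, this returns $1-H_b(0.1)\approx 0.531$ bits.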

1. $C$ is well-defined: $I(X;Y)=I(p_{X},P_{Y|X})$ is a concave and continuous function of $p_{X}$ (by concavity of the mutual information in the input distribution), and the set of input distributions $p_{X}$ is compact (closed and bounded) in $\mathbb{R}^{|\mathcal{X}|}$; thus $I(X;Y)$ attains its maximum. Moreover, since $0\le I(X;Y)\le \min\{\log|\mathcal{X}|,\log|\mathcal{Y}|\}$, it follows that $0\le C\le \min\{\log|\mathcal{X}|,\log|\mathcal{Y}|\}$.
2. We will see via Shannon's Channel Coding Theorem for the DMC that $C$ is the DMC's "operational capacity".
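Because the objective is concave over a compact set, the maximum can be approximated directly; a brute-force sketch for a binary-input channel (the helper names are my own, and the grid search stands in for the Blahut–Arimoto algorithm one would use for larger alphabets):

```python
import numpy as np

def mi_bits(p_x, P):
    """I(X;Y) in bits for input distribution p_x and channel matrix [P_{Y|X}]."""
    p_xy = np.asarray(p_x, dtype=float)[:, None] * np.asarray(P, dtype=float)
    p_y = p_xy.sum(axis=0)
    m = p_xy > 0
    return float((p_xy[m] * np.log2((p_xy / np.outer(p_x, p_y))[m])).sum())

eps = 0.1                                   # BSC crossover probability
P = [[1 - eps, eps], [eps, 1 - eps]]        # channel matrix [P_{Y|X}]

# Grid over input distributions (p, 1-p); the max approximates C.
ps = np.linspace(1e-6, 1 - 1e-6, 10001)
C = max(mi_bits([p, 1 - p], P) for p in ps)

h_b = lambda q: -q * np.log2(q) - (1 - q) * np.log2(1 - q)  # binary entropy
print(C, 1 - h_b(eps))  # BSC capacity is 1 - H_b(eps), attained at uniform input
```

The printout confirms that the grid maximum matches the closed-form BSC capacity $1-H_b(\varepsilon)$, attained at the uniform input distribution.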
