
Joint Entropy

🌱

Definition
InfoTheory

Let $(X,Y)$ be a pair of jointly distributed discrete RVs taking values in $\mathscr{X}\times\mathscr{Y}$ (where both alphabets are finite) with joint pmf $p_{XY}(a,b):=P(X=a,Y=b)$, $a\in\mathscr{X},\, b\in\mathscr{Y}$. Then the joint entropy of $X$ and $Y$ (denoted $H(X,Y)$) is given by
$$
\begin{align*}
H(X,Y)&=-\sum_{a\in\mathscr{X}}\sum_{b\in\mathscr{Y}}p_{XY}(a,b)\log_2 p_{XY}(a,b)\\
&=E_{p_{XY}}[-\log_2 p_{XY}(X,Y)]
\end{align*}
$$
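As a quick numerical sketch of this definition (assuming numpy and a made-up 2×2 joint pmf, so the numbers are purely illustrative):

```python
import numpy as np

# Hypothetical 2x2 joint pmf over X = {0,1}, Y = {0,1} (made-up numbers).
p_xy = np.array([[0.4, 0.1],
                 [0.2, 0.3]])

# H(X,Y) = -sum_{a,b} p(a,b) log2 p(a,b); terms with p(a,b) = 0 contribute 0.
mask = p_xy > 0
H_XY = -np.sum(p_xy[mask] * np.log2(p_xy[mask]))
print(f"H(X,Y) = {H_XY:.4f} bits")  # about 1.8464 bits for this pmf
```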

Let $X^n:=(X_1,\cdots,X_n)$ be a random vector with each element taking values in a common alphabet $\mathscr{X}$ with joint pmf $p_{X^n}$. Then
$$
\begin{align*}
H(X^n):&=H(X_1,\cdots,X_n)\\
&=H(X_1)+H(X_2|X_1)+H(X_3|X_2,X_1)+\cdots+H(X_n|X_{n-1},\cdots,X_1)\\
&=\sum_{i=1}^nH(X_i|X^{i-1})
\end{align*}
$$
where $H(X_i|X^{i-1}):=H(X_1)$ for $i=1$.
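The chain rule can be checked numerically. A minimal sketch, assuming numpy, a hypothetical pmf over three binary RVs, and a small entropy helper `H` (names chosen here for illustration, not from the note):

```python
import numpy as np

def H(p):
    """Entropy in bits of a pmf given as an array (zero entries contribute 0)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint pmf p(x1, x2, x3) over {0,1}^3 (made-up numbers).
p = np.array([[[0.10, 0.05], [0.15, 0.10]],
              [[0.20, 0.05], [0.25, 0.10]]])

p1  = p.sum(axis=(1, 2))   # marginal p(x1)
p12 = p.sum(axis=2)        # marginal p(x1, x2)

# H(X2|X1)    = sum_{x1}    p(x1)    H(X2 | X1 = x1)
H2_given_1  = sum(p1[a] * H(p12[a] / p1[a]) for a in range(2))
# H(X3|X1,X2) = sum_{x1,x2} p(x1,x2) H(X3 | X1 = x1, X2 = x2)
H3_given_12 = sum(p12[a, b] * H(p[a, b] / p12[a, b])
                  for a in range(2) for b in range(2))

lhs = H(p)                                 # H(X1, X2, X3)
rhs = H(p1) + H2_given_1 + H3_given_12     # chain-rule expansion
print(np.isclose(lhs, rhs))                # True
```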

Let $X^n=(X_1,\cdots,X_n)$ be a random vector of $n$ RVs with common alphabet $\mathscr{X}$ and joint pmf $p_{X^n}$. Then, for the joint entropy, we have
$$
H(X^n)=H(X_1,\cdots,X_n)\le\sum_{i=1}^nH(X_i)
$$
with equality iff the $X_i$'s are independent.
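To see both the inequality and the equality condition, here is a rough check (again numpy and made-up numbers) comparing a dependent pmf with the product of its own marginals:

```python
import numpy as np

def H(p):
    """Entropy in bits of a pmf given as an array (zero entries contribute 0)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Dependent case: a made-up joint pmf where X and Y are strongly correlated.
p_dep = np.array([[0.45, 0.05],
                  [0.05, 0.45]])
# Independent case: product of the same marginals.
px, py = p_dep.sum(axis=1), p_dep.sum(axis=0)
p_ind = np.outer(px, py)

for name, p in [("dependent", p_dep), ("independent", p_ind)]:
    joint = H(p)
    marginal_sum = H(p.sum(axis=1)) + H(p.sum(axis=0))
    print(f"{name}: H(X,Y) = {joint:.4f} <= H(X)+H(Y) = {marginal_sum:.4f}")
# Strict inequality in the dependent case; equality in the independent case.
```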

For $(X,Y,Z)\sim p_{XYZ}$ on $\mathscr{X}\times\mathscr{Y}\times\mathscr{Z}$, the conditional joint entropy of $X$ and $Y$ given $Z$ is
$$
\begin{align*}
H(X,Y|Z)&=E_{p_{XYZ}}[-\log_2(p_{XY|Z}(X,Y|Z))]\\
&=-\sum_{a\in\mathscr{X}}\sum_{b\in\mathscr{Y}}\sum_{c\in\mathscr{Z}}p_{XYZ}(a,b,c)\log_2(p_{XY|Z}(a,b|c))
\end{align*}
$$
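A minimal sketch of evaluating this definition, assuming numpy and a hypothetical joint pmf `p_xyz` with made-up values over binary $X$, $Y$, $Z$:

```python
import numpy as np

# Hypothetical joint pmf p(a, b, c) over binary X, Y, Z (made-up numbers).
p_xyz = np.array([[[0.10, 0.05], [0.15, 0.10]],
                  [[0.20, 0.05], [0.25, 0.10]]])

p_z = p_xyz.sum(axis=(0, 1))     # marginal p(c)
p_xy_given_z = p_xyz / p_z       # p(a, b | c); broadcasts over the last axis

# H(X,Y|Z) = -sum_{a,b,c} p(a,b,c) log2 p(a,b|c)
H_XY_given_Z = -np.sum(p_xyz * np.log2(p_xy_given_z))
print(f"H(X,Y|Z) = {H_XY_given_Z:.4f} bits")
```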

If $(X,Y,Z)\sim p_{XYZ}$, then
$$
\begin{align*}
H(X,Y|Z)&=H(X|Z)+H(Y|X,Z)\\
&=H(Y|Z)+H(X|Y,Z)
\end{align*}
$$
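And a quick check of this conditional chain rule, reusing the same hypothetical pmf as above (all values made up for illustration):

```python
import numpy as np

# Same hypothetical joint pmf p(a, b, c) as above (made-up numbers).
p_xyz = np.array([[[0.10, 0.05], [0.15, 0.10]],
                  [[0.20, 0.05], [0.25, 0.10]]])

p_z  = p_xyz.sum(axis=(0, 1))    # p(c)
p_xz = p_xyz.sum(axis=1)         # p(a, c)

H_XY_Z = -np.sum(p_xyz * np.log2(p_xyz / p_z))               # H(X,Y|Z)
H_X_Z  = -np.sum(p_xz  * np.log2(p_xz  / p_z))               # H(X|Z)
H_Y_XZ = -np.sum(p_xyz * np.log2(p_xyz / p_xz[:, None, :]))  # H(Y|X,Z)

print(np.isclose(H_XY_Z, H_X_Z + H_Y_XZ))  # True
```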
