Differential Divergence

This is identical in form to (discrete) divergence: for RVs $X$ and $Y$ with pdfs $f_{X}, f_{Y}$ and supports $S_{X}\subset S_{Y}\subset \mathbb{R}$, the divergence between $X$ and $Y$ is

$$\begin{align*}
D(X\|Y):&=E_{f_{X}}\left[\log_{2}\frac{f_{X}(X)}{f_{Y}(X)}\right]\\
&=\int_{S_{X}}f_{X}(t)\log_{2}\frac{f_{X}(t)}{f_{Y}(t)}\,dt
\end{align*}$$

Multivariate version:

$$D(X^{n}\|Y^{n}):=\int_{S_{X^{n}}}f_{X^{n}}(t_{1},\cdots,t_{n})\log_{2}\frac{f_{X^{n}}(t_{1},\cdots,t_{n})}{f_{Y^{n}}(t_{1},\cdots,t_{n})}\,dt_{1}\cdots dt_{n}$$
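As a quick numerical illustration (my addition, not from the note): the sketch below evaluates the one-dimensional integral for a pair of Gaussians and checks it against the known closed form for the divergence between two Gaussians. The distributions, function names, and truncation range are illustrative assumptions.

```python
# Sketch: numerically evaluating D(X||Y) for continuous RVs in bits.
# Illustrative only; the distributions and names are assumptions.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def divergence_bits(f_x, f_y, support):
    """Approximate D(X||Y) = integral over S_X of f_X log2(f_X / f_Y)."""
    integrand = lambda t: f_x(t) * np.log2(f_x(t) / f_y(t))
    value, _abs_err = quad(integrand, *support)
    return value

# Example: X ~ N(0,1), Y ~ N(1,4); here S_X = S_Y = R, truncated for quad.
f_x, f_y = norm(0, 1).pdf, norm(1, 2).pdf
print(divergence_bits(f_x, f_y, (-10, 10)))      # ~0.639 bits

# Cross-check with the closed form for two Gaussians (nats -> bits):
m1, s1, m2, s2 = 0.0, 1.0, 1.0, 2.0
nats = np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5
print(nats / np.log(2))                          # matches the integral
```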

For continuous RVs $X\sim f_{X}$ and $Y\sim f_{Y}$, with $S_{X}\subset S_{Y}\subset\mathbb{R}$, the divergence between the quantized versions converges as the quantization becomes sufficiently fine:

$$\lim_{n\to\infty} D([X]_{n}\|[Y]_{n})=\int_{S_{X}}f_{X}(t)\log_{2}\frac{f_{X}(t)}{f_{Y}(t)}\,dt=D(X\|Y)$$

(A numerical sketch of this limit follows the conclusion.)

# Conclusion

Divergence is a universal information measure in Information Theory: the same quantity applies to discrete and continuous RVs, with no correction term appearing in the quantization limit above.
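Here is the promised sketch of the limit (my addition, with $[X]_{n}$ assumed to mean $X$ quantized to bins of width $2^{-n}$): the discrete divergence of the binned pmfs approaches the integral above as the bins shrink.

```python
# Sketch: D([X]_n || [Y]_n) -> D(X||Y) as the bin width 2^{-n} shrinks.
# Assumption: [X]_n means X quantized to bins of width 2^{-n}.
import numpy as np
from scipy.stats import norm

def quantized_divergence_bits(cdf_x, cdf_y, n, lo=-10.0, hi=10.0):
    """Discrete divergence of the pmfs induced by 2^{-n}-wide bins."""
    edges = np.append(np.arange(lo, hi, 2.0 ** -n), hi)
    p = np.diff(cdf_x(edges))   # P([X]_n lands in each bin)
    q = np.diff(cdf_y(edges))   # P([Y]_n lands in each bin)
    mask = p > 0                # restrict the sum to the support of [X]_n
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# Same pair as above: X ~ N(0,1), Y ~ N(1,4); D(X||Y) ~ 0.639 bits.
cdf_x, cdf_y = norm(0, 1).cdf, norm(1, 2).cdf
for n in range(1, 7):
    print(n, quantized_divergence_bits(cdf_x, cdf_y, n))
# The printed values increase toward ~0.639 bits as n grows, since
# refining the partition can only increase the discrete divergence.
```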
