
Differential Cross-Entropy

🌱

Definition
InfoTheory

Given $X \sim f_X$ and $Y \sim f_Y$ with $S_X \subset S_Y \subset \mathbb{R}$, the differential cross-entropy between $f_X$ and $f_Y$ is

$$h(f_X; f_Y) = \int_{S_X} f_X(t) \log_2 \frac{1}{f_Y(t)}\, dt$$
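As a quick numerical sanity check (not part of the original note), here is a minimal Python sketch, assuming NumPy/SciPy and two hypothetical densities $f_X = \mathcal{N}(0, 1)$ and $f_Y = \mathcal{N}(0, 4)$, that evaluates the integral directly and compares it against the known closed form for Gaussian cross-entropy:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Hypothetical example densities: f_X = N(0, 1), f_Y = N(0, 2^2)
f_X = norm(loc=0.0, scale=1.0).pdf
f_Y = norm(loc=0.0, scale=2.0).pdf

# h(f_X; f_Y) = ∫ f_X(t) log2(1 / f_Y(t)) dt, integrated over the support of f_X
cross_entropy, _ = quad(lambda t: f_X(t) * np.log2(1.0 / f_Y(t)),
                        -np.inf, np.inf)

# Known closed form for Gaussian cross-entropy, converted from nats to bits:
# (1/2) log2(2Ļ€ σ_Y^2) + (σ_X^2 + (μ_X - μ_Y)^2) / (2 σ_Y^2 ln 2)
closed_form = 0.5 * np.log2(2 * np.pi * 2.0**2) + (1.0**2 + 0.0**2) / (2 * 2.0**2 * np.log(2))

print(f"numeric:     {cross_entropy:.6f} bits")
print(f"closed form: {closed_form:.6f} bits")
```

Both values should agree at roughly 2.51 bits; the $1/\ln 2$ factor in the closed form converts nats to bits, matching the $\log_2$ in the definition.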

Since relative entropy (KL divergence) is nonnegative,

$$D(f_X \| f_Y) = \int_{S_X} f_X(t) \log_2 \frac{f_X(t)}{f_Y(t)}\, dt = -h(X) + h(f_X; f_Y) \ge 0 \quad \therefore \quad h(f_X; f_Y) \ge h(X)$$
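Continuing with the same hypothetical Gaussians, a short self-contained sketch confirming that the gap $D(f_X \| f_Y) = h(f_X; f_Y) - h(X)$ is indeed nonnegative:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Same hypothetical densities as above: f_X = N(0, 1), f_Y = N(0, 2^2)
f_X = norm(loc=0.0, scale=1.0).pdf
f_Y = norm(loc=0.0, scale=2.0).pdf

# Cross-entropy h(f_X; f_Y) and differential entropy h(X), both in bits
cross_entropy, _ = quad(lambda t: f_X(t) * np.log2(1.0 / f_Y(t)), -np.inf, np.inf)
h_X, _ = quad(lambda t: f_X(t) * np.log2(1.0 / f_X(t)), -np.inf, np.inf)

# D(f_X || f_Y) = -h(X) + h(f_X; f_Y) >= 0, hence h(f_X; f_Y) >= h(X)
print(f"h(f_X; f_Y) = {cross_entropy:.4f} bits")
print(f"h(X)        = {h_X:.4f} bits")
print(f"D(f_X||f_Y) = {cross_entropy - h_X:.4f} bits  (>= 0)")
```

For these densities the gap comes out to about 0.46 bits, consistent with the closed-form Gaussian KL divergence $\ln 2 + \tfrac{1}{8} - \tfrac{1}{2}$ nats.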