
Rényi Entropy

🌱

Definition
InfoTheory

Given a parameter $\alpha > 0$, $\alpha \neq 1$, and a RV $X \sim p_X$, the Rényi entropy with parameter $\alpha$ is
$$
\begin{align*}
H_\alpha(X) &= \frac{1}{1-\alpha}\log_2\left(\sum_{a\in\mathscr{X}} p_X(a)^\alpha\right) \\
&= \frac{\alpha}{1-\alpha}\log_2\left(\|p_X\|_\alpha\right)
\end{align*}
$$
where $\|p_X\|_\alpha$ is the "$\alpha$-norm of $p_X$",
$$
\|p_X\|_\alpha = \left[\sum_{a\in\mathscr{X}} p_X(a)^\alpha\right]^{\frac{1}{\alpha}}.
$$
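
As a quick numerical illustration of the definition above, here is a minimal Python sketch (the function name `renyi_entropy` and the example distribution are illustrative choices, not part of the note):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha of a discrete distribution p, in bits.

    Assumes alpha > 0 and alpha != 1. Zero-probability outcomes are dropped,
    since they contribute nothing to the sum.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    # H_alpha(X) = 1/(1 - alpha) * log2( sum_a p(a)^alpha )
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

# Example: a biased distribution on a 3-symbol alphabet (illustrative values).
p = [0.5, 0.3, 0.2]
for alpha in (0.5, 2.0, 10.0):
    print(f"H_{alpha}(X) = {renyi_entropy(p, alpha):.4f} bits")
```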

In the limit $\alpha \to 1$, the Rényi entropy recovers the Shannon entropy:
$$
\lim_{\alpha\to1} H_\alpha(X) = H(X).
$$
(Both $\log_2\sum_{a\in\mathscr{X}} p_X(a)^\alpha$ and $1-\alpha$ vanish at $\alpha = 1$, so the limit follows from L'Hôpital's rule.)
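
A minimal sketch of a numerical sanity check for this limit, reusing the same kind of helper as above (the names and the test distribution are again illustrative):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy in bits, for alpha > 0, alpha != 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def shannon_entropy(p):
    """Shannon entropy H(X) in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p = [0.5, 0.3, 0.2]
print(f"H(X) = {shannon_entropy(p):.6f} bits")
# As alpha -> 1, H_alpha(X) should approach H(X).
for alpha in (1.1, 1.01, 1.001, 0.999):
    print(f"alpha = {alpha:>6}: H_alpha(X) = {renyi_entropy(p, alpha):.6f} bits")
```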
