
Rényi Divergence

🌱

Definition
InfoTheory

Given two pmfs $p$ and $q$ with common support $\mathscr{X}$, and a parameter $\alpha>0,\ \alpha\neq 1$, the Rényi divergence of order $\alpha$ between $p$ and $q$ is:

$$D_\alpha(p\|q):=\frac{1}{\alpha-1}\log_2\left(\sum_{a\in\mathscr{X}}p^\alpha(a)\,q^{1-\alpha}(a)\right)$$
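A minimal numerical sketch of this definition in Python (the function name `renyi_divergence` and the example pmfs are illustrative, not part of the original note):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p || q) in bits, for alpha > 0, alpha != 1."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum over the common support X of p^alpha(a) * q^(1 - alpha)(a)
    s = np.sum(p**alpha * q**(1 - alpha))
    return np.log2(s) / (alpha - 1)

# Example: two pmfs on a three-element support (hypothetical values)
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(renyi_divergence(p, q, alpha=2.0))
```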

In the limit $\alpha\to 1$, the Rényi divergence recovers the Kullback–Leibler divergence:

$$\lim_{\alpha\to1}D_\alpha(p\|q)=D(p\|q)$$
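A quick numerical check of this limit, as a sketch: the base-2 KL divergence is computed directly, and `alpha = 1 + 1e-6` stands in for $\alpha\to1$.

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# Kullback–Leibler divergence D(p || q) in bits
kl = np.sum(p * np.log2(p / q))

# Rényi divergence at alpha close to 1
alpha = 1 + 1e-6
d_alpha = np.log2(np.sum(p**alpha * q**(1 - alpha))) / (alpha - 1)

print(kl, d_alpha)  # the two values agree to several decimal places
```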