
Jensen-Shannon Divergence

🌱

Definition
InfoTheory

Given two pmfs $p$ and $q$ on the same alphabet $\mathscr{X}$, the Jensen-Shannon divergence is defined as

$$D_{JS}(p\|q) := \frac{1}{2}D(p\|\mathcal{M}) + \frac{1}{2}D(q\|\mathcal{M}),$$

where $\mathcal{M}(a) := \frac{p(a)+q(a)}{2}$.

# Intuition

This is essentially a symmetric version of the KL divergence, so it is more widely applicable and does not single out either distribution as the reference point.
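The definition translates directly into code. A minimal sketch with numpy, assuming log base 2 and the usual convention that terms with $p(a)=0$ contribute zero to the KL sum (the function names `kl` and `js_divergence` are my own, not from any particular library):

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence D(p||q) in bits; 0*log(0/q) terms contribute 0
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_divergence(p, q):
    # D_JS(p||q) = 1/2 D(p||M) + 1/2 D(q||M), with M = (p + q) / 2
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# With log base 2, two distributions with disjoint support attain the maximum value 1
print(js_divergence([1.0, 0.0], [0.0, 1.0]))  # → 1.0
```

Note that unlike the KL divergence, $D_{JS}$ is always finite: every letter with $p(a) > 0$ or $q(a) > 0$ has $\mathcal{M}(a) > 0$, so no term blows up.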

The Jensen-Shannon divergence satisfies the nonnegativity and symmetry properties of a distance but not the triangle inequality (its square root, the Jensen-Shannon distance, does).
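A concrete triple makes the distinction visible. Taking the point masses $p=(1,0)$, $r=(0,1)$ and the uniform $q=(\frac{1}{2},\frac{1}{2})$ as an illustrative example (not from the original note), the divergence itself violates the triangle inequality while its square root does not:

```python
import numpy as np

def js_div(p, q):
    # D_JS via the definition, log base 2; 0*log0 terms treated as 0
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    def kl(a, b):
        nz = a > 0
        return float(np.sum(a[nz] * np.log2(a[nz] / b[nz])))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p, q, r = [1.0, 0.0], [0.5, 0.5], [0.0, 1.0]

# D_JS(p||r) = 1, but D_JS(p||q) + D_JS(q||r) ≈ 0.62: triangle inequality fails
print(js_div(p, r) > js_div(p, q) + js_div(q, r))  # → True

# The square roots satisfy it: 1 ≤ ~0.56 + ~0.56
print(js_div(p, r) ** 0.5 <= js_div(p, q) ** 0.5 + js_div(q, r) ** 0.5)  # → True
```

This is why the square root (sometimes called the Jensen-Shannon distance) is the quantity used when a true metric is needed.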