Joint Differential Entropy of Multivariate Gaussian

🌱

Theorem
InfoTheory

If $\underline X=(X_{1},\ldots,X_{n})^{T}\sim\mathcal{N}(\underline\mu, K_{\underline X})$, then $h(\underline X)=h(X_{1},\ldots,X_{n})=\frac{1}{2}\log_{2}\left[(2\pi e)^{n}\det(K_{\underline X})\right]$, where $K_{\underline X}$ is the covariance matrix of $\underline X$.
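
As a quick numerical sanity check (not part of the original statement), the closed-form expression can be compared against SciPy's built-in differential entropy for a multivariate normal. The covariance matrix below is an arbitrary illustrative choice, and SciPy reports entropy in nats, so the result is converted to bits:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Illustrative 2x2 covariance matrix (hypothetical choice, n = 2).
K = np.array([[2.0, 0.5],
              [0.5, 1.0]])
n = K.shape[0]

# Closed-form joint differential entropy in bits:
#   h(X) = 1/2 * log2[ (2*pi*e)^n * det(K) ]
h_bits = 0.5 * np.log2((2 * np.pi * np.e) ** n * np.linalg.det(K))

# Cross-check: SciPy reports differential entropy in nats, so convert to bits.
h_scipy_bits = multivariate_normal(mean=np.zeros(n), cov=K).entropy() / np.log(2)

print(h_bits, h_scipy_bits)  # both ≈ 4.50 bits for this K
```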

A special case is $n=1$: then $h(X_{1})=\frac{1}{2}\log_{2}\left[(2\pi e)\,\sigma_{1}^{2}\right]$, where $\sigma_{1}^{2}=\mathrm{Var}(X_{1})$. This follows directly from the general formula, since for $n=1$ the covariance matrix is the scalar $\sigma_{1}^{2}$ and $\det(K_{\underline X})=\sigma_{1}^{2}$.
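
For the scalar case, a minimal sketch (again with an arbitrary variance, chosen only for illustration) compares the closed form against a Monte Carlo estimate of $-\mathbb{E}[\log_{2} f(X_{1})]$:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 3.0  # illustrative variance (hypothetical choice)

# Closed form: h(X1) = 1/2 * log2(2*pi*e*sigma^2), in bits.
h_bits = 0.5 * np.log2(2 * np.pi * np.e * sigma2)

# Monte Carlo estimate of h(X1) = -E[log2 f(X1)] for the Gaussian density f.
x = rng.normal(0.0, np.sqrt(sigma2), size=1_000_000)
log2_f = -0.5 * np.log2(2 * np.pi * sigma2) - x**2 / (2 * sigma2 * np.log(2))
h_mc_bits = -np.mean(log2_f)

print(h_bits, h_mc_bits)  # both ≈ 2.84 bits
```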