
# Multivariate Gaussian


## Definition

The column random vector $\underline X=(X_{1},\cdots,X_{n})^{T}$ with mean vector $\underline\mu = (\mu_{1},\cdots,\mu_{n})^{T}$, where $\mu_{i}=E[X_{i}],\ i=1,\cdots,n$, and covariance matrix $K_{\underline X}$ (assumed to be invertible) given by

$$K_{\underline X}=E[(\underline X-\underline\mu)(\underline X-\underline\mu)^{T}]$$

whose entries are the covariances

$$\mathrm{Cov}(X_{i},X_{j})=E[(X_{i}-\mu_{i})(X_{j}-\mu_{j})],\ i,j=1,\cdots,n$$

is Gaussian if its joint pdf is given by

$$f_{\underline X}(\underline x)=\frac{1}{\left(\sqrt{2\pi}\right)^{n}\sqrt{\det(K_{\underline X})}}\,e^{-\frac{1}{2}(\underline x-\underline\mu)^{T}K_{\underline X}^{-1}(\underline x-\underline\mu)},\ \underline x=(x_{1},\cdots,x_{n})^{T}\in\mathbb{R}^{n}$$

## Notation

$$\underline X\sim \mathcal{N}(\underline\mu,K_{\underline X})$$
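As a quick numerical sanity check of the pdf formula, here is a minimal sketch in Python (numpy is assumed; the function and variable names are illustrative, not from any particular library). It evaluates the density directly and, for a diagonal $K_{\underline X}$, confirms that the joint pdf factors into a product of univariate Gaussian pdfs, as independence of the components requires.

```python
import numpy as np

def gaussian_pdf(x, mu, K):
    """Evaluate the multivariate Gaussian pdf at x via the formula above.

    Uses np.linalg.solve instead of explicitly inverting K, which is the
    standard numerically stable way to compute (x - mu)^T K^{-1} (x - mu).
    """
    n = len(mu)
    d = x - mu
    norm = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(K))
    return float(np.exp(-0.5 * d @ np.linalg.solve(K, d)) / norm)

def univariate_pdf(x, mu, var):
    """Standard one-dimensional Gaussian density N(mu, var)."""
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# Diagonal covariance => uncorrelated jointly Gaussian components are
# independent, so the joint pdf must equal the product of the marginals.
mu = np.array([0.0, 1.0])
K = np.diag([2.0, 0.5])          # symmetric positive definite, hence invertible
x = np.array([0.3, 0.7])

joint = gaussian_pdf(x, mu, K)
product = univariate_pdf(x[0], mu[0], 2.0) * univariate_pdf(x[1], mu[1], 0.5)
print(np.isclose(joint, product))  # → True
```

For a non-diagonal $K_{\underline X}$ the factorization no longer holds, but the same `gaussian_pdf` call applies unchanged as long as the covariance matrix stays positive definite.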
