
Law of Large Numbers


The Weak one

If $\{X_i\}_{i=1}^\infty$ are independent in $\mathscr{L}^{2}(\Omega,\mathcal{F},\mathbb{P})$ with $\mathbb{E}[X_{i}]=\mu$ and $\operatorname{Var}(X_{i})\le V<\infty$ for all $i\ge 1$, then
$$\frac{1}{n}\sum_{i=1}^{n}X_{i}\xrightarrow{n\to\infty}\mu \quad \text{in probability},$$
that is,
$$\mathbb{P}\left( \left| \frac{1}{n}\sum_{i=1}^{n}X_{i}-\mu \right| \ge \epsilon \right)\to 0,\quad\forall\epsilon>0.$$

\begin{proof}
It suffices to prove the case $\mu=0$ (otherwise replace $X_{i}$ by $X_{i}-\mu$). For $\mu=0$,
\begin{align*}
\mathbb{P}\left( \left| \frac{1}{n}\sum_{i=1}^{n}X_{i} \right| \ge\epsilon\right)
&\le \frac{1}{\epsilon^{2}}\,\mathbb{E}\left[ \left| \frac{1}{n}\sum_{i=1}^{n}X_{i} \right|^{2} \right] &&\text{Chebyshev}\\
&= \frac{1}{(\epsilon n)^{2}}\sum_{i=1}^{n}\mathbb{E}[X_{i}^{2}] &&\text{independence}\\
&\le \frac{Vn}{\epsilon^{2}n^{2}}= \frac{V}{\epsilon^{2}n}=O_{\epsilon}\left( \frac{1}{n} \right)\to 0,
\end{align*}
where the first inequality is Chebyshev's inequality, the second equality holds because independence makes the cross terms $\mathbb{E}[X_{i}X_{j}]$, $i\ne j$, vanish, and the last inequality uses the uniform bound on the variances.
\end{proof}
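As a quick numerical sanity check (a sketch I'm adding here, not part of the proof): the snippet below estimates $\mathbb{P}\left(\left|\frac{1}{n}\sum_{i=1}^{n}X_{i}-\mu\right|\ge\epsilon\right)$ by Monte Carlo for i.i.d. $\mathrm{Uniform}(0,1)$ variables and compares it with the Chebyshev bound $V/(\epsilon^{2}n)$ from the proof; the distribution, $\epsilon$, and sample sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.05
mu, V = 0.5, 1 / 12          # Uniform(0,1): mean 1/2, variance 1/12
trials = 1_000               # Monte Carlo repetitions per n

for n in (10, 100, 1_000, 10_000):
    # empirical frequency of the deviation event |S_n/n - mu| >= eps
    means = rng.uniform(0.0, 1.0, size=(trials, n)).mean(axis=1)
    p_hat = np.mean(np.abs(means - mu) >= eps)
    bound = min(V / (eps**2 * n), 1.0)  # Chebyshev bound, capped at 1
    print(f"n={n:>6}  P_hat={p_hat:.4f}  Chebyshev<={bound:.4f}")
```

The Chebyshev column decays like $1/n$, exactly the $O_{\epsilon}(1/n)$ rate in the proof; the empirical column decays much faster (for bounded variables the truth is exponential, by Hoeffding), so the bound is loose but already enough for the weak law.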

The Strong one

Let $X_n$, $n\in \mathbb{N}$, be a sequence of i.i.d. random variables on $(\Omega,\mathcal{F},\mathbb{P})$ with $|X_{i}|\le C$ for all $i\ge 1$, and let $\mu=\mathbb{E}[X_{i}]$. Then
$$\mathbb{P}\left(\lim_{n\to\infty}\frac{1}{n}\sum_{i=1}^{n}X_{i}=\mu\right)=1.$$

\begin{proof}
Again we reduce to the case $\mu=0$. Let $\sigma^{2}=\mathbb{E}[X_{i}^{2}]<\infty$ (finite since $|X_{i}|\le C$). We first note that
$$\mathbb{E}\left[ \left| \frac{X_{1}+\dots+X_{n}}{n} \right|^{2} \right]=\frac{1}{n^{2}}(\sigma^{2}+\dots+\sigma^{2})=\frac{\sigma^{2}}{n}.$$
Let $Z_{n}=\left| \frac{X_{1}+\dots+X_{n}}{n} \right|^{2}$. We then note that
$$\sum_{m\ge 1}\mathbb{E}[Z_{m^{2}}]=\sum_{m\ge 1} \frac{\sigma^{2}}{m^{2}}<\infty.$$
So by the Beppo Levi theorem we have that:
- $\sum_{m\ge 1}Z_{m^{2}}<\infty$ a.s., and hence
- $Z_{m^{2}}\to 0$ a.s.

Now let $n\to\infty$ and pick $m=m(n)$ with $m^{2}\le n<(m+1)^{2}$, so that
$$n-m^{2}\le (m+1)^{2}-m^{2}=2m+1.$$
Since $|X_{i}|\le C$, the leftover terms $X_{m^{2}+1},\dots,X_{n}$ contribute at most $C(2m+1)$ in absolute value, and $n\ge m^{2}$, so
$$\left| \frac{X_{1}+\dots+X_{n}}{n} \right| \le \left| \frac{X_{1}+\dots+X_{m^{2}}}{m^{2}} \right| + O\left( \frac{1}{m} \right)\to 0 \quad \text{a.s.}$$

\end{proof}

# Intuition

The running averages of a sequence of i.i.d. RVs with finite expectation converge to the mean almost surely (i.e. w.p. 1): not just each fixed-$n$ average being probably close to $\mu$, but the entire trajectory settling onto $\mu$ on almost every sample path.
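To see the almost-sure statement on a single realization (again just an illustrative sketch with arbitrary parameters): the snippet below draws one sample path of bounded i.i.d. variables with $C=1$ and $\mu=0$, and prints the running means along the full sequence as well as along the subsequence $n=m^{2}$ on which the proof first establishes convergence.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
# one fixed sample path of bounded i.i.d. variables: |X_i| <= C = 1, mu = 0
x = rng.uniform(-1.0, 1.0, size=N)
running_mean = np.cumsum(x) / np.arange(1, N + 1)

# running means along the full sequence ...
for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"n={n:>6}  mean={running_mean[n - 1]:+.5f}")

# ... and along the subsequence n = m^2 from the proof (Z_{m^2} -> 0 a.s.)
for m in (3, 10, 30, 100, 300):
    print(f"m={m:>3}  mean at n=m^2: {running_mean[m * m - 1]:+.5f}")
```

Unlike the weak-law experiment above, which averages over many independent repetitions, this is a single $\omega$: the whole trajectory drifts to $\mu=0$, which is exactly what convergence with probability 1 asserts.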
