# Kolmogorov 0-1 Law


# Events

>[!thm] Kolmogorov 0-1 Law (events)
>For independent $(A_{n})_{n\in\mathbb{N}}$ with tail field $\tau$, if $A\in\tau$ then $\mathbb{P}(A)\in\{ 0,1 \}$.
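As a concrete illustration (a standard example, not part of the statement above): for any $n$, the event that infinitely many of the $A_{m}$ occur lies in $\sigma(A_{n},A_{n+1},\dots)$, hence in $\tau$, so the theorem applies to it:
$$\limsup_{n\to\infty} A_{n}=\bigcap_{n\in\mathbb{N}}\bigcup_{m\ge n}A_{m}\in\tau\quad\implies\quad \mathbb{P}\left(\limsup_{n\to\infty} A_{n}\right)\in\{ 0,1 \}.$$
This is why, for independent events, Borel–Cantelli-type statements always produce probability $0$ or $1$.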

\begin{proof} (Of the theorem.) To prove this we need the following lemma:

>[!lemma|3.5.2]
>For $B,(B_{n})_{n\in \mathbb{N}}$ independent, any $S\in\sigma(B_{1},B_{2},\dots)$ is independent of $B$.
\begin{proof} (Of Lemma 3.5.2.) To show independence we must show $\mathbb{P}(S\cap B)=\mathbb{P}(S)\mathbb{P}(B)$. If $\mathbb{P}(B)=0$ then the claim is trivial, so let $\mathbb{P}(B)>0$. Define the semialgebra
$$\mathcal{J}=\{ D_{i_{1}}\cap\dots \cap D_{i_{n}}:D_{i_{\ell}}=B_{i_{\ell}}\text{ or }B_{i_{\ell}}^{c},\ n\ge 1 \}\cup \{ \emptyset,\Omega \}.$$
Then, for $A\in \mathcal{J}$, independence gives
$$\mathbb{P}(A)= \frac{\mathbb{P}(B\cap A)}{\mathbb{P}(B)}=\mathbb{P}(A\mid B).$$
Now define a new probability measure $\mathbb{Q}:\sigma(B_{1},B_{2},\dots)\to[0,1]$ by
$$\mathbb{Q}(S)= \frac{\mathbb{P}(S\cap B)}{\mathbb{P}(B)},\quad\text{for }S\in\sigma(B_{1},B_{2},\dots).$$
Then $\mathbb{Q}$ is a probability measure, since $\mathbb{Q}(\emptyset)=0$, $\mathbb{Q}(\Omega)=1$, and $\sigma$-additivity is inherited from $\mathbb{P}$. But $\mathbb{Q}$ and $\mathbb{P}$ agree on $\mathcal{J}$, and since $\mathcal{J}\subseteq\sigma(B_{1},B_{2},\dots)$, the Extension Theorem gives that they agree on $\sigma(\mathcal{J})=\sigma(B_{1},B_{2},\dots)$. Thus
$$\mathbb{P}(S)=\mathbb{Q}(S)= \frac{\mathbb{P}(S\cap B)}{\mathbb{P}(B)},\quad\forall S\in\sigma(B_{1},B_{2},\dots),$$
i.e. $\mathbb{P}(S\cap B)=\mathbb{P}(S)\mathbb{P}(B)$. \end{proof}
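A quick numerical sanity check of the lemma (a toy setup of my own, not from the text): take three independent events $B,B_{1},B_{2}$ with arbitrary probabilities on a finite product space, pick $S=B_{1}\triangle B_{2}\in\sigma(B_{1},B_{2})$, and verify $\mathbb{P}(S\cap B)=\mathbb{P}(S)\mathbb{P}(B)$ by exact enumeration.

```python
from itertools import product

# Independent events B, B1, B2 modeled as independent biased coin flips.
# The marginals below are arbitrary choices for the demo.
pB, p1, p2 = 0.3, 0.6, 0.25

def prob(event):
    """Exact probability of `event` under the product measure on {0,1}^3."""
    total = 0.0
    for b, b1, b2 in product([0, 1], repeat=3):
        w = ((pB if b else 1 - pB)
             * (p1 if b1 else 1 - p1)
             * (p2 if b2 else 1 - p2))
        if event(b, b1, b2):
            total += w
    return total

# S = B1 symmetric-difference B2, an event in sigma(B1, B2).
S = lambda b, b1, b2: b1 != b2
B = lambda b, b1, b2: b == 1

lhs = prob(lambda b, b1, b2: S(b, b1, b2) and B(b, b1, b2))  # P(S ∩ B)
rhs = prob(S) * prob(B)                                       # P(S) P(B)
print(abs(lhs - rhs) < 1e-12)  # True
```

Of course this only checks one $S$ on one finite space; the lemma's content is that the factorization extends from the generating semialgebra $\mathcal{J}$ to the whole $\sigma$-algebra.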

Applying this lemma twice we get the following corollary:

>[!corollary|3.5.3]
>Let $(A_{n})_{n\in \mathbb{N}},(B_{n})_{n\in \mathbb{N}}$ be mutually independent events. If $S\in\sigma(A_{1},A_{2},\dots)$ then $S,B_{1},B_{2},\dots$ are independent.
\begin{proof} (Of Corollary 3.5.3.) For $i_{1},\dots,i_{n}$, let $B=B_{i_{1}}\cap\dots \cap B_{i_{n}}$. Note $B,A_{1},A_{2},\dots$ are independent. Thus
$$S\in\sigma(A_{1},A_{2},\dots)\implies \mathbb{P}(B\cap S)=\mathbb{P}(B)\mathbb{P}(S)\implies S,B_{1},B_{2},\dots \text{ are independent,}$$
where the first implication is by Lemma 3.5.2. \end{proof}

Now we can finish proving the theorem. For any $n$, $A\in\tau$ gives $A\in\sigma(A_{n},A_{n+1},\dots)$, so Corollary 3.5.3 tells us that
$$A,A_{1},A_{2},\dots,A_{n-1}\text{ are independent.}$$
Since this holds $\forall n$, we have that $A,A_{1},A_{2},\dots$ are independent. Then Lemma 3.5.2 gives us that
$$\text{for }S\in\sigma(A_{1},A_{2},\dots),\text{ we have }S\perp\!\!\!\perp A.$$
But $A\in \tau \subseteq\sigma(A_{1},A_{2},\dots)$, so taking $S=A$ above:
$$A\perp\!\!\!\perp A\implies \mathbb{P}(A)=\mathbb{P}(A\cap A)=\mathbb{P}(A)^{2}\implies \mathbb{P}(A)=0\text{ or }\mathbb{P}(A)=1.$$
\end{proof}

# Random Variables

>[!thm] Kolmogorov 0-1 Law (rvs)
>Let $(X_{n})_{n\in\mathbb{N}}$ be independent. Let $\mathcal{F}_{n}=\sigma(X_{0},\dots,X_{n})$ for all $n\in\mathbb{N}$, and let $\mathcal{G}_{n}=\sigma(X_{n+1},X_{n+2},\dots)$ for all $n\in\mathbb{N}$ (so $\mathcal{G}_{n}\supset\mathcal{G}_{n+1}\supset\dots$). Let
>$$\mathcal{G}=\bigcap_{n\in\mathbb{N}}\mathcal{G}_{n},$$
>where $\mathcal{G}$ is called the $\sigma$-algebra of tail events. We have that $\mathbb{P}(A)\in\{ 0,1 \}$ for all $A\in\mathcal{G}$.
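For instance (a standard example, not stated in the note): whether the series $\sum_{n}X_{n}$ converges does not depend on any finite prefix $X_{0},\dots,X_{n}$ of the sequence, so
$$\left\{ \omega : \sum_{n}X_{n}(\omega)\text{ converges} \right\}\in\mathcal{G}_{n}\ \forall n\quad\implies\quad \left\{ \omega : \sum_{n}X_{n}(\omega)\text{ converges} \right\}\in\mathcal{G},$$
and hence, by the theorem, a series of independent random variables converges either almost surely or almost never.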