# Events
>[!thm] Kolmogorov 0-1 Law (events)
>For independent $(A_n)_{n\in\mathbb{N}}$ with tail field $\tau$: if $A\in\tau$ then $P(A)\in\{0,1\}$.
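As a standard illustration (an example added here, not used in the proof below): the event that infinitely many of the $A_n$ occur,
$$\{A_n \text{ i.o.}\}=\limsup_{n} A_n=\bigcap_{N\in\mathbb{N}}\bigcup_{n\ge N}A_n,$$
lies in $\sigma(A_m,A_{m+1},\dots)$ for every $m$, hence in $\tau$; the theorem therefore forces $P(A_n\text{ i.o.})\in\{0,1\}$, consistent with the Borel–Cantelli lemmas.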
\begin{proof} (Of the 0-1 Law for events.) To prove this we need the following lemma:
>[!lemma|3.5.2]
>For $B,(B_n)_{n\in\mathbb{N}}$ independent, any $S\in\sigma(B_1,B_2,\dots)$ is independent of $B$.
\begin{proof} (Of Lemma 3.5.2.) To show independence we must show $P(S\cap B)=P(S)P(B)$. If $P(B)=0$ then the proof is trivial, so let $P(B)>0$. We define the semialgebra
$$\mathcal{J}=\{D_{i_1}\cap\dots\cap D_{i_n} : D_{i_\ell}=B_{i_\ell}\text{ or }B_{i_\ell}^c,\ n\ge 1\}\cup\{\emptyset,\Omega\}.$$
Then, for $A\in\mathcal{J}$, we have by independence
$$P(A)=\frac{P(B\cap A)}{P(B)}=P(A\mid B).$$
Now we define a new probability measure $Q:\sigma(B_1,B_2,\dots)\to[0,1]$ by
$$Q(S)=\frac{P(S\cap B)}{P(B)},\quad\text{for }S\in\sigma(B_1,B_2,\dots).$$
Then $Q$ is a probability measure, since $Q(\emptyset)=0$, $Q(\Omega)=1$, and $\sigma$-additivity holds because $P$ is a probability measure. But $Q$ and $P$ agree on $\mathcal{J}$, and since $\mathcal{J}\subseteq\sigma(B_1,B_2,\dots)$, the Extension Theorem gives that they agree on $\sigma(\mathcal{J})=\sigma(B_1,B_2,\dots)$. Thus
$$P(S)=Q(S)=\frac{P(S\cap B)}{P(B)},\quad\forall S\in\sigma(B_1,B_2,\dots),$$
which rearranges to $P(S\cap B)=P(S)P(B)$. \end{proof}
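As a quick sanity check (a computation added here, not in the original): for $A=B_{i_1}\cap B_{i_2}\in\mathcal{J}$, independence of $B,B_1,B_2,\dots$ gives
$$Q(A)=\frac{P(B_{i_1}\cap B_{i_2}\cap B)}{P(B)}=\frac{P(B_{i_1})P(B_{i_2})P(B)}{P(B)}=P(B_{i_1}\cap B_{i_2})=P(A),$$
matching the claim that $Q=P$ on $\mathcal{J}$.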
Applying this lemma twice we get the following corollary:
>[!corollary|3.5.3]
>Let $(A_n)_{n\in\mathbb{N}},(B_n)_{n\in\mathbb{N}}$ be mutually independent sequences. If $S\in\sigma(A_1,A_2,\dots)$ then $S,B_1,B_2,\dots$ are independent.
\begin{proof} (Of Corollary 3.5.3.) For any $i_1,\dots,i_n$, let $B=B_{i_1}\cap\dots\cap B_{i_n}$. Note that $B,A_1,A_2,\dots$ are independent. Thus
$$S\in\sigma(A_1,A_2,\dots)\implies P(B\cap S)=P(B)P(S)\implies S,B_1,B_2,\dots\text{ are independent},$$
where the first implication is by Lemma 3.5.2, and the second holds because the factorization holds for every finite intersection of the $B_i$'s, which is exactly the definition of independence of $S,B_1,B_2,\dots$ \end{proof}
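The step "$B,A_1,A_2,\dots$ are independent" can be verified directly (a verification added here, using only the mutual independence of the two sequences): for any $A_{j_1},\dots,A_{j_m}$,
$$P(B\cap A_{j_1}\cap\dots\cap A_{j_m})=P(B_{i_1})\cdots P(B_{i_n})\,P(A_{j_1})\cdots P(A_{j_m})=P(B)\,P(A_{j_1})\cdots P(A_{j_m}).$$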
Now we can finish proving the theorem. If, for some $n$, $A\in\sigma(A_n,A_{n+1},\dots)$, then Corollary 3.5.3 gives us that $A,A_1,A_2,\dots,A_{n-1}$ are independent. Since $A\in\tau$ means this holds $\forall n$, we have that $A,A_1,A_2,\dots$ are independent. Then Lemma 3.5.2 gives us that for $S\in\sigma(A_1,A_2,\dots)$ we have $S\perp\!\!\!\perp A$. But $A\in\tau\subseteq\sigma(A_1,A_2,\dots)$, so $A\in\sigma(A_1,A_2,\dots)$, and by the logic above we have
$$A\perp\!\!\!\perp A\implies P(A)=P(A\cap A)=P(A)^2\implies P(A)=0\text{ or }P(A)=1.$$
\end{proof}

# Random Variables

>[!thm] Kolmogorov 0-1 Law (rvs)
>Let $(X_n)_{n\in\mathbb{N}}$ be independent. Let $\mathcal{F}_n=\sigma(X_0,\dots,X_n)$, $\forall n\in\mathbb{N}$, and let $\mathcal{G}_n=\sigma(X_{n+1},X_{n+2},\dots)$, $\forall n\in\mathbb{N}$ (so $\mathcal{G}_n\supset\mathcal{G}_{n+1}\supset\dots$). Let
>$$\mathcal{G}=\bigcap_{n\in\mathbb{N}}\mathcal{G}_n,$$
>where $\mathcal{G}$ is called the $\sigma$-algebra of tail events. Then $P(A)\in\{0,1\}$, $\forall A\in\mathcal{G}$.
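For intuition (standard examples, added here rather than taken from the text above): events such as
$$\Big\{\omega : \sum_{n} X_n(\omega)\text{ converges}\Big\}\quad\text{and}\quad\Big\{\omega : \limsup_{n\to\infty}\tfrac{1}{n}\textstyle\sum_{k\le n}X_k(\omega)\ge c\Big\}$$
lie in $\mathcal{G}$, since neither is affected by changing finitely many of the $X_k$; the theorem therefore says each has probability $0$ or $1$.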