Definition (Independent events)
Pairwise independence (two collections of events). Let $(\Omega,\mathcal F,\mathbb P)$ be a probability space and let $\mathcal G_1,\mathcal G_2\subset\mathcal F$. We say $\mathcal G_1$ and $\mathcal G_2$ are independent if
\[
\mathbb P(A_1\cap A_2)=\mathbb P(A_1)\,\mathbb P(A_2)
\quad\forall A_1\in\mathcal G_1,\ \forall A_2\in\mathcal G_2.
\]

Joint independence (finitely many events). Let $A_1,\dots,A_n\in\mathcal F$. The events $A_1,\dots,A_n$ are independent if for every $2\le k\le n$ and all indices $1\le i_1<\dots<i_k\le n$,
\[
\mathbb P(A_{i_1}\cap\dots\cap A_{i_k})=\mathbb P(A_{i_1})\cdots\mathbb P(A_{i_k}).
\]

Independence for countably many events. A countable sequence of events $A_1,A_2,\dots$ is independent if every finite subcollection is independent, i.e.
\[
\mathbb P(A_{i_1}\cap\dots\cap A_{i_n})=\mathbb P(A_{i_1})\cdots\mathbb P(A_{i_n})
\quad\text{for every finite }\{i_1,\dots,i_n\}\subset\mathbb N.
\]
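As a quick sanity check of the definition (not part of the notes), the following sketch verifies $\mathbb P(A\cap B)=\mathbb P(A)\,\mathbb P(B)$ on a concrete finite probability space, using exact rational arithmetic; the two-coin sample space and the events $A,B$ are illustrative choices.

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin flips, uniform measure (illustrative example).
omega = list(product("HT", repeat=2))
P = {w: Fraction(1, 4) for w in omega}

def prob(event):
    """P(E) = sum of point masses over the outcomes in E."""
    return sum(P[w] for w in event)

A = {w for w in omega if w[0] == "H"}   # first flip is heads
B = {w for w in omega if w[1] == "H"}   # second flip is heads

# Independence: P(A ∩ B) = P(A) P(B), checked exactly.
assert prob(A & B) == prob(A) * prob(B)
```

Working with `Fraction` instead of floats makes the independence identity an exact equality rather than a floating-point comparison.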
Definition (Independent random variables)
Pairwise independence (two random variables). For random variables $X,Y$, we write $X\perp\!\!\!\perp Y$ if
\[
\mathbb P(X\in S_1,\,Y\in S_2)=\mathbb P(X\in S_1)\,\mathbb P(Y\in S_2)
\quad\forall S_1,S_2\in\mathcal B(\mathbb R).
\]

Joint independence (finitely many random variables). Let $X_1,\dots,X_n$ be random variables. They are independent if for all $A_1,\dots,A_n\in\mathcal B(\mathbb R)$,
\[
\mathbb P(X_1\in A_1,\dots,X_n\in A_n)=\mathbb P(X_1\in A_1)\cdots\mathbb P(X_n\in A_n).
\]
Theorem
Random variables $X,Y$ are independent if and only if
\[
\mathbb E[f(X)g(Y)]=\mathbb E[f(X)]\,\mathbb E[g(Y)]
\]
for all bounded Borel-measurable functions $f,g:\mathbb R\to\mathbb R$.
\begin{proof}
Sufficiency. Assume the expectation identity holds for all bounded Borel-measurable $f,g$. Taking $f=\mathbf 1_A$ and $g=\mathbf 1_B$ for $A,B\in\mathcal B(\mathbb R)$,
\[
\mathbb P(X\in A,\,Y\in B)=\mathbb E[\mathbf 1_A(X)\mathbf 1_B(Y)]
=\mathbb E[\mathbf 1_A(X)]\,\mathbb E[\mathbf 1_B(Y)]
=\mathbb P(X\in A)\,\mathbb P(Y\in B),
\]
so $X$ and $Y$ are independent.

Necessity. If $X\perp\!\!\!\perp Y$, then $\mathbb E[\mathbf 1_A(X)\mathbf 1_B(Y)]=\mathbb E[\mathbf 1_A(X)]\,\mathbb E[\mathbf 1_B(Y)]$ for all $A,B\in\mathcal B(\mathbb R)$. Taking linear combinations of indicators, the identity
\[
\mathbb E[f(X)g(Y)]=\mathbb E[f(X)]\,\mathbb E[g(Y)]
\]
holds for all simple $f,g$. If $f,g\ge 0$ are not simple, approximate each from below by an increasing sequence of simple functions and pass to the limit (monotone convergence) to obtain the same identity. Finally, if $f,g$ are signed, split each into positive and negative parts, $f=f^+-f^-$ and $g=g^+-g^-$, and apply the previous case to each of the four products.
\end{proof}
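The indicator step of the proof can be checked exactly on a finite example (not from the notes): for two independent fair dice and illustrative Borel sets $A,B$, the expectation of the product of indicators equals the product of the expectations.

```python
from fractions import Fraction
from itertools import product

# X, Y: two independent fair dice; joint law is the product of uniforms.
joint = {(x, y): Fraction(1, 36) for x, y in product(range(1, 7), repeat=2)}

ind_A = lambda x: 1 if x in {1, 2} else 0   # 1_A with A = {1, 2} (illustrative)
ind_B = lambda y: 1 if y >= 4 else 0        # 1_B with B = [4, ∞) (illustrative)

E_prod = sum(p * ind_A(x) * ind_B(y) for (x, y), p in joint.items())
E_A = sum(p * ind_A(x) for (x, y), p in joint.items())
E_B = sum(p * ind_B(y) for (x, y), p in joint.items())

# E[1_A(X) 1_B(Y)] = P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B)
assert E_prod == E_A * E_B
```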
Proposition
If $(X_i)_{i=1}^n$ are independent and $f_i(X_i)\in L^1$ for $1\le i\le n$, then $\prod_{i=1}^n f_i(X_i)\in L^1$ and
\[
\mathbb E\Big[\prod_{i=1}^n f_i(X_i)\Big]=\prod_{i=1}^n \mathbb E[f_i(X_i)].
\]
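The product formula can be verified exactly for a small discrete case (an illustrative choice, not from the notes): $X_1,X_2$ independent fair dice with $f_1(x)=x$ and $f_2(x)=x^2$.

```python
from fractions import Fraction
from itertools import product

die = range(1, 7)
# Joint law of two independent fair dice: product of the marginals.
joint = {(x, y): Fraction(1, 36) for x, y in product(die, die)}

f1 = lambda x: x        # illustrative f_1
f2 = lambda x: x * x    # illustrative f_2

# E[f1(X1) f2(X2)] computed from the joint law ...
lhs = sum(p * f1(x) * f2(y) for (x, y), p in joint.items())
# ... equals E[f1(X1)] E[f2(X2)] computed from the marginals.
rhs = (sum(Fraction(1, 6) * f1(x) for x in die)
       * sum(Fraction(1, 6) * f2(y) for y in die))
assert lhs == rhs
```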
Note
Pairwise independence does NOT imply joint independence.
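The standard counterexample (a classical construction, sketched here with illustrative code) takes two independent fair bits $X,Y$ and their XOR $Z = X \oplus Y$: the events $\{X=1\}$, $\{Y=1\}$, $\{Z=1\}$ are pairwise independent, but the triple intersection has probability $0 \ne \tfrac18$.

```python
from fractions import Fraction
from itertools import product

omega = list(product([0, 1], repeat=2))      # outcomes (X, Y), fair bits
P = {w: Fraction(1, 4) for w in omega}
prob = lambda E: sum(P[w] for w in E)

A = {w for w in omega if w[0] == 1}          # {X = 1}
B = {w for w in omega if w[1] == 1}          # {Y = 1}
C = {w for w in omega if w[0] ^ w[1] == 1}   # {X XOR Y = 1}

# Every pair of events is independent ...
assert prob(A & B) == prob(A) * prob(B)
assert prob(A & C) == prob(A) * prob(C)
assert prob(B & C) == prob(B) * prob(C)
# ... but the triple is not: A ∩ B forces Z = 0, so A ∩ B ∩ C is empty.
assert prob(A & B & C) != prob(A) * prob(B) * prob(C)
```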
Theorem
Discrete random variables $X_1,\dots,X_n$ are independent if and only if their joint probability mass function factorizes into the marginals:
\[
p_{X_1,\dots,X_n}(x_1,\dots,x_n)=p_{X_1}(x_1)\cdots p_{X_n}(x_n)
\quad\text{for all }x_1,\dots,x_n.
\]
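For a pair that is clearly not independent, the factorization must fail at some point. A minimal sketch (illustrative example, not from the notes): $X$ uniform on $\{0,1\}$ and $Y=X$, so the joint pmf concentrates on the diagonal.

```python
from fractions import Fraction

# Dependent pair: X uniform on {0, 1}, Y = X (illustrative).
pXY = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 2),
       (0, 1): Fraction(0), (1, 0): Fraction(0)}

# Marginal pmfs obtained by summing the joint over the other coordinate.
pX = {x: sum(p for (a, b), p in pXY.items() if a == x) for x in (0, 1)}
pY = {y: sum(p for (a, b), p in pXY.items() if b == y) for y in (0, 1)}

# Factorization fails at (0, 1): the joint pmf is 0 there,
# but the product of the marginals is 1/4, so X and Y are not independent.
assert pXY[(0, 1)] != pX[0] * pY[1]
```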