Pairwise Independence (for two collections of events) Let $(\Omega,\mathcal{F},P)$ be a probability space and let $\mathcal{G}_1,\mathcal{G}_2\subseteq\mathcal{F}$. We say $\mathcal{G}_1$ and $\mathcal{G}_2$ are independent if
\[
P(A_1\cap A_2)=P(A_1)\,P(A_2)\qquad\forall A_1\in\mathcal{G}_1,\ \forall A_2\in\mathcal{G}_2.
\]
Joint Independence (for $n$ events) Let $A_1,\dots,A_n\in\mathcal{F}$. We say $A_1,\dots,A_n$ are independent if for every $2\le k\le n$ and $1\le i_1<\dots<i_k\le n$,
\[
P(A_{i_1}\cap\dots\cap A_{i_k})=P(A_{i_1})\cdots P(A_{i_k}).
\]
Independence for countably many events Let $A_1,A_2,\dots$ be a countable sequence of events. We say they are independent if
\[
P(A_{i_1}\cap\dots\cap A_{i_n})=P(A_{i_1})\cdots P(A_{i_n})
\]
for every $n$ and every choice of distinct indices $i_1,\dots,i_n\in\mathbb{N}$.
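A quick sanity check of the pairwise definition: toss a fair coin twice, so $\Omega=\{HH,HT,TH,TT\}$ with the uniform measure, and let $A_1=\{\text{first toss is heads}\}$, $A_2=\{\text{second toss is heads}\}$. Then
\[
P(A_1\cap A_2)=P(\{HH\})=\tfrac14=\tfrac12\cdot\tfrac12=P(A_1)\,P(A_2),
\]
so $A_1$ and $A_2$ are independent (take $\mathcal{G}_1=\{A_1\}$, $\mathcal{G}_2=\{A_2\}$).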
Pairwise Independence (for 2 rvs) For random variables $X,Y$, $X\perp\!\!\!\perp Y$ means
\[
P(X\in S_1,\,Y\in S_2)=P(X\in S_1)\,P(Y\in S_2)\qquad\forall S_1,S_2\in\mathcal{B}(\mathbb{R}).
\]
Joint Independence (for $n$ rvs) Let $X_1,\dots,X_n$ be random variables. They are independent if for any $A_1,\dots,A_n\in\mathcal{B}(\mathbb{R})$,
\[
P(X_1\in A_1,\dots,X_n\in A_n)=P(X_1\in A_1)\cdots P(X_n\in A_n).
\]
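As an illustration of the definition applied to particular Borel sets: if $X\sim\mathrm{Exp}(\lambda)$ and $Y\sim\mathrm{Exp}(\mu)$ are independent, then choosing $S_1=(s,\infty)$ and $S_2=(t,\infty)$ gives the familiar tail factorization
\[
P(X>s,\,Y>t)=P(X>s)\,P(Y>t)=e^{-\lambda s}e^{-\mu t},\qquad s,t\ge 0.
\]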
Random variables $X,Y$ are independent if and only if
\[
E[f(X)g(Y)]=E[f(X)]\,E[g(Y)]
\]
for all bounded Borel functions $f,g:\mathbb{R}\to\mathbb{R}$.
\begin{proof}
Sufficiency. Taking $f=\mathbf{1}_A$ and $g=\mathbf{1}_B$ in the hypothesis, for any $A,B\in\mathcal{B}(\mathbb{R})$,
\[
P(X\in A,\,Y\in B)=E[\mathbf{1}_A(X)\mathbf{1}_B(Y)]=E[\mathbf{1}_A(X)]\,E[\mathbf{1}_B(Y)]=P(X\in A)\,P(Y\in B),
\]
so $X\perp\!\!\!\perp Y$.

Necessity. If $X\perp\!\!\!\perp Y$, then $E[\mathbf{1}_A(X)\mathbf{1}_B(Y)]=E[\mathbf{1}_A(X)]\,E[\mathbf{1}_B(Y)]$ for all $A,B\in\mathcal{B}(\mathbb{R})$. Taking linear combinations of indicators, for simple $f,g$ we get
\[
E[f(X)g(Y)]=E[f(X)]\,E[g(Y)].
\]
If $f,g$ are not simple but $f,g\ge 0$, approximate them from below by simple functions and pass to the limit to obtain the same identity (the approximation is spelled out after the proof). If $f,g$ are signed, write $f=f^+-f^-$ and $g=g^+-g^-$, apply the nonnegative case to each part, and expand by linearity to reach the same conclusion.
\end{proof}
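Concretely, the approximation in the necessity step can be taken to be the standard dyadic one: for bounded Borel $f\ge 0$, set
\[
f_n(x)=\min\!\left(2^{-n}\lfloor 2^n f(x)\rfloor,\,n\right),
\]
so each $f_n$ is simple and $f_n\uparrow f$ pointwise; define $g_n$ from $g$ in the same way. Since $f$ and $g$ are bounded, bounded convergence gives
\[
E[f(X)g(Y)]=\lim_{n\to\infty}E[f_n(X)g_n(Y)]=\lim_{n\to\infty}E[f_n(X)]\,E[g_n(Y)]=E[f(X)]\,E[g(Y)].
\]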
If $(X_i)_{i=1}^n$ are independent and $f_i(X_i)\in L^1$ for $1\le i\le n$, then $\prod_{i=1}^n f_i(X_i)\in L^1$ and
\[
E\!\left[\prod_{i=1}^n f_i(X_i)\right]=\prod_{i=1}^n E[f_i(X_i)].
\]
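For example, if $X_1,\dots,X_n$ are independent with $X_i\sim\mathrm{Bernoulli}(p_i)$, taking $f_i(x)=x$ (bounded, hence $f_i(X_i)\in L^1$) gives
\[
E\!\left[\prod_{i=1}^n X_i\right]=\prod_{i=1}^n E[X_i]=p_1\cdots p_n,
\]
which is exactly $P(X_1=1,\dots,X_n=1)$, consistent with the event form of independence.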
Note
Pairwise independence does NOT imply joint independence.
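A standard counterexample: let $X,Y$ be independent fair coin flips with values in $\{0,1\}$, and set $A=\{X=1\}$, $B=\{Y=1\}$, $C=\{X=Y\}$. Each event has probability $\tfrac12$ and each pairwise intersection has probability $\tfrac14$, so the three events are pairwise independent, but
\[
P(A\cap B\cap C)=P(X=1,\,Y=1)=\tfrac14\neq\tfrac18=P(A)P(B)P(C).
\]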
Random variables $X_1,\dots,X_n$ with joint pmf (or joint density) $p_{X_1,\dots,X_n}$ are independent if and only if
\[
p_{X_1,\dots,X_n}(x_1,\dots,x_n)=p_{X_1}(x_1)\cdots p_{X_n}(x_n)
\]
for all $x_1,\dots,x_n$ (for densities, for almost every $(x_1,\dots,x_n)$).
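For instance, if $(X,Y)$ has the standard bivariate normal density with zero correlation, the joint density factors as
\[
p_{X,Y}(x,y)=\frac{1}{2\pi}e^{-(x^2+y^2)/2}=\left(\frac{1}{\sqrt{2\pi}}e^{-x^2/2}\right)\left(\frac{1}{\sqrt{2\pi}}e^{-y^2/2}\right)=p_X(x)\,p_Y(y),
\]
so $X\perp\!\!\!\perp Y$.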