
Markov chain

#Probability #InfoTheory

>[!def] Markov chain
>Let $X_n,\ n\in\mathbb{Z}^+$ be a stochastic process with a countable state space $S$. $X_n,\ n\ge0$ is a discrete-time Markov chain if for any $n\ge0$ and any states $i_0,\cdots,i_n,i_{n+1}\in S$,
>$$P(X_{n+1}=i_{n+1}\mid X_0=i_0,\cdots,X_n=i_n)=P(X_{n+1}=i_{n+1}\mid X_n=i_n)$$
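The defining property says the next state depends only on the current state. A minimal simulation sketch, assuming a hypothetical two-state chain with a made-up transition matrix `P` (not from the note):

```python
import random

# Hypothetical chain on S = {0, 1}; P[i][j] = P(X_{n+1} = j | X_n = i).
P = [[0.9, 0.1],
     [0.4, 0.6]]

def step(state, rng):
    # The next state is sampled using ONLY the current state --
    # this is exactly the Markov property in the definition.
    return 0 if rng.random() < P[state][0] else 1

rng = random.Random(0)
path = [0]                      # start in state 0
for _ in range(10):
    path.append(step(path[-1], rng))
```

The full history `path` is kept only for inspection; `step` never looks at anything but the last state.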

Theorem (Joint distribution of Markov chain)

Denote by $\lambda$ the probability mass function of $X_0$, and by $P$ the transition matrix of a Markov chain $X_n,\ n\ge0$. Then for any $N\ge0$ and any states $i_0,\cdots,i_N\in S$,
$$P(X_0=i_0,\cdots,X_N=i_N)=\lambda_{i_0}p_{i_0i_1}\cdots p_{i_{N-1}i_N}$$
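The theorem reduces a joint probability to a product of one-step factors. A small sketch, again using a hypothetical initial distribution `lam` and transition matrix `P` on two states:

```python
# Hypothetical initial pmf lambda and transition matrix on S = {0, 1}.
lam = [0.5, 0.5]
P = [[0.9, 0.1],
     [0.4, 0.6]]

def joint_prob(states):
    # P(X_0=i_0, ..., X_N=i_N) = lambda_{i_0} * p_{i_0 i_1} * ... * p_{i_{N-1} i_N}
    prob = lam[states[0]]
    for i, j in zip(states, states[1:]):
        prob *= P[i][j]
    return prob

# Sanity check: the joint probabilities of all length-2 paths sum to 1.
total = sum(joint_prob([i, j]) for i in (0, 1) for j in (0, 1))
```

For instance `joint_prob([0, 0, 1])` evaluates to $\lambda_0\, p_{00}\, p_{01} = 0.5 \cdot 0.9 \cdot 0.1 = 0.045$.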

Definition (MM’th Order Markov Chain)

The source $\{X_i\}_{i=1}^{\infty}$ is called an $M$'th order Markov chain (or a Markov source of memory $M$), where $M\ge1$ is a fixed integer, if
$$P(X_i=a_i\mid X^{i-1}=a^{i-1})=P(X_i=a_i\mid X_{i-1}=a_{i-1},\cdots,X_{i-M}=a_{i-M}) \quad \forall i>M,\ a^i\in\mathcal{X}^i$$

Remark

If $\{X_i\}_{i=1}^{\infty}$ is an $M$'th order Markov chain with alphabet $\mathcal{X}$, then it can be converted into a (first-order) Markov chain $\{Z_i\}_{i=1}^{\infty}$ with alphabet $\mathcal{X}^M$, where $Z_i:=(X_i,X_{i+1},\cdots,X_{i+M-1}),\ i\ge1$.
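The blocking in the remark can be sketched directly: each $Z_i$ is the length-$M$ window starting at position $i$, so consecutive blocks overlap in $M-1$ symbols. A minimal illustration on a hypothetical finite sequence:

```python
def to_first_order(xs, M):
    # Map a realization (x_1, x_2, ...) of an M'th order chain to the
    # blocked sequence Z_i = (x_i, ..., x_{i+M-1}) over alphabet X^M.
    return [tuple(xs[i:i + M]) for i in range(len(xs) - M + 1)]

zs = to_first_order(['a', 'b', 'a', 'a', 'b'], M=2)
# zs = [('a','b'), ('b','a'), ('a','a'), ('a','b')]
```

Since $Z_{i+1}$ shares its first $M-1$ coordinates with $Z_i$, conditioning on $Z_i$ carries all the memory the $M$'th order chain needs, which is why $\{Z_i\}$ is first-order Markov.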
