#Probability #InfoTheory

>[!def] Markov chain
>Let $\{X_n\}_{n \ge 0}$ be a stochastic process with a countable state space $\mathcal{X}$. $\{X_n\}$ is a discrete-time Markov chain if for any $n \ge 0$ and any states $x_0, \dots, x_{n+1} \in \mathcal{X}$,
>$$P(X_{n+1} = x_{n+1} \mid X_n = x_n, \dots, X_0 = x_0) = P(X_{n+1} = x_{n+1} \mid X_n = x_n).$$
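The definition above can be sketched in code: a chain whose next state is sampled using only the current state. The weather states and transition matrix below are illustrative assumptions, not from the source.

```python
import random

# Illustrative two-state chain; P[x][y] = P(X_{n+1} = y | X_n = x).
states = ["sunny", "rainy"]
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(x, rng):
    """Sample X_{n+1} given X_n = x: the future depends only on x."""
    r = rng.random()
    cum = 0.0
    for y, p in P[x].items():
        cum += p
        if r < cum:
            return y
    return y  # guard against floating-point rounding at the boundary

def simulate(x0, n, seed=0):
    """Simulate n steps of the chain starting from X_0 = x0."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Note that `step` never looks at the earlier history: that is exactly the Markov property.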
Theorem (Joint distribution of Markov chain)
Denote by $\mu$ the probability mass function of $X_0$, and by $P$ the transition matrix of a Markov chain $\{X_n\}$. Then for any $n \ge 1$ and any states $x_0, \dots, x_n \in \mathcal{X}$,
$$P(X_0 = x_0, \dots, X_n = x_n) = \mu(x_0) \prod_{k=1}^{n} P(x_{k-1}, x_k).$$
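A small numerical check of the product formula; `mu` and `P` below are illustrative assumptions on the state space $\{0, 1\}$.

```python
from itertools import product

mu = [0.3, 0.7]          # pmf of X_0 (assumed for illustration)
P = [[0.9, 0.1],
     [0.5, 0.5]]         # P[i][j] = P(X_{n+1} = j | X_n = i)

def joint_prob(path):
    """P(X_0 = x_0, ..., X_n = x_n) via mu(x_0) * prod_k P(x_{k-1}, x_k)."""
    p = mu[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a][b]
    return p

# Sanity check: joint probabilities over all length-3 paths should sum to 1.
total = sum(joint_prob(w) for w in product([0, 1], repeat=3))
print(total)
```

Summing over all paths recovers 1 because $\mu$ and each row of $P$ are probability distributions.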
Definition ($M$'th Order Markov Chain)
The source $\{X_n\}$ is called an $M$'th order Markov chain (or Markov source of memory $M$), where $M$ is a fixed positive integer, if
$$P(X_{n+1} = x_{n+1} \mid X_n = x_n, \dots, X_0 = x_0) = P(X_{n+1} = x_{n+1} \mid X_n = x_n, \dots, X_{n-M+1} = x_{n-M+1})$$
for all $n \ge M - 1$ and all states.
Remark
If $\{X_n\}$ is an $M$'th order Markov chain with alphabet $\mathcal{X}$, then it can be converted into a (first-order) Markov chain $\{Y_n\}$ with alphabet $\mathcal{X}^M$, where $Y_n = (X_n, X_{n+1}, \dots, X_{n+M-1})$.
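The conversion in the remark can be sketched concretely for $M = 2$: block states are pairs, and a pair $(x, y)$ can only move to pairs of the form $(y, z)$. The second-order transition probabilities `q` below are illustrative assumptions.

```python
from itertools import product

M = 2
alphabet = ["a", "b"]

# q[(x_{n-1}, x_n)][x_{n+1}]: assumed second-order transition probabilities.
q = {
    ("a", "a"): {"a": 0.7, "b": 0.3},
    ("a", "b"): {"a": 0.4, "b": 0.6},
    ("b", "a"): {"a": 0.2, "b": 0.8},
    ("b", "b"): {"a": 0.5, "b": 0.5},
}

# First-order transition matrix on pair states: (x, y) -> (y, z) with
# probability q[(x, y)][z]; non-overlapping transitions get probability 0.
pairs = list(product(alphabet, repeat=M))
P = {s: {t: 0.0 for t in pairs} for s in pairs}
for (x, y) in pairs:
    for z, p in q[(x, y)].items():
        P[(x, y)][(y, z)] = p

for s in pairs:
    print(s, P[s])
```

Each row of the lifted matrix still sums to 1, so $\{Y_n\}$ is a genuine first-order chain on $\mathcal{X}^2$.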
Policy
Metropolis Hastings Algorithm
Data Processing Inequality
State
Stationary & Ergodic Source
Stationary Distribution
Stationary
Accessible
Chapman Kolmogorov Equation
Communication
Continuous Time Markov Chain
Indistinguishable
Invariant Measure
Irreducible
Joint Markov Chain
Markov Property
Markov chain
(n-μ)-small
Convergence to Equilibrium
Dobrushin's Ergodic Coefficient
Existence of Invariant measure
Foster-Lyapunov Theorems
Harris Recurrent
Invariant probability measure
Positive Harris Recurrent
Positive Recurrent
Recurrent
Uniquely Ergodic
Stationary
Transition Kernel
Hitting Time
Occupation Time
Passage Time
Stopping Time