Definition (μ-irreducible)
A Markov chain with transition kernel P is μ-irreducible if for any set A with μ(A) > 0 and any initial state x, there exists some n ≥ 1 s.t. P^n(x, A) > 0, i.e. from any initial state there is some finite path that reaches any set of positive μ-measure with positive probability.
Remark
When μ is the Lebesgue measure, the chain is said to be Lebesgue-irreducible.
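For instance (an example added here, not from the original notes): the Gaussian random walk X_{n+1} = X_n + ε_n with ε_n ~ N(0, 1) is Lebesgue-irreducible, since its one-step kernel has a strictly positive density:

```latex
P(x, A) \;=\; \int_A \frac{1}{\sqrt{2\pi}}\, e^{-(y-x)^2/2}\, dy \;>\; 0
\qquad \text{whenever } \mathrm{Leb}(A) > 0,
```

so every set of positive Lebesgue measure is reached in a single step from any starting point.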
Definition (Irreducible (474))
A Markov chain is called irreducible if one can go from any state value in S to any other state value in S in a finite number of transitions with positive probability, i.e., for all x, y in S there exists n ≥ 1 with P^n(x, y) > 0; informally, “you can reach any other state from every state (no closed loops)”.
Definition (Irreducible (455))
We say a transition matrix P is irreducible if for any states i, j there exists n ≥ 1 s.t. P^n(i, j) > 0. That is, the state space is a single communicating class. We will also say that the corresponding Markov chain is irreducible.
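As a concrete check of this definition, here is a small sketch (function name and structure are my own, not from the notes) that tests irreducibility of a finite transition matrix by verifying that the whole state space forms one communicating class:

```python
def is_irreducible(P, tol=1e-12):
    """Check whether a finite transition matrix P (list of rows) is
    irreducible, i.e. the state space is a single communicating class.

    P is irreducible iff every state is reachable from every other,
    which we test with a depth-first search on the directed graph
    whose edges are the positive entries of P.
    """
    k = len(P)

    def reachable(start):
        # states reachable from `start` along positive-probability edges
        seen = {start}
        frontier = [start]
        while frontier:
            i = frontier.pop()
            for j in range(k):
                if P[i][j] > tol and j not in seen:
                    seen.add(j)
                    frontier.append(j)
        return seen

    # irreducible iff from each state we can reach all k states
    return all(len(reachable(i)) == k for i in range(k))
```

For example, the two-state flip chain [[0, 1], [1, 0]] is irreducible, while [[1, 0], [0.5, 0.5]] is not, since the absorbing state 0 cannot reach state 1.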
Definition (Maximal Irreducible Measure)
A measure ψ is a maximal irreducibility measure if the chain is ψ-irreducible and, for any measure μ s.t. the chain is μ-irreducible, μ ≪ ψ (i.e. μ is absolutely continuous w.r.t. ψ), or equivalently ψ(A) = 0 implies μ(A) = 0.
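For reference, one standard construction of such a maximal measure (a sketch following Meyn and Tweedie, not from the original notes): starting from any irreducibility measure μ, define

```latex
\psi(A) \;=\; \int \mu(dy)\, \sum_{n=0}^{\infty} 2^{-(n+1)}\, P^{n}(y, A).
```

This ψ is a maximal irreducibility measure, and any two maximal irreducibility measures are mutually absolutely continuous, so ψ is unique up to equivalence of measures.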
Maximum Likelihood Estimator
Metropolis-Hastings Algorithm
Stationary & Ergodic Source
Stationary Distribution
Aperiodic
Ergodic Theorem
Invariant Distribution for CTMC
Invariant Measure
Irreducible
Joint Markov Chain
(n-μ)-small
Convergence to Equilibrium
Existence of Invariant measure
Foster-Lyapunov Theorems
Harris Recurrent
Invariant Distribution ↔ Positive Recurrence
Invariant probability measure
Kac's Lemma
Positive Harris Recurrent