Definition (Transition kernel)
Given a stationary Markov chain on a state space $S$, the transition kernel $P = (p_{ij})_{i,j \in S}$ is the collection of probabilities of all possible state transitions, where $p_{ij}$ is the probability of transitioning into state $j$ when in state $i$. $P$ is a stochastic matrix in the sense that $p_{ij} \ge 0$ and $\sum_{j \in S} p_{ij} = 1$ for all $i \in S$.
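A minimal sketch of the definition, using an assumed two-state example chain (the states and probabilities below are illustrative, not from the source): the matrix `P` stores $p_{ij}$ in row $i$, column $j$, and the stochastic-matrix condition is checked by verifying non-negative entries and unit row sums.

```python
import numpy as np

# Illustrative two-state chain (assumed example): state 0 and state 1.
# P[i, j] is the probability of transitioning into state j when in state i.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Stochastic-matrix conditions: entries non-negative, each row sums to 1.
assert np.all(P >= 0)
assert np.allclose(P.sum(axis=1), 1.0)

# Simulate a few steps of the chain starting from state 0:
# each step draws the next state from the row of P for the current state.
rng = np.random.default_rng(0)
state = 0
path = [state]
for _ in range(5):
    state = rng.choice(2, p=P[state])
    path.append(state)
print(path)
```

Each row of `P` is itself a probability distribution over the next state, which is exactly what the row-sum condition expresses.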
Controlled Markov Chain
Markov Decision Process
Partially Observable Markov Decision Process
Maximum Likelihood Estimator
Metropolis-Hastings Algorithm
Discrete Communication Channel
Discrete Memoryless Channel
Stationary Distribution
Accessible
Communication
Detailed Balance
Invariant Distribution
Invariant Measure
Irreducible
Joint Markov Chain
Markov chain
Convergence to Equilibrium
Dobrushin's Ergodic Coefficient
Hitting Time