Definition (Harris recurrent)
A set $A$ is said to be Harris recurrent if it is visited infinitely often almost surely, i.e.
$$\Pr_x(\eta_A = \infty) = 1 \quad \text{for all } x \in A,$$
where $\eta_A = \sum_{n=1}^{\infty} \mathbf{1}\{X_n \in A\}$ counts the number of visits to $A$.
Definition (Harris recurrent chain)
A Markov chain is called Harris recurrent if it is $\mu$-irreducible and every set $A$ with $\mu(A) > 0$ is Harris recurrent.
Theorem (Equivalent Definition of Harris Recurrence)
A $\mu$-irreducible Markov chain is Harris recurrent if and only if
$$\Pr_x(\tau_A < \infty) = 1 \quad \text{for all } x \text{ and all } A \text{ with } \mu(A) > 0,$$
where $\tau_A = \inf\{n \geq 1 : X_n \in A\}$ is the first return time to $A$; i.e. our Markov chain is Harris recurrent if, starting from any state $x$, with probability 1 it will return to every set $A$ of positive $\mu$-measure.
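As a concrete illustration (not from the text above), consider a reflected random walk on $\{0, 1, 2, \dots\}$ with downward drift: it is a classic example of a Harris recurrent chain, and we can check empirically that, from an arbitrary starting state, the chain returns to the set $A = \{0\}$ with probability 1. The chain, drift parameters, and helper names below are illustrative choices, not anything defined in the source.

```python
import random

def reflected_walk_step(x):
    # One step of a reflected random walk on {0, 1, 2, ...}:
    # up with probability 0.4, down with probability 0.6 (reflected at 0).
    # The downward drift makes this chain Harris recurrent.
    if x == 0:
        return 1 if random.random() < 0.4 else 0
    return x + 1 if random.random() < 0.4 else x - 1

def returns_to_zero(start, max_steps=100_000):
    # Simulate until the chain first enters A = {0}, or give up.
    # This approximates the event {tau_A < infinity} starting from `start`.
    x = start
    for _ in range(max_steps):
        x = reflected_walk_step(x)
        if x == 0:
            return True
    return False

random.seed(0)
trials = 1000
hits = sum(returns_to_zero(start=5) for _ in range(trials))
print(f"fraction of paths that reached 0: {hits / trials:.3f}")
# With downward drift, essentially every path returns to 0,
# consistent with Pr_x(tau_A < infinity) = 1.
```

In a simulation we can only observe returns within a finite horizon, so this is evidence for (not a proof of) $\Pr_x(\tau_A < \infty) = 1$; the formal statement is exactly the theorem above.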