State

Definition (State)

For a Markov chain $\{X_i\}_{i=1}^{\infty}$, $X_i$ is called the "state" of the Markov chain at time $i$.
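As an illustrative sketch of this definition, the snippet below simulates a small finite Markov chain; the value recorded at index $i$ of the trajectory is the state $X_i$ at time $i$. The two-state weather chain, its transition probabilities, and the function name are hypothetical examples, not taken from the source.

```python
import random

def simulate_markov_chain(transition, start, steps, seed=0):
    """Simulate a finite Markov chain; entry i of the returned list is the state X_i at time i."""
    rng = random.Random(seed)
    state = start
    trajectory = [state]
    for _ in range(steps):
        # The next state depends only on the current state (Markov property):
        # sample from the current state's row of the transition matrix.
        candidates = list(transition[state])
        weights = [transition[state][s] for s in candidates]
        state = rng.choices(candidates, weights=weights)[0]
        trajectory.append(state)
    return trajectory

# Hypothetical two-state chain over {"sunny", "rainy"}.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}
xs = simulate_markov_chain(P, "sunny", 10)
# xs[i] is the state X_i of the chain at time i; xs[0] is the initial state.
```

Note that the trajectory has `steps + 1` entries, since the initial state $X_0$ is included.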
