Convergence Theorem Definition

For a Markov chain on state space $S$, if the chain is simultaneously

  • irreducible

  • aperiodic

  • positive recurrent

then the $n$-step transition probabilities converge to the unique stationary distribution $\pi$, and the limit is independent of the initial state $i$:

$$ \lim_{n \to \infty} p_{ij}^{(n)} = \pi_j, \quad \forall i,j \in S. $$

A Markov chain satisfying all three conditions is called ergodic.


Each condition has a distinct role:

  • Positive recurrence guarantees the existence of a normalizable stationary distribution $\pi$ with finite total mass. It rules out the null recurrent case, where transition probabilities may converge to $0$.

  • Irreducibility guarantees that the stationary distribution $\pi$ is globally unique, so the limit $\pi_j$ does not depend on the starting state $i$.

  • Aperiodicity guarantees that the sequence $\{p_{ij}^{(n)}\}$ has an actual limit, rather than oscillating forever across periodic subclasses.
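The role of aperiodicity is easy to see numerically. The following sketch (using NumPy; the matrix `P_periodic` is an illustrative example, not from the text above) shows an irreducible two-state chain with period 2, whose $n$-step transition probabilities oscillate instead of converging:

```python
import numpy as np

# A 2-state chain that is irreducible but periodic (period 2):
# it deterministically alternates between the two states.
P_periodic = np.array([[0.0, 1.0],
                       [1.0, 0.0]])

# p_{00}^{(n)} alternates between 1 (even n) and 0 (odd n), so the
# limit does not exist, even though pi = [1/2, 1/2] is stationary.
for n in (1, 2, 3, 4):
    print(n, np.linalg.matrix_power(P_periodic, n)[0, 0])
```

Here $\pi = [1/2, 1/2]$ satisfies $\pi P = \pi$, yet $\{p_{00}^{(n)}\}$ never settles on $\pi_0$: aperiodicity is exactly what rules this behavior out.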


Example

Let

$$ P= \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix}. $$

Since all entries of $P$ are strictly positive, the chain is irreducible and aperiodic; and because the state space is finite, irreducibility also implies positive recurrence. Hence the convergence theorem applies.

Solving

$$ \pi P = \pi $$

together with

$$ \sum_i \pi_i = 1 $$

gives

$$ \pi = \left[\frac{4}{7}, \frac{3}{7}\right]. $$
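This hand computation can be checked numerically: $\pi$ is the left eigenvector of $P$ for eigenvalue $1$, i.e. a right eigenvector of $P^\top$, normalized to sum to $1$. A sketch using NumPy (the variable names are illustrative):

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Stationary distribution as the left eigenvector of P for
# eigenvalue 1, i.e. a right eigenvector of P^T.
vals, vecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(vals - 1.0))   # pick the eigenvalue closest to 1
pi = np.real(vecs[:, k])
pi = pi / pi.sum()                  # normalize so the entries sum to 1
print(pi)  # approximately [4/7, 3/7]
```

Dividing by the sum both normalizes the eigenvector and fixes its sign, since `eig` returns eigenvectors only up to a scalar.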

Therefore,

$$ \lim_{n\to\infty} P^n = \begin{bmatrix} 4/7 & 3/7 \\ 4/7 & 3/7 \end{bmatrix}. $$

Each row of the limiting matrix equals the stationary distribution $\pi$, which algebraically shows independence from the initial state.
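The limiting matrix can also be observed directly by raising $P$ to a large power (a quick numerical check with NumPy; the exponent 50 is an arbitrary choice, large enough since the second eigenvalue of $P$ is $0.3$):

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Raising P to a large power: every row approaches pi = [4/7, 3/7],
# regardless of the starting state (the row index).
Pn = np.linalg.matrix_power(P, 50)
print(Pn)
```

Because the subdominant eigenvalue is $0.3$, the rows agree with $\pi$ to within about $0.3^{50}$, far below floating-point precision.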