For an irreducible Markov chain, the following statements are equivalent:

  1. The chain is positive recurrent: the expected return time is finite for some (equivalently, by irreducibility, every) state $i$,

    $$ \mathbb{E}_i[T_i^+] < \infty, $$

    where $T_i^+ = \min\{n \ge 1 : X_n = i\}$ denotes the first return time to $i$.
  2. There exists a unique stationary distribution $\pi$.

  3. For every state $i \in S$, the stationary distribution and the expected return time satisfy

    $$ \pi_i = \frac{1}{\mathbb{E}_i[T_i^+]}. $$

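The equivalence of statements 2 and 3 can be checked numerically. The sketch below (with an arbitrarily chosen two-state transition matrix, used only for illustration) computes $\pi$ as the normalized left eigenvector of $P$ for eigenvalue $1$, computes $\mathbb{E}_i[T_i^+]$ exactly by solving the standard hitting-time linear system, and verifies that $\pi_i \, \mathbb{E}_i[T_i^+] = 1$ for each state:

```python
import numpy as np

# A small irreducible two-state chain (transition matrix chosen for illustration).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized to sum to 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
pi = pi / pi.sum()

def expected_return_time(P, i):
    """Exact E_i[T_i^+]: solve h_j = 1 + sum_{k != i} P[j,k] h_k for j != i,
    then E_i[T_i^+] = 1 + sum_{k != i} P[i,k] h_k."""
    n = P.shape[0]
    others = [j for j in range(n) if j != i]
    Q = P[np.ix_(others, others)]          # transitions among states other than i
    h = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))
    return 1.0 + P[i, others] @ h

for i in range(P.shape[0]):
    # each product pi_i * E_i[T_i^+] equals 1, matching statement 3
    assert np.isclose(pi[i] * expected_return_time(P, i), 1.0)
```

For this matrix, $\pi = (2/3, 1/3)$, so the expected return times are $3/2$ and $3$ respectively.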
The assumption of irreducibility is essential.

If the chain is reducible, it may contain several disjoint recurrent classes. In that case, every stationary distribution is a convex combination of the stationary distributions supported on those classes, so the stationary distribution is generally not unique.

The quantity $\pi_i$ then depends on the weights of that combination, so the relation

$$ \mathbb{E}_i[T_i^+] = \frac{1}{\pi_i} $$

does not determine a unique global stationary distribution.
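This failure of uniqueness is easy to exhibit. The sketch below (with a hypothetical block-diagonal transition matrix) builds a reducible chain with two disjoint recurrent classes and checks that every convex combination of the per-class stationary distributions is itself stationary:

```python
import numpy as np

# A reducible chain: two disjoint recurrent classes {0, 1} and {2, 3}
# (block-diagonal matrix chosen for illustration).
P = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.5, 0.5, 0.0, 0.0],
              [0.0, 0.0, 0.5, 0.5],
              [0.0, 0.0, 0.5, 0.5]])

pi_A = np.array([0.5, 0.5, 0.0, 0.0])   # stationary on the first class
pi_B = np.array([0.0, 0.0, 0.5, 0.5])   # stationary on the second class

# Every convex combination a*pi_A + (1-a)*pi_B satisfies pi P = pi,
# so the chain has a whole family of stationary distributions.
for a in (0.0, 0.3, 0.7, 1.0):
    pi = a * pi_A + (1 - a) * pi_B
    assert np.allclose(pi @ P, pi)
```

Note that $\pi_0 = a/2$ varies with the weight $a$, so $1/\pi_0$ cannot equal a single well-defined return time across all stationary distributions; $\mathbb{E}_0[T_0^+] = 2$ is recovered only from the distribution supported on state 0's own class.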

If the chain is irreducible, the whole state space forms a single communicating class. This guarantees uniqueness: if a normalized measure $\pi$ satisfies

$$ \pi P = \pi, $$

then it is the only stationary distribution of the chain.
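One way to see the contrast numerically is through the solution space of $\pi P = \pi$: for an irreducible chain it is one-dimensional, so normalization fixes $\pi$, while a reducible chain can have a higher-dimensional space of solutions. A minimal sketch, with two small matrices chosen for illustration:

```python
import numpy as np

# Irreducible chain: states 0 and 1 communicate.
P_irred = np.array([[0.0, 1.0],
                    [0.5, 0.5]])

# Reducible chain: two absorbing states, hence two recurrent classes.
P_red = np.array([[1.0, 0.0],
                  [0.0, 1.0]])

def stationary_dim(P):
    """Dimension of {pi : pi (P - I) = 0}, i.e. the nullity of (P - I)^T."""
    M = (P - np.eye(P.shape[0])).T
    return P.shape[0] - np.linalg.matrix_rank(M)

assert stationary_dim(P_irred) == 1   # unique stationary distribution
assert stationary_dim(P_red) == 2     # a whole family of stationary distributions
```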