Definition

For a time-homogeneous discrete-time Markov chain on a countable state space $S$, the transition matrix is

$$ P=(p_{ij})_{i,j\in S} $$

where

$$ p_{ij}=P(X_{n+1}=j\mid X_n=i) $$

for all $i,j\in S$ and all $n\ge 0$; time-homogeneity means this probability does not depend on $n$.
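As a concrete illustration, here is a minimal sketch of a transition matrix for a hypothetical two-state chain (the states and probabilities are invented for the example, not taken from the text above):

```python
import numpy as np

# Hypothetical two-state chain with states 0 and 1; the numbers are illustrative.
P = np.array([
    [0.9, 0.1],  # row 0: p_00, p_01 — transitions out of state 0
    [0.5, 0.5],  # row 1: p_10, p_11 — transitions out of state 1
])

# Entry P[i, j] encodes p_ij = P(X_{n+1} = j | X_n = i).
print(P[0, 1])  # probability of moving from state 0 to state 1 → 0.1
```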

Properties

Each entry satisfies

$$ p_{ij}\ge 0 $$

and each row sums to $1$:

$$ \sum_{j\in S} p_{ij}=1 \quad \text{for every } i\in S. $$

A matrix satisfying these two conditions is called a (row-)stochastic matrix.
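These two properties are easy to check numerically. The following is a sketch of such a check for a finite state space; the function name and tolerance are choices made here, not part of the definition:

```python
import numpy as np

def is_stochastic(P, tol=1e-9):
    """Check the two defining properties of a transition matrix:
    nonnegative entries and rows summing to 1."""
    P = np.asarray(P, dtype=float)
    nonneg = np.all(P >= 0)                                      # p_ij >= 0
    rows_sum_to_one = np.allclose(P.sum(axis=1), 1.0, atol=tol)  # sum_j p_ij = 1
    return bool(nonneg and rows_sum_to_one)

print(is_stochastic([[0.9, 0.1], [0.5, 0.5]]))  # True
print(is_stochastic([[0.9, 0.2], [0.5, 0.5]]))  # False: first row sums to 1.1
```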

Interpretation

Row $i$ is the conditional distribution of the next state given that the chain is currently in state $i$: it lists the one-step transition probabilities from $i$ to every possible next state.
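Because row $i$ is a probability distribution, simulating one step of the chain amounts to sampling from that row. A minimal sketch, reusing the hypothetical two-state matrix from above:

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1], [0.5, 0.5]])  # hypothetical two-state chain

def step(i, P, rng):
    """Draw the next state by sampling from row i of the transition matrix."""
    return rng.choice(len(P), p=P[i])

# Simulate a short trajectory starting from state 0.
state = 0
path = [state]
for _ in range(5):
    state = step(state, P, rng)
    path.append(state)
print(path)
```

Each call to `step` uses only the current state's row, which is exactly the Markov property in action.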