Definition

A stochastic process $(X_n)_{n\ge 0}$ has the Markov property if

$$ \mathbb P(X_{n+1}=j \mid X_0,X_1,\dots,X_n)=\mathbb P(X_{n+1}=j \mid X_n) $$

for all $n \ge 0$ and all states $j$.
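The property can be made concrete with a tiny simulation. This is an illustrative sketch, not from the text: the two-state matrix `P` and the function `step` are hypothetical, chosen only to show that sampling $X_{n+1}$ uses the current state's row of $P$ and nothing else from the history.

```python
import random

# Hypothetical 2-state chain on S = {0, 1}; this matrix P is an
# illustrative example, not taken from the text.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """Sample X_{n+1} given X_n = state. The draw depends only on
    the current state's row of P, never on earlier states."""
    return rng.choices([0, 1], weights=P[state])[0]

rng = random.Random(0)
path = [0]                      # X_0 = 0
for _ in range(10):
    path.append(step(path[-1], rng))
print(path)                     # one sample trajectory X_0, ..., X_10
```

Because `step` receives only the current state, the simulation satisfies the Markov property by construction.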

Key identity

For a time-homogeneous discrete-time Markov chain (transition probabilities do not depend on $n$), the transition matrix $P=(p_{ij})$ is defined by

$$ p_{ij}=\mathbb P(X_{n+1}=j \mid X_n=i). $$

Each row of $P$ is a probability distribution on the state space $S$, so

$$ p_{ij}\in[0,1], \qquad \sum_{j\in S} p_{ij}=1. $$
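These two row conditions are easy to verify numerically. A minimal sketch, assuming matrices are given as lists of rows; the function name `is_stochastic` and the tolerance are my own choices:

```python
def is_stochastic(P, tol=1e-12):
    """Check that each row of P is a probability distribution:
    every entry lies in [0, 1] and the row sums to 1 (within tol)."""
    return all(
        all(0.0 <= p <= 1.0 for p in row)
        and abs(sum(row) - 1.0) <= tol
        for row in P
    )

print(is_stochastic([[0.9, 0.1], [0.5, 0.5]]))   # True: valid rows
print(is_stochastic([[0.9, 0.2], [0.5, 0.5]]))   # False: first row sums to 1.1
```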

Consequence

Given the present state, the future is conditionally independent of the past. One-step behavior is completely encoded by the transition matrix $P$.
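A standard consequence, not stated above, is that multi-step behavior is also encoded by $P$: the $n$-step transition probabilities are the entries of the matrix power $P^n$. A minimal sketch using plain lists of rows (the helper names and the example matrix are hypothetical):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    """n-step transition matrix P^n, starting from the identity."""
    size = len(P)
    result = [[float(i == j) for j in range(size)] for i in range(size)]
    for _ in range(n):
        result = mat_mul(result, P)
    return result

# Same illustrative 2-state matrix as before.
P = [[0.9, 0.1],
     [0.5, 0.5]]
P2 = mat_pow(P, 2)
# Two-step probability of staying in state 0:
# p_00^(2) = 0.9*0.9 + 0.1*0.5 = 0.86
print(round(P2[0][0], 6))   # 0.86
```

Each row of $P^n$ is again a probability distribution, so the check from the previous example applies to it as well.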