Definition

Let $(X_n)_{n\ge 0}$ be a real-valued stochastic process with respect to a filtration $(\mathcal F_n)_{n\ge 0}$. Then $(X_n)$ is a martingale if:

  1. $X_n$ is $\mathcal F_n$-measurable for every $n$.
  2. $\mathbb E[|X_n|]<\infty$ for every $n$.
  3. For every $n$,
$$ \mathbb E[X_{n+1}\mid \mathcal F_n]=X_n. $$

If $\mathcal F_n=\sigma(X_0,\dots,X_n)$ is the natural filtration, then the measurability condition is automatic.
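The defining identity $\mathbb E[X_{n+1}\mid \mathcal F_n]=X_n$ can be checked by simulation. The sketch below (names like `walk` and the checkpoint $n=10$ are illustrative choices, not from the text) uses a simple symmetric random walk, the canonical martingale, and conditions on the value of $X_{10}$ alone, which suffices here because this particular walk is Markov:

```python
import random

random.seed(0)

def walk(n):
    """Simple symmetric random walk: each step is +1 or -1 with
    probability 1/2, so E[X_{n+1} | F_n] = X_n + E[step] = X_n."""
    x, path = 0, [0]
    for _ in range(n):
        x += random.choice((-1, 1))
        path.append(x)
    return path

# Monte Carlo check of the martingale property at n = 10: group paths
# by the value of X_10 and average X_11 within each group.
paths = [walk(11) for _ in range(200_000)]
results = {}
for level in (-2, 0, 2):
    nxt = [p[11] for p in paths if p[10] == level]
    results[level] = sum(nxt) / len(nxt)
    print(level, results[level])  # each average should be near `level`
```

Up to Monte Carlo noise, the conditional average of $X_{11}$ given $X_{10}=\ell$ matches $\ell$, as the definition requires.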

Interpretation

A martingale is a fair game: given all information up to time $n$, the conditional mean of the next value is exactly the present value.

Lemma

If $(X_n)$ is a martingale and $0\le m\le n$, then
$$ \mathbb E[X_n\mid \mathcal F_m]=X_m. $$

This follows by repeated use of the tower property:

$$ \mathbb E[X_n\mid \mathcal F_m]
=\mathbb E[\mathbb E[X_n\mid \mathcal F_{n-1}]\mid \mathcal F_m]
=\mathbb E[X_{n-1}\mid \mathcal F_m]
=\cdots
=X_m. $$
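The lemma can also be checked empirically. In the sketch below (a minimal simulation, with $m=3$ and $n=8$ chosen for illustration), paths of a symmetric random walk are grouped by the value of $X_3$, and the average of $X_8$ within each group should match that value:

```python
import random

random.seed(1)

def walk(n):
    """Simple symmetric random walk with +1/-1 steps (a martingale)."""
    x, path = 0, [0]
    for _ in range(n):
        x += random.choice((-1, 1))
        path.append(x)
    return path

# Check E[X_n | F_m] = X_m for m = 3, n = 8: condition on X_3 and
# average X_8.  (Conditioning on X_3 alone is enough here because
# this walk is Markov.)
paths = [walk(8) for _ in range(200_000)]
averages = {}
for level in (-1, 1, 3):
    ends = [p[8] for p in paths if p[3] == level]
    averages[level] = sum(ends) / len(ends)
    print(level, averages[level])  # should be near `level`
```

The five mean-zero steps between times 3 and 8 wash out on average, exactly as the tower-property argument predicts.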

Martingale vs Markov chain

The martingale property is not the same as the Markov property.

  • The martingale property constrains only the conditional mean:
$$ \mathbb E[X_{n+1}\mid \mathcal F_n]=X_n. $$
  • The Markov property constrains the full conditional distribution, but only through the current state:
$$ \mathbb P(X_{n+1}\in A\mid \mathcal F_n)=\mathbb P(X_{n+1}\in A\mid X_n). $$

A process can satisfy either property without the other. For example, a random walk with nonzero drift is Markov but not a martingale, while a mean-zero walk whose step sizes depend on more of the past than the current value is a martingale but not Markov.
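The first direction is easy to see numerically. The sketch below (the name `drift_walk` and the step distribution are illustrative assumptions) builds a Markov chain whose conditional mean drifts upward by 1 each step, so it violates $\mathbb E[X_{n+1}\mid \mathcal F_n]=X_n$:

```python
import random

random.seed(2)

def drift_walk(n):
    """Random walk with drift: steps are 0 or 2 with probability 1/2
    each, so E[X_{n+1} | F_n] = X_n + 1.  The process is Markov (the
    next state depends only on the current one) but not a martingale."""
    x, path = 0, [0]
    for _ in range(n):
        x += random.choice((0, 2))
        path.append(x)
    return path

# Condition on X_5 = 4 and measure the average one-step change.
paths = [drift_walk(6) for _ in range(100_000)]
cond = [p[6] for p in paths if p[5] == 4]
gap = sum(cond) / len(cond) - 4
print(gap)  # near 1, the per-step drift, rather than 0
```

A conditional mean increment near 1 instead of 0 confirms the martingale condition fails, even though the process is perfectly Markov.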