Best Linear Predictor — $P_n X_{n+h}$

Setup

$\{X_t\}$ is a stationary time series with mean $\mu$ and ACVF $\gamma(h)$. We want the best linear predictor of $X_{n+h}$ in terms of $X_n, X_{n-1}, \ldots, X_1$: the linear combination of the observations (plus a constant) that minimizes the mean squared prediction error.

Formula

$$P_n X_{n+h} = \text{Pred}(X_{n+h} | X_n, \ldots, X_1) := \mu + (a_1, \ldots, a_n) \begin{pmatrix} X_n - \mu \\ \vdots \\ X_1 - \mu \end{pmatrix}$$

where $\mathbf{a}_n = (a_1, \ldots, a_n)'$ solves:

$$\Gamma_n \mathbf{a}_n = \boldsymbol{\gamma}_n(h) := (\gamma(h), \gamma(h+1), \ldots, \gamma(h+n-1))'$$

Notation

  • $\Gamma_n$ — the $n \times n$ autocovariance matrix of $(X_n, \ldots, X_1)$, with $(\Gamma_n)_{ij} = \gamma(|i-j|)$ (symmetric and Toeplitz)
  • $\boldsymbol{\gamma}_n(h)$ — the vector of covariances between $X_{n+h}$ and $(X_n, X_{n-1}, \ldots, X_1)$
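The system $\Gamma_n \mathbf{a}_n = \boldsymbol{\gamma}_n(h)$ can be solved numerically once the ACVF is known. A minimal sketch, assuming an AR(1) process $X_t = \phi X_{t-1} + Z_t$ with innovation variance $\sigma^2$, whose ACVF is $\gamma(h) = \sigma^2 \phi^{|h|}/(1-\phi^2)$ (the values of $\phi$, $\sigma^2$, $n$, $h$ are illustrative):

```python
import numpy as np

# Illustrative AR(1) parameters (not from the notes).
phi, sigma2, n, h = 0.6, 1.0, 5, 1

def gamma(lag):
    # ACVF of a stationary AR(1): gamma(h) = sigma2 * phi^|h| / (1 - phi^2)
    return sigma2 * phi**abs(lag) / (1 - phi**2)

# Gamma_n: (i, j) entry is gamma(|i - j|) -- symmetric Toeplitz matrix.
Gamma_n = np.array([[gamma(i - j) for j in range(n)] for i in range(n)])
# gamma_n(h) = (gamma(h), gamma(h+1), ..., gamma(h+n-1))'
gamma_nh = np.array([gamma(h + k) for k in range(n)])

# Solve Gamma_n a_n = gamma_n(h) for the prediction coefficients.
a_n = np.linalg.solve(Gamma_n, gamma_nh)
print(a_n)  # for AR(1) and h = 1: approximately (phi, 0, 0, 0, 0)
```

For an AR(1) the one-step predictor is $\phi X_n$, so the solve recovers $\mathbf{a}_n \approx (\phi, 0, \ldots, 0)'$, a useful sanity check. Because $\Gamma_n$ is Toeplitz, `scipy.linalg.solve_toeplitz` would also work and scales better for large $n$.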

MSE

$$\text{MSE} = E[(X_{n+h} - P_n X_{n+h})^2] = \gamma(0) - \mathbf{a}_n' \boldsymbol{\gamma}_n(h)$$
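The MSE formula can be checked on the same assumed AR(1) example: for one-step prediction, $\gamma(0) - \mathbf{a}_n' \boldsymbol{\gamma}_n(1)$ should collapse to the innovation variance $\sigma^2$.

```python
import numpy as np

# Illustrative AR(1) setup (same assumptions as before).
phi, sigma2, n, h = 0.6, 1.0, 5, 1
gamma = lambda lag: sigma2 * phi**abs(lag) / (1 - phi**2)
Gamma_n = np.array([[gamma(i - j) for j in range(n)] for i in range(n)])
gamma_nh = np.array([gamma(h + k) for k in range(n)])
a_n = np.linalg.solve(Gamma_n, gamma_nh)

# MSE = gamma(0) - a_n' gamma_n(h)
mse = gamma(0) - a_n @ gamma_nh
print(mse)  # approximately sigma2 = 1.0 for one-step AR(1) prediction
```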

Expanded Form

$$P_n X_{n+h} = \underbrace{\mu(1 - a_1 - \cdots - a_n)}_{a_0} + a_1 X_n + a_2 X_{n-1} + \cdots + a_n X_1$$

The intercept $a_0$ is determined by requiring the predictor to be unbiased: $E[P_n X_{n+h}] = \mu$ forces $a_0 = \mu(1 - a_1 - \cdots - a_n)$.
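The centered form $\mu + \mathbf{a}_n'(\mathbf{X} - \mu\mathbf{1})$ and the expanded form $a_0 + a_1 X_n + \cdots + a_n X_1$ are algebraically identical, which can be checked numerically. A sketch, reusing the assumed AR(1) coefficients and a nonzero mean $\mu$ (all values illustrative):

```python
import numpy as np

# Illustrative AR(1) setup with a nonzero mean mu.
phi, sigma2, mu, n, h = 0.6, 1.0, 10.0, 5, 1
gamma = lambda lag: sigma2 * phi**abs(lag) / (1 - phi**2)
Gamma_n = np.array([[gamma(i - j) for j in range(n)] for i in range(n)])
gamma_nh = np.array([gamma(h + k) for k in range(n)])
a_n = np.linalg.solve(Gamma_n, gamma_nh)

rng = np.random.default_rng(0)
x = mu + rng.normal(size=n)   # observations X_1, ..., X_n (illustrative draw)
x_rev = x[::-1]               # ordered as (X_n, X_{n-1}, ..., X_1)

# Centered form: mu + a_n' (X - mu)
pred_centered = mu + a_n @ (x_rev - mu)
# Expanded form: a_0 + a_1 X_n + ... + a_n X_1, with a_0 = mu(1 - sum a_i)
a0 = mu * (1 - a_n.sum())
pred_expanded = a0 + a_n @ x_rev
print(np.isclose(pred_centered, pred_expanded))  # the two forms agree
```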