Exam Pattern: Best Linear Prediction
Compute the Best Linear Predictor $P(X_{n+h} \mid X_1, \dots, X_n)$ and its MSE for a given process.
Setup
Given a stationary process $\{X_t\}$ with known ACVF $\gamma_X(h)$, and a set of observed variables, find:
- $P_n X_{n+h} = a_1 X_{i_1} + a_2 X_{i_2} + \cdots$
- $\text{MSE} = \mathbb{E}[(X_{n+h} - P_n X_{n+h})^2]$
Method
Step 1: Write the predictor
$$\hat{X} = a_1 X_{i_1} + a_2 X_{i_2} + \cdots + a_k X_{i_k}$$

Step 2: Apply orthogonality conditions
$$\mathbb{E}[(X_{n+h} - \hat{X}) \cdot X_{i_j}] = 0 \quad \text{for each } j = 1, \dots, k$$

This gives $k$ equations in $k$ unknowns.
Step 3: Expand using ACVF
Each equation becomes:
$$\gamma_X(n+h - i_j) = \sum_{m=1}^k a_m \gamma_X(i_m - i_j)$$

Step 4: Solve the linear system
$$\Gamma \mathbf{a} = \boldsymbol{\gamma}$$

Step 5: Compute MSE
$$\text{MSE} = \gamma_X(0) - \sum_{m=1}^k a_m \gamma_X(n+h - i_m)$$

Worked Example: MA(1) Prediction
$X_t = Z_t + \theta Z_{t-1}$, $\{Z_t\} \sim \text{WN}(0, \sigma^2)$.
Find $P(X_3 \mid X_4, X_5)$.
ACVF: $\gamma(0) = \sigma^2(1+\theta^2)$, $\gamma(1) = \sigma^2\theta$, $\gamma(h) = 0$ for $|h| > 1$.
Predictor: $P(X_3 \mid X_4, X_5) = a_1 X_4 + a_2 X_5$
Orthogonality conditions:
$$\mathbb{E}[(X_3 - a_1 X_4 - a_2 X_5)X_4] = 0 \implies \gamma(1) = a_1 \gamma(0) + a_2 \gamma(1)$$

$$\mathbb{E}[(X_3 - a_1 X_4 - a_2 X_5)X_5] = 0 \implies \gamma(2) = a_1 \gamma(1) + a_2 \gamma(0)$$

Since $\gamma(2) = 0$:
$$\sigma^2\theta = a_1 \sigma^2(1+\theta^2) + a_2 \sigma^2\theta$$

$$0 = a_1 \sigma^2\theta + a_2 \sigma^2(1+\theta^2)$$

From the second equation: $a_2 = -\frac{a_1 \theta}{1+\theta^2}$
Substitute into the first: $\theta = a_1\left(1+\theta^2 - \frac{\theta^2}{1+\theta^2}\right) = a_1 \frac{1+\theta^2+\theta^4}{1+\theta^2}$, which gives
$$a_1 = \frac{\theta(1+\theta^2)}{1+\theta^2+\theta^4}, \qquad a_2 = -\frac{\theta^2}{1+\theta^2+\theta^4}$$
MSE: $\gamma(0) - a_1\gamma(1) - a_2\gamma(2) = \sigma^2(1+\theta^2) - a_1\sigma^2\theta = \frac{\sigma^2(1+\theta^2)(1+\theta^4)}{1+\theta^2+\theta^4}$
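The worked example can be checked numerically by solving the $2 \times 2$ system directly. A minimal sketch, assuming illustrative values $\theta = 0.5$, $\sigma^2 = 1$ (the variable names are ours, not from the notes):

```python
import numpy as np

# Illustrative parameters (assumptions, not from the notes)
theta, sigma2 = 0.5, 1.0
g0 = sigma2 * (1 + theta**2)   # gamma(0)
g1 = sigma2 * theta            # gamma(1); gamma(2) = 0 for MA(1)

# Orthogonality equations in matrix form: Gamma a = (gamma(1), gamma(2))
Gamma = np.array([[g0, g1],
                  [g1, g0]])
b = np.array([g1, 0.0])
a1, a2 = np.linalg.solve(Gamma, b)

mse = g0 - a1 * g1             # the gamma(2) term vanishes
print(a1, a2, mse)
```

The solver output should match the closed forms $a_1 = \theta(1+\theta^2)/(1+\theta^2+\theta^4)$ and $a_2 = -\theta^2/(1+\theta^2+\theta^4)$ derived above.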
ARMA(1,1) Prediction
For $X_t - \phi X_{t-1} = Z_t + \theta Z_{t-1}$, the approximation for large $n$:
$$P_n X_{n+1} \approx \phi X_n + \theta(X_n - P_{n-1}X_n)$$This recursive formula uses the one-step prediction error from the previous step.
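A minimal simulation sketch of this recursion (the parameter values, seed, and series length below are arbitrary choices for illustration):

```python
import numpy as np

# Sketch: recursive one-step ARMA(1,1) prediction via
#   P_n X_{n+1} ~ phi*X_n + theta*(X_n - P_{n-1} X_n)
# phi, theta, sigma, and the simulated series are illustrative assumptions.
rng = np.random.default_rng(0)
phi, theta, sigma, n = 0.5, 0.4, 1.0, 500

z = rng.normal(0.0, sigma, n)
x = np.zeros(n)
for t in range(1, n):
    # simulate X_t = phi*X_{t-1} + Z_t + theta*Z_{t-1}
    x[t] = phi * x[t - 1] + z[t] + theta * z[t - 1]

pred = np.zeros(n)  # pred[t] approximates P_{t-1} X_t
for t in range(1, n):
    pred[t] = phi * x[t - 1] + theta * (x[t - 1] - pred[t - 1])

# After a short transient the one-step errors behave like the innovations Z_t
err = x[1:] - pred[1:]
```

Empirically the mean squared one-step error should be close to $\sigma^2$, and well below the variance of $X_t$ itself, which is what the recursion is meant to achieve.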
Exam Checklist
- Write the ACVF of the given process
- Set up $P = a_1 X_{i_1} + \cdots + a_k X_{i_k}$
- Write $k$ orthogonality equations
- Expand each equation using $\gamma_X$ values
- Solve the $k \times k$ linear system for $a_1, \dots, a_k$
- Compute MSE $= \gamma(0) - \sum a_m \gamma(n+h-i_m)$
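The checklist above translates directly into code. A hedged sketch of the general procedure (the function name `best_linear_predictor` and its interface are our own, not from the notes):

```python
import numpy as np

def best_linear_predictor(gamma, obs, target):
    """Solve Gamma a = gamma_vec for the BLP coefficients and its MSE.

    gamma  : function h -> ACVF value gamma_X(h)
    obs    : observed time indices [i_1, ..., i_k]
    target : time index n + h being predicted
    """
    k = len(obs)
    # Gamma[j, m] = gamma(i_m - i_j); right-hand side b[j] = gamma(target - i_j)
    G = np.array([[gamma(obs[m] - obs[j]) for m in range(k)] for j in range(k)])
    b = np.array([gamma(target - obs[j]) for j in range(k)])
    a = np.linalg.solve(G, b)
    mse = gamma(0) - a @ b      # gamma(0) - sum_m a_m gamma(target - i_m)
    return a, mse

# MA(1) example from above with theta = 0.5, sigma^2 = 1 (illustrative values)
theta = 0.5
acvf = lambda h: {0: 1 + theta**2, 1: theta}.get(abs(h), 0.0)
a, mse = best_linear_predictor(acvf, [4, 5], 3)
```

Running the MA(1) case through this function reproduces the coefficients and MSE from the worked example, which is a useful self-check on both the algebra and the code.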