Autocovariance Function (ACVF)

The autocovariance function measures the linear dependence between values of a stationary time series at different lags.

1. Definition

For a weakly stationary process $\{X_t\}$:

$$\gamma_X(h) = \text{Cov}(X_t, X_{t+h})$$

Since stationarity ensures the covariance depends only on the lag $h$ (not on $t$), this is well-defined.

2. Properties

  • $\gamma_X(0) = \text{Var}(X_t) \geq 0$
  • $\gamma_X(h) = \gamma_X(-h)$ (symmetric)
  • $|\gamma_X(h)| \leq \gamma_X(0)$
  • $\gamma_X(h)$ can be negative
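These properties can be sanity-checked numerically. A minimal sketch (the function name `acvf_ar1` is introduced here for illustration), using the AR(1) ACVF from the examples below with $\phi = -0.8$, which alternates in sign and so also illustrates that $\gamma_X(h)$ can be negative:

```python
# Check the listed ACVF properties on a concrete example: the AR(1)
# process X_t = phi X_{t-1} + Z_t, whose ACVF (see the AR(1) example
# below) is gamma(h) = sigma2 * phi**|h| / (1 - phi**2).

def acvf_ar1(h: int, phi: float = -0.8, sigma2: float = 1.0) -> float:
    """Theoretical AR(1) autocovariance at lag h."""
    return sigma2 * phi ** abs(h) / (1 - phi ** 2)

# gamma(0) = Var(X_t) >= 0
assert acvf_ar1(0) >= 0
# symmetry: gamma(h) = gamma(-h)
assert all(acvf_ar1(h) == acvf_ar1(-h) for h in range(5))
# bound: |gamma(h)| <= gamma(0)
assert all(abs(acvf_ar1(h)) <= acvf_ar1(0) for h in range(5))
# negative values do occur: with phi = -0.8, gamma(1) < 0
assert acvf_ar1(1) < 0
```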

3. Key Examples

White Noise $\{Z_t\} \sim \text{WN}(0, \sigma^2)$

$$\gamma_Z(h) = \begin{cases} \sigma^2 & h = 0 \\ 0 & h \neq 0 \end{cases}$$

AR(1): $X_t = \phi X_{t-1} + Z_t$ with $\{Z_t\} \sim \text{WN}(0, \sigma^2)$ and $|\phi| < 1$ (so a stationary solution exists)

$$\gamma_X(h) = \frac{\sigma^2 \phi^{|h|}}{1 - \phi^2}$$
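This closed form can be cross-checked against the recursions it must satisfy: multiplying $X_t = \phi X_{t-1} + Z_t$ by $X_{t-h}$ and taking expectations gives $\gamma_X(h) = \phi\,\gamma_X(h-1)$ for $h \geq 1$, and at lag zero $\gamma_X(0) = \phi\,\gamma_X(1) + \sigma^2$. A quick numerical check (illustrative parameter values, not from the notes):

```python
import math

phi, sigma2 = 0.6, 1.0  # example parameters with |phi| < 1

def gamma(h: int) -> float:
    """Closed-form AR(1) ACVF from the formula above."""
    return sigma2 * phi ** abs(h) / (1 - phi ** 2)

# Recursion gamma(h) = phi * gamma(h-1) for h >= 1
assert all(math.isclose(gamma(h), phi * gamma(h - 1)) for h in range(1, 6))
# Lag-0 equation: gamma(0) = phi * gamma(1) + sigma2
assert math.isclose(gamma(0), phi * gamma(1) + sigma2)
```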

MA(1): $X_t = Z_t + \theta Z_{t-1}$, $\{Z_t\} \sim \text{WN}(0, \sigma^2)$

$$\gamma_X(h) = \begin{cases} \sigma^2(1 + \theta^2) & h = 0 \\ \sigma^2 \theta & |h| = 1 \\ 0 & |h| > 1 \end{cases}$$
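A Monte Carlo sketch of this formula (the simulation size, seed, and the sample-ACVF estimator are choices made here, not part of the notes): simulate a long MA(1) path and compare the sample autocovariances with the theoretical values.

```python
import numpy as np

theta, sigma2 = 0.5, 1.0
rng = np.random.default_rng(0)
n = 200_000
z = rng.normal(0.0, np.sqrt(sigma2), n + 1)
x = z[1:] + theta * z[:-1]  # MA(1): X_t = Z_t + theta * Z_{t-1}

def sample_acvf(x: np.ndarray, h: int) -> float:
    """Sample autocovariance at lag h >= 0 (mean subtracted)."""
    xc = x - x.mean()
    return float(np.dot(xc[: len(x) - h], xc[h:]) / len(x))

theory = {0: sigma2 * (1 + theta ** 2), 1: sigma2 * theta, 2: 0.0}
for h, g in theory.items():
    # loose tolerance for Monte Carlo error at this sample size
    assert abs(sample_acvf(x, h) - g) < 0.02
```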

4. ACVF of a Linear Combination

If $X_t = \sum_{j} a_j Y_{t-j}$ with $\sum_j |a_j| < \infty$, where $\{Y_t\}$ is stationary with ACVF $\gamma_Y$, then

$$\gamma_X(h) = \sum_i \sum_j a_i a_j \gamma_Y(h + i - j)$$
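One way to trust the double sum is to recover a known special case: with $a_0 = 1$, $a_1 = \theta$ and $\{Y_t\}$ white noise, it should reproduce the MA(1) ACVF from the examples above. A small sketch (parameter values chosen for illustration):

```python
# Verify the double-sum formula in the MA(1) special case:
# a_0 = 1, a_1 = theta, Y_t = Z_t ~ WN(0, sigma2).
theta, sigma2 = 0.5, 1.0
a = {0: 1.0, 1: theta}

def gamma_wn(k: int) -> float:
    """White-noise ACVF: sigma2 at lag 0, zero otherwise."""
    return sigma2 if k == 0 else 0.0

def gamma_lincomb(h: int) -> float:
    """gamma_X(h) = sum_i sum_j a_i a_j gamma_Y(h + i - j)."""
    return sum(a[i] * a[j] * gamma_wn(h + i - j) for i in a for j in a)

# Matches the MA(1) ACVF case by case
assert gamma_lincomb(0) == sigma2 * (1 + theta ** 2)
assert gamma_lincomb(1) == sigma2 * theta
assert gamma_lincomb(-1) == sigma2 * theta  # symmetry, for free
assert gamma_lincomb(2) == 0.0
```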

5. Zero-Mean Simplification

If $\mu_X = 0$, then

$$\gamma_X(0) = \mathbb{E}[X_t^2], \qquad \gamma_X(h) = \mathbb{E}[X_t X_{t+h}]$$

This form is often the quickest route for direct ACVF computations in exam questions.
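For instance, the lag-one value of the MA(1) example follows in one line from this form:

$$\gamma_X(1) = \mathbb{E}[X_t X_{t+1}] = \mathbb{E}\big[(Z_t + \theta Z_{t-1})(Z_{t+1} + \theta Z_t)\big] = \theta\,\mathbb{E}[Z_t^2] = \theta \sigma^2,$$

since every cross-term pairs white-noise variables at different times and therefore has expectation zero.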

6. Why It Matters

Knowing $\gamma_X(h)$ for all $h$ determines the covariance structure of a weakly stationary series, which feeds directly into $\Gamma_n$, best linear prediction, and prediction interval calculations.

Interactive reference: Theoretical ACF / PACF simulator for AR/MA models (standalone page).