Autoregressive Process
An autoregressive process of order $p$ expresses the current value as a linear combination of $p$ past values plus white noise.
1. Definition
$\{X_t\}$ is an AR($p$) process if
$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \cdots + \phi_p X_{t-p} + Z_t$$
where $\{Z_t\} \sim \text{WN}(0, \sigma^2)$.
Using the Backshift Operator, this is written compactly as
$$\phi(B)X_t = Z_t, \quad \phi(B) = 1 - \phi_1 B - \phi_2 B^2 - \cdots - \phi_p B^p$$
Here $\phi_1, \dots, \phi_p$ are the AR coefficients and $p$ is the order of the model.
2. AR(1) Special Case
$$X_t = \phi X_{t-1} + Z_t$$
Stationarity condition: $|\phi| < 1$.
ACVF: $\gamma_X(h) = \frac{\sigma^2 \phi^h}{1 - \phi^2}$ for $h \geq 0$.
ACF: $\rho_X(h) = \phi^h$, decaying geometrically in $h$ (with damped oscillation if $\phi < 0$).
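The geometric-decay prediction $\rho_X(h) = \phi^h$ is easy to verify empirically. A minimal sketch (the simulation length and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1): X_t = phi * X_{t-1} + Z_t, with Z_t ~ WN(0, sigma^2)
phi, sigma, n = 0.7, 1.0, 100_000
z = rng.normal(0.0, sigma, size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + z[t]

def sample_acf(x, h):
    """Sample autocorrelation at lag h >= 1 (biased ACVF estimator)."""
    xc = x - x.mean()
    return np.dot(xc[:-h], xc[h:]) / np.dot(xc, xc)

# Theory predicts rho(h) = phi^h; the sample ACF should track it closely
for h in (1, 2, 5):
    print(h, sample_acf(x, h), phi**h)
```

With $n = 100{,}000$ the sample ACF matches $\phi^h$ to roughly two decimal places at each lag.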
3. Causality and Stationarity
The AR($p$) process is causal, and therefore weakly stationary, if and only if all roots of the characteristic polynomial
$$\phi(z) = 1 - \phi_1 z - \phi_2 z^2 - \cdots - \phi_p z^p = 0$$
lie outside the unit circle, i.e. $|z| > 1$.
If the root condition fails, the process may be non-stationary or admit only a non-causal stationary representation.
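The root condition can be checked numerically. A minimal sketch using `numpy.roots` (the helper name `is_causal` and the example coefficients are illustrative, not from the text):

```python
import numpy as np

def is_causal(phis):
    """Check causality of an AR(p) with coefficients phi_1, ..., phi_p:
    all roots of phi(z) = 1 - phi_1 z - ... - phi_p z^p must satisfy |z| > 1."""
    # np.roots wants coefficients ordered from highest degree down to the constant
    coeffs = [-c for c in phis[::-1]] + [1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

print(is_causal([0.5, 0.3]))  # AR(2) with roots outside the unit circle
print(is_causal([1.1]))       # AR(1) with |phi| > 1: root at 1/1.1, inside
```

For AR(1) the root of $1 - \phi z$ is $z = 1/\phi$, so $|z| > 1$ recovers exactly the condition $|\phi| < 1$.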
4. ACF Behavior
- AR(1): ACF decays exponentially as $\phi^h$
- AR($p$): ACF decays as a mixture of exponentials/damped oscillations
- Key identifier: ACF tails off gradually (contrast with Moving Average Process where ACF cuts off)
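The tails-off/cuts-off contrast in the list above can be seen side by side in simulation. A sketch with arbitrary parameter choices ($\phi = \theta = 0.7$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
z = rng.normal(size=n + 1)

# AR(1): X_t = 0.7 X_{t-1} + Z_t  -> ACF tails off as 0.7^h
ar = np.zeros(n)
for t in range(1, n):
    ar[t] = 0.7 * ar[t - 1] + z[t]

# MA(1): X_t = Z_t + 0.7 Z_{t-1}  -> ACF cuts off after lag 1
ma = z[1:] + 0.7 * z[:-1]

def acf(x, h):
    """Sample autocorrelation at lag h >= 1."""
    xc = x - x.mean()
    return np.dot(xc[:-h], xc[h:]) / np.dot(xc, xc)

for h in (1, 2, 3):
    print(h, round(acf(ar, h), 3), round(acf(ma, h), 3))
```

The AR column stays visibly nonzero at every lag, while the MA column drops to approximately zero from lag 2 onward.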
5. Parameter Estimation
For AR(1), the method-of-moments (Yule–Walker) estimates follow directly from the sample ACVF:
$$\hat{\phi}_1 = \frac{\hat{\gamma}_X(1)}{\hat{\gamma}_X(0)}, \quad \hat{\sigma}^2 = \hat{\gamma}_X(0)(1 - \hat{\phi}_1^2)$$
Alternatively, fit via OLS regression of $X_t$ on $X_{t-1}$.
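Both estimators above can be computed in a few lines. A minimal sketch (true parameter values and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
phi_true, sigma_true, n = 0.6, 1.5, 100_000

# Simulate an AR(1) to estimate from
z = rng.normal(0.0, sigma_true, size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + z[t]

def gamma_hat(x, h):
    """Sample ACVF at lag h (biased estimator, divides by n)."""
    xc = x - x.mean()
    return np.dot(xc[: len(xc) - h], xc[h:]) / len(xc)

# Method-of-moments estimates from the sample ACVF
phi_hat = gamma_hat(x, 1) / gamma_hat(x, 0)
sigma2_hat = gamma_hat(x, 0) * (1 - phi_hat**2)

# OLS alternative: regress X_t on X_{t-1}
phi_ols = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])

print(phi_hat, sigma2_hat, phi_ols)
```

For a (mean-zero) AR(1) the two estimators are nearly identical: both are ratios of lag-1 to lag-0 sums of products, differing only in edge terms and mean-centering.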