Hamiltonian Monte Carlo
Hamiltonian Monte Carlo (HMC) is an MCMC method that uses gradient information to propose moves, reducing the random-walk behavior of the standard Metropolis–Hastings algorithm.
1. Idea
Introduce an auxiliary momentum variable $v$ and define a joint distribution:
$$p(x, v) \propto \exp\left(-U(x) - \frac{1}{2}v^\top v\right)$$

where $U(x) = -\log \bar{p}(x)$ is the potential energy, $\bar{p}(x)$ is the (possibly unnormalized) target density, and $\frac{1}{2}v^\top v$ is the kinetic energy. Because the joint factorizes, marginalizing out $v$ recovers the target distribution over $x$.
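As a concrete check, consider a hypothetical standard Gaussian target, $\bar{p}(x) \propto e^{-x^\top x / 2}$, so that $U(x) = \frac{1}{2}x^\top x$. A minimal sketch showing that the log of the joint separates into an $x$-part and a $v$-part:

```python
import numpy as np

# Hypothetical example: standard Gaussian target, p-bar(x) ∝ exp(-x·x / 2).
def U(x):
    """Potential energy U(x) = -log p-bar(x)."""
    return 0.5 * x @ x

def log_joint(x, v):
    """Unnormalized log of p(x, v) ∝ exp(-U(x) - v·v / 2)."""
    return -U(x) - 0.5 * v @ v

rng = np.random.default_rng(0)
x, v = rng.normal(size=3), rng.normal(size=3)

# The log-joint is a sum of an x-term and a v-term, so integrating
# out v leaves a marginal over x proportional to p-bar(x).
assert np.isclose(log_joint(x, v), -U(x) - 0.5 * v @ v)
```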
2. Algorithm
At each iteration:
- Resample momentum: Draw $v \sim \mathcal{N}(0, I)$
- Leapfrog integration: Simulate Hamiltonian dynamics for $L$ steps with step size $\varepsilon$; each leapfrog step is a half momentum update $v \leftarrow v - \frac{\varepsilon}{2}\nabla U(x)$, a full position update $x \leftarrow x + \varepsilon v$, and another half momentum update $v \leftarrow v - \frac{\varepsilon}{2}\nabla U(x)$
- MH correction: Accept the proposal $(x', v')$ with probability $\min\{1, \exp(-H(x', v') + H(x, v))\}$, where $H(x,v) = U(x) + \frac{1}{2}v^\top v$; on rejection the chain stays at $x$. (Formally the momentum is negated to make the proposal reversible, but since $v$ is resampled at the next iteration this has no practical effect.)
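The three steps above can be sketched as follows (a minimal NumPy implementation; the target is supplied through user-provided `U` and `grad_U` functions, and the step size and path length are illustrative defaults):

```python
import numpy as np

def leapfrog(x, v, grad_U, eps, L):
    """Simulate Hamiltonian dynamics with L leapfrog steps of size eps."""
    v = v - 0.5 * eps * grad_U(x)          # initial half step for momentum
    for _ in range(L - 1):
        x = x + eps * v                    # full step for position
        v = v - eps * grad_U(x)            # full step for momentum
    x = x + eps * v
    v = v - 0.5 * eps * grad_U(x)          # final half step for momentum
    return x, v

def hmc_step(x, U, grad_U, eps=0.1, L=20, rng=None):
    """One HMC transition: resample momentum, integrate, MH-correct."""
    rng = np.random.default_rng() if rng is None else rng
    v = rng.normal(size=x.shape)           # resample momentum v ~ N(0, I)
    x_new, v_new = leapfrog(x, v, grad_U, eps, L)
    # MH correction on the joint energy H(x, v) = U(x) + 0.5 v·v
    dH = (U(x_new) + 0.5 * v_new @ v_new) - (U(x) + 0.5 * v @ v)
    if rng.uniform() < np.exp(-dH):
        return x_new                       # accept the proposal
    return x                               # reject: chain stays at x
```

For example, running this on a standard Gaussian target (`U = lambda x: 0.5 * x @ x`, `grad_U = lambda x: x`) produces samples whose mean and variance match $\mathcal{N}(0, I)$.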
3. Why It Works
Hamiltonian dynamics conserve the total energy $H(x,v)$, so with exact simulation every proposal would be accepted. The leapfrog integrator introduces a small energy error, but it is time-reversible and volume-preserving, so the simple MH correction above remains valid and the acceptance probability depends only on that error.
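A quick numerical check of this claim, again on a hypothetical standard Gaussian target (where $\nabla U(x) = x$): over a fixed trajectory length, the leapfrog energy error shrinks roughly as $O(\varepsilon^2)$, so $\min\{1, e^{-\Delta H}\}$ approaches 1 as the step size decreases.

```python
import numpy as np

def leapfrog_gaussian(x, v, eps, L):
    """Leapfrog steps for U(x) = x·x / 2, so grad U(x) = x."""
    v = v - 0.5 * eps * x
    for _ in range(L - 1):
        x = x + eps * v
        v = v - eps * x
    x = x + eps * v
    v = v - 0.5 * eps * x
    return x, v

def energy_error(eps, L):
    """|ΔH| after integrating for total time eps * L."""
    x, v = np.array([1.0, -0.5]), np.array([0.3, 0.8])
    H0 = 0.5 * (x @ x + v @ v)
    x, v = leapfrog_gaussian(x, v, eps, L)
    return abs(0.5 * (x @ x + v @ v) - H0)

# Same trajectory length (eps * L = 10), two step sizes:
# the coarser step leaves a visibly larger energy error.
print(energy_error(0.2, 50), energy_error(0.02, 500))
```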
4. Advantages over Random-Walk MH
- Proposals can travel far from the current state while maintaining high acceptance
- Explores the target distribution much more efficiently in high dimensions
- The gradient $\nabla \log \bar{p}(x)$ it requires is typically cheap to obtain via automatic differentiation