Definition
The Kullback–Leibler divergence from $q(z)$ to the posterior $p(z\mid x)$ is
$$D_{\text{KL}}(q \parallel p) = \int q(z) \log \frac{q(z)}{p(z\mid x)}\, dz = \mathbb{E}_{q}\left[\log q(z) - \log p(z\mid x)\right]$$
Key property
$$D_{\text{KL}}(q\|p) \ge 0, \quad D_{\text{KL}}(q\|p)=0 \iff q=p.$$
In variational inference, the KL divergence is used as the objective function: it measures how closely the approximating distribution $q(z)$ matches the true posterior $p(z\mid x)$.
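A minimal sketch of the discrete case, illustrating the nonnegativity property and the fact that KL is not symmetric (the example distributions are made up for illustration):

```python
import math

def kl_divergence(q, p):
    """Discrete KL divergence D_KL(q || p) for probability vectors q and p.

    Assumes q and p each sum to 1 and that p[i] > 0 wherever q[i] > 0.
    Terms with q[i] == 0 contribute 0, by the convention 0 * log 0 = 0.
    """
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

# Hypothetical distributions over three outcomes.
q = [0.5, 0.3, 0.2]
p = [0.4, 0.4, 0.2]

print(kl_divergence(q, p))  # > 0, since q != p
print(kl_divergence(q, q))  # exactly 0 when the distributions coincide
```

Note that `kl_divergence(q, p) != kl_divergence(p, q)` in general, which is why the choice between $D_{\text{KL}}(q\|p)$ and $D_{\text{KL}}(p\|q)$ (I-projection vs. M-projection) matters in variational inference.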
Related
- Variational Inference
- I-projection
- ELBO
- STA414 W6+A3