Fisher Information
Definition
The Fisher information of a parameter $\theta$ in a model $p(x \mid \theta)$ is
$$ \mathcal{I}(\theta) = \mathbb{E}\left[\left(\frac{\partial}{\partial \theta} \log p(X \mid \theta)\right)^2\right] = -\mathbb{E}\left[\frac{\partial^2}{\partial \theta^2} \log p(X \mid \theta)\right]. $$
Interpretation
Fisher information measures how much information a single observation $X$ carries about $\theta$: the larger $\mathcal{I}(\theta)$, the more sharply the log-likelihood curves around the true parameter value, and the more precisely $\theta$ can be estimated.
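As a quick numerical sanity check of the two equivalent forms of the definition, the sketch below (using a Bernoulli model as an illustrative choice, not one taken from the text) computes both expectations exactly and compares them with the closed form $\mathcal{I}(p) = 1/\bigl(p(1-p)\bigr)$:

```python
# Fisher information of a Bernoulli(p) model, computed three ways.
# Score: d/dp log p(x|p) = x/p - (1-x)/(1-p), so the score is 1/p at x=1
# and -1/(1-p) at x=0. Expectations over x are exact finite sums.
p = 0.3

# Form 1: E[(d/dp log p(X|p))^2]
I_score = p * (1 / p) ** 2 + (1 - p) * (1 / (1 - p)) ** 2

# Form 2: -E[d^2/dp^2 log p(X|p)], where the second derivative is
# -x/p^2 - (1-x)/(1-p)^2
I_curv = -(p * (-1 / p**2) + (1 - p) * (-1 / (1 - p) ** 2))

# Closed form: 1 / (p (1 - p))
I_closed = 1 / (p * (1 - p))

print(I_score, I_curv, I_closed)  # all three agree
```

All three quantities coincide (here $1/0.21 \approx 4.762$), illustrating the equality of the score-variance and negative-curvature forms.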
Cramér-Rao Lower Bound
For any unbiased estimator $\hat{\theta}$ based on $n$ i.i.d. observations:
$$ \text{Var}(\hat{\theta}) \geq \frac{1}{n\,\mathcal{I}(\theta)}. $$
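The bound can be seen at work in simulation. The sketch below (an illustrative setup, not from the text) estimates the mean $\mu$ of a normal model with known $\sigma$, where $\mathcal{I}(\mu) = 1/\sigma^2$, so the bound is $\sigma^2/n$. The sample mean is unbiased and attains the bound:

```python
import numpy as np

# Normal(mu, sigma^2) with known sigma: I(mu) = 1/sigma^2,
# so the CRLB for n observations is sigma^2 / n.
rng = np.random.default_rng(0)
mu, sigma, n, trials = 1.0, 2.0, 50, 20000

# Draw `trials` independent datasets of size n and estimate mu
# with the sample mean in each.
samples = rng.normal(loc=mu, scale=sigma, size=(trials, n))
estimates = samples.mean(axis=1)

emp_var = estimates.var()   # empirical variance of the estimator
crlb = sigma**2 / n         # = 1 / (n * I(mu)) = 0.08

print(emp_var, crlb)  # empirical variance matches the bound
```

Here the sample mean is an efficient estimator, so the empirical variance sits right at the bound; for estimators that are merely unbiased, the variance may exceed it.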