Note 9

  1. Explain why the factor model is equivalent to

    $$z \sim \mathcal{N}(0, I)$$

    $$x|z \sim \mathcal{N}(Lz, \Psi)$$
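
A minimal sketch of one direction, assuming the factor model was introduced as $x = Lz + \epsilon$ with $\epsilon \sim \mathcal{N}(0, \Psi)$ independent of $z$: conditioning on $z$ leaves $\epsilon$ as the only source of randomness, so

$$E(x|z) = Lz + E(\epsilon) = Lz, \qquad \text{Cov}(x|z) = \text{Cov}(\epsilon) = \Psi,$$

and since $x$ given $z$ is a constant plus a Gaussian, $x|z \sim \mathcal{N}(Lz, \Psi)$.
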
  2. Show that the PVE of the $j$-th factor in FA via PCA is $\frac{\lambda_j}{\text{tr}(\Sigma)}$, where $\lambda_j$ is the $j$-th eigenvalue of $\Sigma$.
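
A minimal numeric sanity check, assuming the PCA-based fit keeps the top $r$ eigenvectors of $\Sigma$, so the $j$-th factor accounts for $\lambda_j$ out of the total variance $\text{tr}(\Sigma) = \sum_j \lambda_j$:

```python
import numpy as np

# Toy covariance: any symmetric positive semi-definite matrix will do.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
Sigma = A @ A.T

# Eigenvalues of Sigma in decreasing order.
lam = np.sort(np.linalg.eigvalsh(Sigma))[::-1]

# PVE of the j-th factor: lambda_j / tr(Sigma); the values sum to 1
# because tr(Sigma) equals the sum of its eigenvalues.
pve = lam / np.trace(Sigma)
print(pve, pve.sum())
```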

  3. Is $PVE = \frac{\|L\|_F^2}{\text{tr}(\Sigma)}$ rotation invariant? I.e., will you get the same value of PVE if you use $\tilde{L} = LQ$ (with $Q$ orthogonal) instead? Are the per-factor values $PVE_j = \frac{\sum_{i=1}^p \ell_{ij}^2}{\text{tr}(\Sigma)}$ rotation invariant?
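
A quick numeric check, assuming $Q$ is orthogonal (so $QQ^T = I$ and $\|LQ\|_F^2 = \text{tr}(LQQ^TL^T) = \|L\|_F^2$):

```python
import numpy as np

rng = np.random.default_rng(1)
L = rng.normal(size=(5, 2))                   # p = 5 variables, r = 2 factors
Q, _ = np.linalg.qr(rng.normal(size=(2, 2)))  # random orthogonal Q
L_tilde = L @ Q

# Total PVE numerator ||L||_F^2 is unchanged by L -> LQ ...
print(np.linalg.norm(L, "fro")**2, np.linalg.norm(L_tilde, "fro")**2)

# ... but the per-factor column sums sum_i l_ij^2 generally differ.
print((L**2).sum(axis=0), (L_tilde**2).sum(axis=0))
```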

  4. An FA solution is not guaranteed to exist (a numeric check follows the bullets below). Consider:

$$\Sigma = \begin{pmatrix} 1 & 0.9 & 0.7 \\ 0.9 & 1 & 0.4 \\ 0.7 & 0.4 & 1 \end{pmatrix} \quad \text{for} \quad x = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix}$$

We want to use $r=1$, i.e. find $L = \begin{pmatrix} \ell_{11} \\ \ell_{21} \\ \ell_{31} \end{pmatrix} \in \mathbb{R}^{3 \times 1}$ and $\Psi = \text{diag}(\psi_1, \psi_2, \psi_3)$ such that $\Sigma = LL^T + \Psi$.

  • Show that $\ell_{11}\ell_{21} = 0.9$, $\ell_{21}\ell_{31} = 0.4$, $\ell_{11}\ell_{31} = 0.7$
  • Find the value of $\ell_{11}$
  • Show that $\ell_{11} = \text{Cor}(x_1, z_1)$ and explain why there is no solution to $\Sigma = LL^T + \Psi$
  • Find the value of $\psi_1$
  • Show that it is impossible to have $\psi_1 = \text{var}(\epsilon_1)$
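
A numeric check of the contradiction, using only the equations from the bullets above:

```python
import numpy as np

# Off-diagonal equations from Sigma = L L^T + Psi (Psi is diagonal,
# so it only affects the diagonal): l11*l21 = 0.9, l21*l31 = 0.4,
# l11*l31 = 0.7.  Multiplying the first and third equations and
# dividing by the second isolates l11^2.
l11_sq = 0.9 * 0.7 / 0.4
l11 = np.sqrt(l11_sq)

# Diagonal equation for the first variable: 1 = l11^2 + psi_1.
psi1 = 1.0 - l11_sq

print(l11)   # ~1.255 > 1: impossible for the correlation Cor(x1, z1)
print(psi1)  # ~-0.575 < 0: impossible for the variance var(eps_1)
```
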
  5. Finding scores via the conditional distribution (a sketch of the resulting formula follows the bullets). Consider $y = \begin{pmatrix} x \\ z \end{pmatrix}$

    • Find joint distribution for $y$
    • Find conditional distribution $z|x$
    • Explain how to use the formula for $E(z|x)$ to find the scores $z_1, \dots, z_n$
    • Why is $E(z|x)$ a good estimate?
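
A minimal sketch of the resulting score computation, assuming centered data and that the conditional-distribution derivation gives the posterior mean $E(z|x) = L^T (LL^T + \Psi)^{-1} x$; the helper name `fa_scores` is just for illustration:

```python
import numpy as np

def fa_scores(X, L, psi):
    """Posterior-mean scores E(z|x) for each (centered) row of X.

    X: (n, p) data, L: (p, r) loadings, psi: (p,) uniquenesses.
    Applies E(z|x) = L^T (L L^T + Psi)^{-1} x row by row.
    """
    Sigma = L @ L.T + np.diag(psi)
    W = L.T @ np.linalg.inv(Sigma)  # (r, p) weight matrix
    return X @ W.T                  # (n, r): scores z_1 ... z_n

# Toy example: simulate from the factor model, then compute scores.
rng = np.random.default_rng(2)
L = rng.normal(size=(6, 2))
psi = rng.uniform(0.1, 0.5, size=6)
Z = rng.normal(size=(100, 2))
X = Z @ L.T + rng.normal(size=(100, 6)) * np.sqrt(psi)
print(fa_scores(X, L, psi)[:3])
```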

Note 10

  1. Assume that $P \in \mathbb{R}^{p \times p}$ is a permutation matrix that swaps the $i$-th and $j$-th elements of a vector in $\mathbb{R}^p$ (a numeric sketch follows the bullets), i.e.

    If $x = \begin{pmatrix} x_1 \\ \vdots \\ x_i \\ \vdots \\ x_j \\ \vdots \\ x_p \end{pmatrix}$ then $Px = \begin{pmatrix} x_1 \\ \vdots \\ x_j \\ \vdots \\ x_i \\ \vdots \\ x_p \end{pmatrix}$

    • Write down $P$ explicitly

    • Write down $P^{-1}$ explicitly

      (Hint: note that $P^{-1}(Px) = P^{-1} \begin{pmatrix} x_1 \\ \vdots \\ x_j \\ \vdots \\ x_i \\ \vdots \\ x_p \end{pmatrix} = x = \begin{pmatrix} x_1 \\ \vdots \\ x_i \\ \vdots \\ x_j \\ \vdots \\ x_p \end{pmatrix}$)

    • If $z = \begin{pmatrix} z_1 \\ \vdots \\ z_p \end{pmatrix}$ and $L = (\ell_1 \dots \ell_p)$, write out $\tilde{z} = Pz$ and $\tilde{L} = L P^{-1}$ explicitly in terms of the entries $z_k$ and columns $\ell_k$

    • Using equation $Lz = \ell_1 z_1 + \dots + \ell_p z_p$ show that $\tilde{L}\tilde{z} = Lz$
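
A small numeric sketch of the whole exercise, with $p = 4$ and an arbitrary choice of $i$ and $j$ (0-based in the code):

```python
import numpy as np

p, i, j = 4, 1, 3      # positions of the "i-th" and "j-th" entries
P = np.eye(p)
P[[i, j]] = P[[j, i]]  # P is the identity with rows i and j swapped

print(P)
print(np.allclose(np.linalg.inv(P), P.T))  # here P^{-1} = P^T (= P)

rng = np.random.default_rng(3)
L = rng.normal(size=(4, p))  # columns l_1 ... l_p
z = rng.normal(size=p)
L_tilde, z_tilde = L @ np.linalg.inv(P), P @ z
print(np.allclose(L_tilde @ z_tilde, L @ z))  # tilde-L tilde-z = L z
```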

  2. Uncorrelated $\nRightarrow$ independent (an exact check follows the bullets). Consider the random vector $x = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}$ with joint distribution

    $$x = \begin{cases} (0, 1) \\ (0, -1) \\ (1, 0) \\ (-1, 0) \end{cases} \text{ each with probability } 1/4$$
    • Show that $\text{cor}(x_1, x_2) = 0$
    • Find marginal distributions $f_{x_1}(x_1), f_{x_2}(x_2)$
    • Show that $x_1$ and $x_2$ are not independent, i.e. $f_x(x_1, x_2) \neq f_{x_1}(x_1) \cdot f_{x_2}(x_2)$
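
An exact check over the four equally likely outcomes (no simulation needed):

```python
import numpy as np

pts = np.array([(0, 1), (0, -1), (1, 0), (-1, 0)], dtype=float)
prob = 0.25  # each outcome is equally likely

# Both means are 0, so cor(x1, x2) = 0 iff E[x1 * x2] = 0.
print((pts[:, 0] * pts[:, 1]).sum() * prob)  # E[x1 * x2] = 0

# Marginally, each of x1 and x2 takes -1, 0, 1 with probabilities
# 1/4, 1/2, 1/4.  Independence would require
# P(x1 = 0, x2 = 0) = 1/2 * 1/2 = 1/4, but (0, 0) never occurs.
```
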
  3. If $y_1 \sim (\mu_1, \sigma_1^2)$ and $y_2 \sim (\mu_2, \sigma_2^2)$ are independent, derive that

    $$\mathcal{K}(y_1 + y_2) = \frac{\sigma_1^4 \mathcal{K}(y_1) + \sigma_2^4 \mathcal{K}(y_2)}{(\sigma_1^2 + \sigma_2^2)^2}$$
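
A Monte Carlo sanity check, reading $\mathcal{K}$ as excess kurtosis (the form for which this identity holds); `scipy.stats.kurtosis` returns excess kurtosis by default:

```python
import numpy as np
from scipy.stats import kurtosis  # Fisher (excess) kurtosis by default

rng = np.random.default_rng(4)
n = 1_000_000
y1 = rng.uniform(-2, 2, size=n)          # uniform: K(y1) = -1.2
y2 = rng.exponential(scale=1.5, size=n)  # exponential: K(y2) = 6

s1, s2 = y1.var(), y2.var()  # sigma_1^2 and sigma_2^2
lhs = kurtosis(y1 + y2)
rhs = (s1**2 * kurtosis(y1) + s2**2 * kurtosis(y2)) / (s1 + s2)**2
print(lhs, rhs)  # agree up to Monte Carlo error
```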