KL Divergence
- KL-Divergence = Kullback-Leibler Divergence
- At its core, KL divergence is the average log-ratio of two probability distributions
- So, starting from the log-likelihood ratio of $N$ i.i.d. samples (here with two possible outcomes, seen $N_1$ and $N_2$ times):
$$
\begin{align*}
KL &= \frac{1}{N} \log \frac{\prod_x P(X = x)}{\prod_x Q(X = x)} \\
&= \frac{1}{N} \log \frac{P_1 \cdot P_1 \cdot P_2 \cdot P_1 \cdot P_1}{Q_1 \cdot Q_1 \cdot Q_2 \cdot Q_1 \cdot Q_1} \\
&= \frac{1}{N} \log \frac{P_1^{N_1} P_2^{N_2}}{Q_1^{N_1} Q_2^{N_2}} \\
&= \frac{1}{N} \left( N_1 \log P_1 + N_2 \log P_2 - N_1 \log Q_1 - N_2 \log Q_2 \right) \\
&= P_1 \log P_1 + P_2 \log P_2 - P_1 \log Q_1 - P_2 \log Q_2 \\
&= P_1 \log \frac{P_1}{Q_1} + P_2 \log \frac{P_2}{Q_2} \\
&= \sum_i P_i \log \frac{P_i}{Q_i}
\end{align*}
$$
- The step that drops $\frac{1}{N}$ uses the fact that, for large $N$, the relative frequency $\frac{N_i}{N}$ of outcome $i$ approaches its probability $P_i$
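- As a quick numeric check (assuming, for illustration, $P = (0.8, 0.2)$ and $Q = (0.5, 0.5)$):
$$
D_{KL}(P \| Q) = 0.8 \log \frac{0.8}{0.5} + 0.2 \log \frac{0.2}{0.5} \approx 0.376 - 0.183 = 0.193 \ \text{nats}
$$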
[!def] KL Divergence = ?
For a discrete distribution:
$$
D_{KL} (P \| Q) = \sum_i P_i \log \frac{P_i}{Q_i}
$$
For a continuous distribution:
$$
D_{KL} (P \| Q) = \int P(x) \log \frac{P(x)}{Q(x)} \, dx
$$
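A minimal sketch of the discrete formula in Python (the helper name and the example values of `P` and `Q` are illustrative assumptions, not from the note):

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) = sum_i P_i * log(P_i / Q_i), in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with P_i = 0 contribute 0, by the convention 0 * log 0 = 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Illustrative two-outcome distributions (same P_1 = 0.8, P_2 = 0.2 as the numeric check above)
P = [0.8, 0.2]
Q = [0.5, 0.5]
print(kl_divergence(P, Q))  # ~0.193 nats
print(kl_divergence(Q, P))  # ~0.223 nats -- KL divergence is not symmetric
```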