Entropy
- Entropy is the expected value of surprise
- The lower the probability,
- the higher the surprise,
- and hence the higher the entropy
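A quick numeric sketch of the surprise term $\log_2 \frac{1}{P(x)}$ (the probabilities used here are illustrative, not from the note):

```python
import math

# Surprise of a single outcome x is log2(1 / P(x)).
# The probabilities below are illustrative values, not taken from the note.
for p in (0.5, 0.1, 0.01):
    print(f"P(x) = {p:<4} -> surprise = {math.log2(1 / p):.2f} bits")
```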
$$
\begin{align*}
E(\text{surprise}) &= \sum P(x) \log_2 \frac{1}{P(x)} \\
&= \sum P(x) \left( \log_2 1 - \log_2 P(x) \right) \\
&= \sum P(x) \left( 0 - \log_2 P(x) \right) \\
&= \sum -P(x) \log_2 P(x) \\
&= \text{Entropy}
\end{align*}
$$
>[!def] Entropy Formula
>$$
> \text{Entropy} = \sum -P(x) \log_2 P(x)
>$$
- The base-2 log is used here because there are 2 classes
- For N classes, we would use $\log_N$
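A minimal sketch of the formula above as code (the function name `entropy` and the `base` parameter are my own labels; base 2 matches the two-class case, base N the N-class remark above):

```python
import math

def entropy(probs, base=2):
    """Entropy = sum of -P(x) * log_base(P(x)) over all outcomes.

    `base` follows the remark above: base 2 for two classes,
    base N for N classes. Terms with P(x) = 0 are skipped
    (the usual 0 * log 0 = 0 convention).
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)
```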
>[!question] How can entropy be used?
Entropy quantifies how balanced or imbalanced the class distribution is (checked numerically below):
- Low entropy means the classes are highly imbalanced, e.g., 1 sample of class A and 100 of class B
- High entropy means the classes are nearly balanced, e.g., 49 samples of class A and 51 of class B
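Checking the two examples above numerically (a self-contained sketch; the class counts are converted to probabilities before applying the formula):

```python
import math

def entropy(probs, base=2):
    # Same formula as the definition above: sum of -P(x) * log_base(P(x)).
    return -sum(p * math.log(p, base) for p in probs if p > 0)

for counts in ([1, 100], [49, 51]):
    total = sum(counts)
    h = entropy([c / total for c in counts])
    print(counts, "->", round(h, 2), "bits")

# [1, 100] -> 0.08 bits  (highly imbalanced, low entropy)
# [49, 51] -> 1.0  bits  (nearly balanced, high entropy)
```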