Chapter 3


Date created: Dec 31, 2021 · Last updated: 3 years ago


(2 cards)

Conditional entropy

Quantifies the average uncertainty about \(X\) that remains after observing \(Y\):

$$\text{ENT}(X|Y) = \sum_y \Pr(y)\, \text{ENT}(X|y),$$

where

$$\text{ENT}(X|y) = -\sum_x \Pr(x|y) \log_2 \Pr(x|y).$$
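The definition above is a probability-weighted average of per-outcome entropies. A minimal sketch in Python, using a hypothetical joint distribution chosen for illustration (the variable names and example numbers are assumptions, not from the cards):

```python
import math

def ent_given_y(cond_probs):
    """ENT(X|y): entropy of the conditional distribution Pr(x|y), in bits."""
    # Terms with probability 0 are skipped, matching the 0*log(0) = 0 convention.
    return -sum(p * math.log2(p) for p in cond_probs if p > 0)

def conditional_entropy(pr_y, cond_dists):
    """ENT(X|Y) = sum over y of Pr(y) * ENT(X|y)."""
    return sum(py * ent_given_y(px_given_y)
               for py, px_given_y in zip(pr_y, cond_dists))

# Hypothetical example: Y is uniform over two values.
# Given y1, X is uniform over two outcomes (1 bit of uncertainty);
# given y2, X is fully determined (0 bits).
pr_y = [0.5, 0.5]
cond_dists = [[0.5, 0.5], [1.0, 0.0]]
print(conditional_entropy(pr_y, cond_dists))  # 0.5
```

The weighted average 0.5 · 1 + 0.5 · 0 = 0.5 bits shows how observing \(Y\) can reduce, but never increase, the average uncertainty about \(X\).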

Entropy

Uncertainty about a variable \(X\) is quantified using entropy:

$$\text{ENT}(X) = -\sum_x \Pr(x) \log_2 \Pr(x),$$

where \(0 \log 0 = 0\) by convention, so outcomes with zero probability contribute nothing.
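The formula translates directly into code. A minimal sketch (the function name and example distributions are assumptions for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy ENT(X) in bits for a list of probabilities Pr(x)."""
    # Skipping zero-probability terms implements the 0*log(0) = 0 convention.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one bit of uncertainty:
print(entropy([0.5, 0.5]))  # 1.0

# A certain outcome carries none:
print(entropy([1.0, 0.0]))  # 0.0
```

Entropy is maximized by the uniform distribution and drops to zero when one outcome is certain.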