Modeling and Reasoning with Bayesian Networks


UCLA CS 262A

Scott Mueller
Chapter 3


Date created: Dec 31, 2021

Cards (2)

Conditional entropy

Front

Quantifies the average uncertainty about \(X\) that remains after observing \(Y\):

$$\text{ENT}(X|Y) = \sum_y \Pr(y) \, \text{ENT}(X|y),$$

where

$$\text{ENT}(X|y) = -\sum_x \Pr(x|y) \log_2 \Pr(x|y).$$

Back
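The definition above can be sketched in Python. This is a minimal illustration, not from the book: the function names and the dict-based representation of the joint distribution \(\Pr(x, y)\) are my own choices.

```python
import math

def ent(probs):
    """Shannon entropy in bits of a distribution given as an iterable of
    probabilities; zero-probability terms are skipped (0 log 0 = 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def conditional_entropy(joint):
    """ENT(X|Y) = sum_y Pr(y) ENT(X|y), where `joint` maps (x, y) -> Pr(x, y)."""
    # Marginalize to get Pr(y).
    pr_y = {}
    for (x, y), p in joint.items():
        pr_y[y] = pr_y.get(y, 0.0) + p
    # Weight the entropy of each conditional distribution Pr(X|y) by Pr(y).
    total = 0.0
    for y0, py in pr_y.items():
        if py == 0:
            continue
        cond = [p / py for (x, y), p in joint.items() if y == y0]
        total += py * ent(cond)
    return total
```

For two independent fair coins, observing \(Y\) tells us nothing about \(X\), so `conditional_entropy` returns 1 bit; for perfectly correlated coins it returns 0.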

Entropy

Front

Uncertainty about a variable \(X\) is quantified using entropy:

$$\text{ENT}(X) = -\sum_x \Pr(x) \log_2 \Pr(x),$$

where \(0 \log 0 = 0\) by convention.

Back
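The entropy formula is a one-liner in Python. A minimal sketch (the function name is my own; probabilities are passed as a plain list):

```python
import math

def ent(probs):
    """ENT(X) = -sum_x Pr(x) log2 Pr(x), in bits.
    Zero-probability terms are skipped, implementing the 0 log 0 = 0 convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

A fair coin has 1 bit of entropy, a uniform four-valued variable has 2 bits, and a deterministic variable has 0.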