conditional entropy
Definition (Discrete)
Let $(\Omega, \mathcal{F}, \mu)$ be a discrete probability space, and let $X$ and $Y$ be discrete random variables on $\Omega$.
The conditional entropy $H[X|Y]$, read as “the conditional entropy of $X$ given $Y$,” is defined as
$$H[X|Y] = -\sum_{x}\sum_{y} \mu(X=x,\,Y=y)\,\log \mu(X=x \mid Y=y) \qquad (1)$$
where $\mu(X=x \mid Y=y)$ denotes the conditional probability $\frac{\mu(X=x,\,Y=y)}{\mu(Y=y)}$; $\mu(Y=y)$ is nonzero in the discrete case.
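As a concrete illustration, the following Python sketch evaluates equation (1) for a small joint distribution; the array `joint`, its values, and the choice of base-2 logarithms are assumptions made only for this example.

```python
import numpy as np

# Hypothetical joint pmf mu(X = x, Y = y); rows index values of X, columns values of Y.
joint = np.array([[0.25, 0.25],
                  [0.40, 0.10]])

p_y = joint.sum(axis=0)        # marginal mu(Y = y); nonzero, as in the discrete case
cond = joint / p_y             # conditional pmf mu(X = x | Y = y)

# Equation (1): H[X|Y] = -sum_{x,y} mu(X = x, Y = y) log mu(X = x | Y = y)
mask = joint > 0               # convention: terms with mu(x, y) = 0 contribute nothing
H_X_given_Y = -np.sum(joint[mask] * np.log2(cond[mask]))

print(f"H[X|Y] = {H_X_given_Y:.4f} bits")
```

Any logarithm base may be used; base 2 gives the entropy in bits.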
Discussion
The results for discrete conditional entropy will be assumed to hold for the continuous case unless we indicate otherwise.
With $H[X,Y]$ the joint entropy and $f$ a function, we have the following results:
$$H[X|Y] + H[Y] = H[X,Y] \qquad (2)$$
$$H[X|Y] \leq H[X] \qquad (3)$$
$$H[X|Y] = H[X] \ \text{iff $X$ and $Y$ are independent} \qquad (4)$$
$$H[f(X) \mid X] = 0 \qquad (5)$$
$$H[X|Y] \geq 0 \qquad (6)$$
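To make results (2) and (3) concrete, the sketch below reuses the hypothetical `joint` array from the earlier example and numerically checks the chain rule and that conditioning does not increase entropy; the helper `entropy` and base-2 logarithms are again assumptions made for illustration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (base 2) of a pmf given as an array."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Same hypothetical joint pmf as in the earlier sketch.
joint = np.array([[0.25, 0.25],
                  [0.40, 0.10]])

p_x, p_y = joint.sum(axis=1), joint.sum(axis=0)
mask = joint > 0
H_X_given_Y = -np.sum(joint[mask] * np.log2((joint / p_y)[mask]))

# (2): chain rule  H[X|Y] + H[Y] = H[X,Y]
print(np.isclose(H_X_given_Y + entropy(p_y), entropy(joint)))   # True
# (3): conditioning reduces entropy  H[X|Y] <= H[X]
print(H_X_given_Y <= entropy(p_x) + 1e-12)                      # True
```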
The conditional entropy $H[X|Y]$ may be interpreted as the uncertainty in $X$ given knowledge of $Y$. (Try reading the above equalities and inequalities with this interpretation in mind.)