entropy (Shannon entropy) For a random variable X that takes k values with probabilities p1, p2, …, pk, the entropy of X is given by H(X) = −(p1 log2 p1 + p2 log2 p2 + … + pk log2 pk) = −Σi pi log2 pi. This is the expected value of the information obtained when sampling X. It is maximal when the probability distribution is uniform. See Shannon's theorem.
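As a sketch, the sum in the definition can be computed directly; the function name `entropy` is illustrative, and zero-probability terms are skipped since p log2 p → 0 as p → 0:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits.

    probs: the probabilities p1, ..., pk of the k values of X.
    Zero-probability outcomes contribute nothing and are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The uniform distribution maximizes entropy: a fair coin yields 1 bit per sample.
print(entropy([0.5, 0.5]))  # → 1.0
# A biased coin is less informative per sample.
print(entropy([0.9, 0.1]))  # ≈ 0.469
```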