relative entropy
Let $p$ and $q$ be probability distributions with supports $\mathcal{X}$ and $\mathcal{Y}$ respectively, where $\mathcal{X}\subseteq\mathcal{Y}$. The relative entropy or Kullback-Leibler distance between two probability distributions $p$ and $q$ is defined as
$$D(p\|q) := \sum_{x\in\mathcal{X}} p(x)\log\frac{p(x)}{q(x)}. \qquad (1)$$
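As a concrete illustration of (1), here is a minimal Python sketch; the function name `kl_divergence` and the two example distributions are invented for this illustration, and the natural logarithm is used, so the result is in nats.

```python
import math

def kl_divergence(p, q):
    # D(p||q) as in equation (1), in nats. p and q map outcomes to
    # probabilities; as in the text, the support of p is assumed to lie
    # inside the support of q, so q[x] > 0 whenever p[x] > 0.
    # The guard px > 0 implements the convention 0 * log 0 = 0.
    return sum(px * math.log(px / q[x]) for x, px in p.items() if px > 0)

# Hypothetical example distributions over the outcomes {"a", "b"}.
p = {"a": 0.5, "b": 0.5}
q = {"a": 0.75, "b": 0.25}
print(kl_divergence(p, q))  # ~0.1438 nats
```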
While $D(p\|q)$ is often called a distance, it is not a true metric: it is not symmetric, since in general $D(p\|q)\neq D(q\|p)$, and it does not satisfy the triangle inequality. However, we do have $D(p\|q)\geq 0$, with equality iff $p=q$:
$$
\begin{aligned}
-D(p\|q) &= -\sum_{x\in\mathcal{X}} p(x)\log\frac{p(x)}{q(x)} && (2)\\
&= \sum_{x\in\mathcal{X}} p(x)\log\frac{q(x)}{p(x)} && (3)\\
&\leq \log\sum_{x\in\mathcal{X}} p(x)\,\frac{q(x)}{p(x)} && (4)\\
&= \log\sum_{x\in\mathcal{X}} q(x) && (5)\\
&\leq \log\sum_{y\in\mathcal{Y}} q(y) && (6)\\
&= \log 1 = 0, && (7)
\end{aligned}
$$
where the first inequality follows from the concavity of $\log$ and the second from expanding the sum over the support of $q$ rather than that of $p$.
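The failure of symmetry noted above is easy to verify numerically. Here is a short self-contained check (the distributions are again invented for illustration):

```python
import math

def kl(p, q):
    # D(p||q) in nats; assumes supp(p) is contained in supp(q), as in the text.
    return sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl(p, q))  # ~0.5108: nonnegative, as (2)-(7) guarantee
print(kl(q, p))  # ~0.3681: differs from kl(p, q), so D is not symmetric
```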
Relative entropy also comes in a continuous version, which looks just as one might expect. For continuous distributions $f$ and $g$, with $S$ the support of $f$, we have
$$D(f\|g) := \int_S f \log\frac{f}{g}. \qquad (8)$$
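As a sketch of the continuous version (8), one can integrate numerically for two Gaussian densities and compare against the well-known closed form for the relative entropy of two normals. The parameters below are invented for the example, and `scipy` is assumed to be available.

```python
import math
from scipy.integrate import quad

# Hypothetical Gaussian densities f = N(0, 1) and g = N(1, 4).
mu1, s1 = 0.0, 1.0
mu2, s2 = 1.0, 2.0

def f(x):
    return math.exp(-((x - mu1) ** 2) / (2 * s1**2)) / (s1 * math.sqrt(2 * math.pi))

def g(x):
    return math.exp(-((x - mu2) ** 2) / (2 * s2**2)) / (s2 * math.sqrt(2 * math.pi))

# Equation (8): D(f||g) = integral of f*log(f/g) over the support of f;
# the interval [-20, 20] covers the effective support of f here.
numeric, _ = quad(lambda x: f(x) * math.log(f(x) / g(x)), -20, 20)

# Known closed form for two normals:
# log(s2/s1) + (s1^2 + (mu1 - mu2)^2) / (2 s2^2) - 1/2.
closed = math.log(s2 / s1) + (s1**2 + (mu1 - mu2) ** 2) / (2 * s2**2) - 0.5
print(numeric, closed)  # both ~0.4431 nats
```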
| Title | relative entropy |
| Canonical name | RelativeEntropy |
| Date of creation | 2013-03-22 12:19:32 |
| Last modified on | 2013-03-22 12:19:32 |
| Owner | Mathprof (13753) |
| Last modified by | Mathprof (13753) |
| Numerical id | 10 |
| Author | Mathprof (13753) |
| Entry type | Definition |
| Classification | msc 60E05 |
| Classification | msc 94A17 |
| Synonym | Kullback-Leibler distance |
| Related topic | Metric |
| Related topic | ConditionalEntropy |
| Related topic | MutualInformation |
| Related topic | ProofOfGaussianMaximizesEntropyForGivenCovariance |