relative entropy


Let p and q be probability distributions with supports 𝒳 and 𝒴 respectively, where 𝒳 ⊆ 𝒴. The relative entropy or Kullback-Leibler distance between two probability distributions p and q is defined as

\[
D(p\,\|\,q) := \sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)}. \tag{1}
\]

While D(p||q) is often called a distance, it is not a true metric because it is not symmetric and does not satisfy the triangle inequality. However, we do have D(p||q) ≥ 0, with equality iff p = q.
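The discrete definition (1) and the asymmetry just mentioned can be checked numerically. A minimal sketch (the function name and example distributions are illustrative, not part of the entry):

```python
import math

def relative_entropy(p, q):
    """D(p||q) = sum over x of p(x) log(p(x)/q(x)), taken over the support of p.

    p and q are sequences of probabilities over the same alphabet;
    terms with p(x) = 0 contribute 0 by the usual convention.
    """
    return sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]

d_pq = relative_entropy(p, q)  # positive, since p != q
d_qp = relative_entropy(q, p)  # also positive, but a different value:
                               # the "distance" is not symmetric
```

Here D(p||q) ≈ 0.511 while D(q||p) ≈ 0.368, illustrating why D fails to be a metric.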

\[
\begin{aligned}
-D(p\,\|\,q) &= -\sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)} & (2)\\
&= \sum_{x \in \mathcal{X}} p(x) \log \frac{q(x)}{p(x)} & (3)\\
&\le \log\!\left( \sum_{x \in \mathcal{X}} p(x) \frac{q(x)}{p(x)} \right) & (4)\\
&= \log\!\left( \sum_{x \in \mathcal{X}} q(x) \right) & (5)\\
&\le \log\!\left( \sum_{x \in \mathcal{Y}} q(x) \right) & (6)\\
&= 0, & (7)
\end{aligned}
\]

where the first inequality follows from the concavity of log(x) (Jensen's inequality), and the second from extending the sum from the support of p to the full support 𝒴 of q, over which q sums to 1.

Relative entropy also comes in a continuous version, which looks just as one might expect. For continuous distributions with densities f and g, and 𝒮 the support of f, we have

\[
D(f\,\|\,g) := \int_{\mathcal{S}} f \log \frac{f}{g}. \tag{8}
\]
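As a sanity check on the continuous version (8), one can compare a direct numerical integration against the well-known closed form for the relative entropy between two univariate Gaussians, log(σ₂/σ₁) + (σ₁² + (μ₁−μ₂)²)/(2σ₂²) − 1/2. This sketch (the function names and integration bounds are choices made here, not part of the entry) uses a midpoint rule:

```python
import math

def gauss_pdf(x, mu, sigma):
    """Density of a normal distribution with mean mu and std dev sigma."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def kl_numeric(mu1, s1, mu2, s2, lo=-30.0, hi=30.0, n=200_000):
    """Midpoint-rule approximation of the integral in (8) over [lo, hi]."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        f = gauss_pdf(x, mu1, s1)
        g = gauss_pdf(x, mu2, s2)
        if f > 0:  # terms with f = 0 contribute nothing
            total += f * math.log(f / g) * dx
    return total

def kl_closed_form(mu1, s1, mu2, s2):
    """Closed form for D(f||g) with f = N(mu1, s1^2), g = N(mu2, s2^2)."""
    return math.log(s2 / s1) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2 * s2 ** 2) - 0.5

d_num = kl_numeric(0.0, 1.0, 1.0, 2.0)
d_exact = kl_closed_form(0.0, 1.0, 1.0, 2.0)  # log(2) + 2/8 - 1/2
```

The two values agree to high precision, and both are nonnegative, as the discrete argument above leads one to expect for the continuous case as well.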
Title: relative entropy
Canonical name: RelativeEntropy
Date of creation: 2013-03-22 12:19:32
Last modified on: 2013-03-22 12:19:32
Owner: Mathprof (13753)
Last modified by: Mathprof (13753)
Numerical id: 10
Author: Mathprof (13753)
Entry type: Definition
Classification: msc 60E05
Classification: msc 94A17
Synonym: Kullback-Leibler distance
Related topic: Metric
Related topic: ConditionalEntropy
Related topic: MutualInformation
Related topic: ProofOfGaussianMaximizesEntropyForGivenCovariance