
mutual information


Let (Ω, ℱ, μ) be a discrete probability space, and let X and Y be discrete random variables on Ω.

The mutual information I[X;Y], read as “the mutual information of X and Y,” is defined as

I[X;Y] = \sum_{x \in \Omega} \sum_{y \in \Omega} \mu(X=x, Y=y) \log \frac{\mu(X=x, Y=y)}{\mu(X=x)\,\mu(Y=y)} = D(\mu(x,y) \,\|\, \mu(x)\mu(y)),

where D denotes the relative entropy.

Mutual information, or just information, is measured in bits if the logarithm is to the base 2, and in “nats” when using the natural logarithm.
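As a concrete illustration (not part of the original entry), the defining sum can be evaluated directly for a small joint distribution. The distribution values and the function name `mutual_information` below are assumptions made for this sketch; the `base` argument switches between bits and nats as described above.

```python
import math

# Hypothetical joint distribution of two binary random variables X and Y,
# stored as {(x, y): probability}; the values are chosen only for illustration.
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

def mutual_information(p_xy, base=2.0):
    """I[X;Y] = sum over (x, y) of p(x,y) * log( p(x,y) / (p(x) p(y)) )."""
    # Marginalize the joint distribution to get p(x) and p(y).
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    # Terms with p(x, y) = 0 contribute nothing (0 * log 0 is taken as 0).
    return sum(
        p * math.log(p / (p_x[x] * p_y[y]), base)
        for (x, y), p in p_xy.items() if p > 0
    )

print(mutual_information(p_xy))          # base-2 log: result in bits (≈ 0.278)
print(mutual_information(p_xy, math.e))  # natural log: result in nats
```

Changing the logarithm base only rescales the result: the value in nats is the value in bits times ln 2.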

Discussion

The most obvious characteristic of mutual information is that it depends on both X and Y. There is no information in a vacuum—information is always about something. In this case, I[X;Y] is the information in X about Y. As its name suggests, mutual information is symmetric, I[X;Y]=I[Y;X], so any information X carries about Y, Y also carries about X.

The definition in terms of relative entropy gives a useful interpretation of I[X;Y] as a kind of “distance” between the joint distribution μ(x,y) and the product distribution μ(x)μ(y). Recall, however, that relative entropy is not a true distance, so this is just a conceptual tool. However, it does capture another intuitive notion of information. Remember that for X, Y independent, μ(x,y)=μ(x)μ(y). Thus the relative entropy “distance” goes to zero, and we have I[X;Y]=0 as one would expect for independent random variables.
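The independence case can be checked numerically. In the sketch below (the distributions are assumptions for the example), the joint distribution is built as an exact product, so every log term in the definition is log 1 = 0 and the sum vanishes:

```python
import math

# X and Y independent, each uniform on {0, 1}.
p_x = {0: 0.5, 1: 0.5}
p_y = {0: 0.5, 1: 0.5}

# Product distribution: mu(x, y) = mu(x) * mu(y) by construction.
p_xy = {(x, y): px * py for x, px in p_x.items() for y, py in p_y.items()}

# Every term is p * log2(p / (p(x) p(y))) = p * log2(1) = 0.
I = sum(
    p * math.log2(p / (p_x[x] * p_y[y]))
    for (x, y), p in p_xy.items() if p > 0
)
print(I)  # 0.0
```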

A number of useful expressions, most apparent from the definition, relate mutual information to the entropy H:

0 \le I[X;Y] \le H[X]   (1)
I[X;Y] = H[X] - H[X|Y]   (2)
I[X;Y] = H[X] + H[Y] - H[X,Y]   (3)
I[X;X] = H[X]   (4)

Recall that the entropy H[X] quantifies our uncertainty about X. The last line justifies the description of entropy as “self-information.”
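Identities (2)–(4) can be verified numerically for a small example. The sketch below is an illustration, not part of the original entry; the joint distribution is a hypothetical choice, and H[X|Y] is computed via the standard chain rule H[X|Y] = H[X,Y] - H[Y]:

```python
import math

def H(p):
    """Shannon entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(v * math.log2(v) for v in p.values() if v > 0)

# Hypothetical joint distribution p(x, y) and its marginals.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_x = {0: 0.5, 1: 0.5}
p_y = {0: 0.5, 1: 0.5}

# Identity (3): I[X;Y] = H[X] + H[Y] - H[X,Y].
I = H(p_x) + H(p_y) - H(p_xy)

# Identity (1): mutual information lies between 0 and H[X].
assert 0.0 <= I <= H(p_x)

# Identity (2): with H[X|Y] = H[X,Y] - H[Y] (chain rule),
# I[X;Y] = H[X] - H[X|Y] agrees with (3).
H_X_given_Y = H(p_xy) - H(p_y)
assert abs(I - (H(p_x) - H_X_given_Y)) < 1e-12

# Identity (4): the joint distribution of (X, X) is supported on the
# diagonal, so I[X;X] reduces to H[X] ("self-information").
p_xx = {(x, x): v for x, v in p_x.items()}
I_XX = H(p_x) + H(p_x) - H(p_xx)
assert abs(I_XX - H(p_x)) < 1e-12
```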

Historical Notes

Mutual information, or simply information, was introduced by Shannon in his landmark 1948 paper “A Mathematical Theory of Communication.”

Title: mutual information
Canonical name: MutualInformation
Date of creation: 2013-03-22 12:37:35
Last modified on: 2013-03-22 12:37:35
Owner: drummond (72)
Last modified by: drummond (72)
Numerical id: 4
Author: drummond (72)
Entry type: Definition
Classification: msc 94A17
Synonym: information
Related topic: RelativeEntropy
Related topic: Entropy
Related topic: ShannonsTheoremEntropy
Related topic: DynamicStream
Defines: information
Defines: mutual information