
derivation of mutual information


The maximum likelihood estimator for mutual information is identical (up to a scale factor) to the generalized log-likelihood ratio for multinomials and closely related to Pearson's $\chi^2$ test. This implies that the distribution of observed values of mutual information, computed using maximum likelihood estimates for the probabilities, is $\chi^2$ distributed up to that scaling factor.

In particular, suppose we sample each of $X$ and $Y$ and combine the samples to form $N$ tuples drawn from $X \times Y$. Define $T(x,y)$ to be the total number of times the tuple $(x,y)$ was observed. Further, define $T(x,*)$ to be the number of times a tuple beginning with $x$ was observed, and $T(*,y)$ to be the number of times a tuple ending with $y$ was observed. Clearly, $T(*,*)$ is just $N$, the number of tuples in the sample. By definition, the generalized log-likelihood ratio test of independence for $X$ and $Y$ (based on the sample of tuples) is

$$-2\log\lambda = 2\sum_{x,y} T(x,y)\,\log\frac{\pi_{x\mid y}}{\mu_x}$$

where

$$\pi_{x\mid y} = \frac{T(x,y)}{\sum_x T(x,y)}$$

and

$$\mu_x = \frac{T(x,*)}{T(*,*)}$$

This allows the log-likelihood ratio to be expressed in terms of row and column sums,

$$-2\log\lambda = 2\sum_{x,y} T(x,y)\,\log\frac{T(x,y)\,T(*,*)}{T(x,*)\,T(*,y)}$$
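As a sanity check, the row-and-column-sum form of $-2\log\lambda$ can be evaluated directly with numpy. This is a minimal sketch; the contingency table below is made up for illustration, and the variable names simply mirror the notation $T(x,y)$, $T(x,*)$, $T(*,y)$:

```python
import numpy as np

# Hypothetical 2x3 contingency table of counts T(x, y) (illustrative only).
T = np.array([[10.0, 20.0, 30.0],
              [15.0,  5.0, 25.0]])

T_x = T.sum(axis=1, keepdims=True)   # row sums      T(x, *)
T_y = T.sum(axis=0, keepdims=True)   # column sums   T(*, y)
N = T.sum()                          # T(*, *) = N

# -2 log(lambda) = 2 * sum_xy T(x,y) * log( T(x,y) T(*,*) / (T(x,*) T(*,y)) ).
# Cells with T(x,y) = 0 contribute 0 to the sum (0 log 0 = 0 by convention),
# which the ratio = 1 placeholder below implements.
ratio = np.where(T > 0, T * N / (T_x * T_y), 1.0)
G = 2.0 * np.sum(T * np.log(ratio))
print(G)
```

Because $-2\log\lambda$ is $2N$ times a Kullback-Leibler divergence, the result is always non-negative, and it is zero exactly when the table factors into its marginals.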

This reduces to the following expression in terms of the maximum likelihood estimates of the cell, row, and column probabilities, $\pi_{xy} = T(x,y)/T(*,*)$, $\mu_{x*} = T(x,*)/T(*,*)$, and $\mu_{*y} = T(*,y)/T(*,*)$:

$$-2\log\lambda = 2\sum_{x,y} T(x,y)\,\log\frac{\pi_{xy}}{\mu_{*y}\,\mu_{x*}}$$

This can be rearranged into

$$-2\log\lambda = 2N\left[\sum_{x,y}\pi_{xy}\log\pi_{xy} \;-\; \sum_{x}\mu_{x*}\log\mu_{x*} \;-\; \sum_{y}\mu_{*y}\log\mu_{*y}\right] = 2N\,\hat{I}(X;Y)$$

where the hat indicates a maximum likelihood estimate of $I(X;Y)$.
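The identity $-2\log\lambda = 2N\,\hat{I}(X;Y)$ can be checked numerically. The sketch below uses a made-up $2\times 2$ table (chosen with no empty cells so every logarithm is defined), computing $\hat{I}$ from the plug-in entropies on one side and $-2\log\lambda$ from the count form on the other:

```python
import numpy as np

# Made-up contingency table of counts; names follow the derivation's notation.
T = np.array([[12.0,  8.0],
              [ 6.0, 24.0]])
N = T.sum()

# Plug-in (maximum likelihood) probability estimates.
pi = T / N              # cell probabilities   pi_xy
mu_x = pi.sum(axis=1)   # row marginals        mu_x*
mu_y = pi.sum(axis=0)   # column marginals     mu_*y

def plogp(p):
    """sum of p log p over the strictly positive entries."""
    p = p[p > 0]
    return np.sum(p * np.log(p))

# I_hat = sum pi log pi - sum mu_x* log mu_x* - sum mu_*y log mu_*y
I_hat = plogp(pi.ravel()) - plogp(mu_x) - plogp(mu_y)

# Direct evaluation of -2 log(lambda) from the row/column-sum form.
G = 2.0 * np.sum(T * np.log(T * N / np.outer(T.sum(axis=1), T.sum(axis=0))))
print(I_hat, G)
```

Since $T(x,y) = N\,\pi_{xy}$, the two quantities agree exactly (not just asymptotically): the rearrangement above is an algebraic identity.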

This also gives the asymptotic distribution of $\hat{I}(X;Y)$: since $2N\,\hat{I}(X;Y) = -2\log\lambda$ is asymptotically a $\chi^2$ deviate, $\hat{I}(X;Y)$ is asymptotically distributed as $1/(2N)$ times a $\chi^2$ deviate.
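This asymptotic claim can be illustrated with a small Monte Carlo experiment (a sketch under assumed simulation settings, not part of the derivation): sampling independent $X$ and $Y$, the statistic $2N\,\hat{I}$ should behave like a $\chi^2$ deviate with $(r-1)(c-1)$ degrees of freedom, so for a $2\times 2$ table its mean should be close to 1:

```python
import numpy as np

rng = np.random.default_rng(0)
N, trials = 500, 2000   # arbitrary simulation settings

def mle_mi(T):
    """Plug-in mutual information estimate from a table of counts."""
    pi = T / T.sum()
    mu_x = pi.sum(axis=1, keepdims=True)
    mu_y = pi.sum(axis=0, keepdims=True)
    # Zero cells contribute 0 (0 log 0 = 0 by convention).
    ratio = np.where(pi > 0, pi / (mu_x * mu_y), 1.0)
    return np.sum(pi * np.log(ratio))

stats = []
for _ in range(trials):
    # Sample N independent (x, y) pairs, each coordinate uniform on {0, 1}.
    counts = np.zeros((2, 2))
    x = rng.integers(0, 2, size=N)
    y = rng.integers(0, 2, size=N)
    np.add.at(counts, (x, y), 1)
    stats.append(2 * N * mle_mi(counts))

print(np.mean(stats))  # should be close to 1, the chi^2_1 mean
```

Note that $\hat{I}$ itself is always non-negative, so under independence (where the true $I(X;Y)=0$) the plug-in estimate is biased upward by roughly the $\chi^2$ mean divided by $2N$.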
