Answer:

The mutual information [tex]MI[/tex] is given by

[tex]MI=\displaystyle\sum_{x,y}p(x,y)\ln\dfrac{p(x,y)}{p(x)p(y)}[/tex]

First compute the marginal distributions for [tex]X[/tex] and [tex]Y[/tex].


[tex]p(x)=\begin{cases}\frac23&\text{for }x=0\\\frac13&\text{for }x=1\end{cases}[/tex]

[tex]Y[/tex] has the same marginal distribution (replace [tex]x[/tex] with [tex]y[/tex] above).

The joint PMF is supported on the points (0,0), (1,0), and (0,1), each with probability [tex]\frac13[/tex], so these are the only terms in the sum. We get

[tex]MI=p(0,0)\ln\dfrac{p(0,0)}{p_X(0)p_Y(0)}+p(1,0)\ln\dfrac{p(1,0)}{p_X(1)p_Y(0)}+p(0,1)\ln\dfrac{p(0,1)}{p_X(0)p_Y(1)}[/tex]
[tex]MI=\dfrac13\ln\dfrac{\frac13}{\frac23\cdot\frac23}+\dfrac13\ln\dfrac{\frac13}{\frac13\cdot\frac23}+\dfrac13\ln\dfrac{\frac13}{\frac23\cdot\frac13}[/tex]
[tex]MI=\dfrac13\ln\dfrac34+\dfrac23\ln\dfrac32\approx0.174[/tex]
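If you want to double-check the arithmetic, here is a short Python sketch that computes the marginals from the joint PMF and then sums the mutual-information terms over the support (the joint probabilities are the 1/3 values assumed above):

```python
from math import log

# Joint PMF: support points (x, y) -> p(x, y), each 1/3 as in the problem
p = {(0, 0): 1/3, (1, 0): 1/3, (0, 1): 1/3}

# Marginal distributions obtained by summing the joint PMF
px, py = {}, {}
for (x, y), v in p.items():
    px[x] = px.get(x, 0.0) + v
    py[y] = py.get(y, 0.0) + v

# Mutual information with the natural logarithm (nats)
mi = sum(v * log(v / (px[x] * py[y])) for (x, y), v in p.items())
print(round(mi, 3))  # 0.174
```

Replace `log` with `log2` from the same module if your course defines mutual information in bits.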


Be sure to check how mutual information is defined in your text/notes. I used the natural logarithm above.