Mutual Information / Entropy Calculation Help
Posted by Fillip on Stack Overflow
Published on 2010-05-01T18:37:37Z
Hi,
Hoping someone can give me some pointers with this entropy problem.
Say X is chosen uniformly at random from the integers 0-32 (inclusive), so there are 33 equally likely values.
I calculate the entropy as H(X) = log2(33) ≈ 5.04 bits, since each value of X has equal probability of occurring.
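Writing out the definition for a uniform distribution on 33 values (my own working, just to show where that number comes from):

H(X) = -\sum_{x=0}^{32} \frac{1}{33}\,\log_2\frac{1}{33} = \log_2 33 \approx 5.04 \text{ bits}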
Now, say the following code executes (C-style, with X drawn as above):
int X = rand() % 33;   // X uniform on 0..32 (ignoring modulo bias)
int r = rand() % 2;    // a random bit: 0 or 1
r = r * 33 + X;        // r now encodes both the random bit and X
How would I work out the mutual information between the two variables r and X?
Mutual information is defined as I(X; Y) = H(X) - H(X|Y), but I don't really understand how to apply the conditional entropy H(X|Y) to this problem.
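In case it helps to check an answer numerically, here is a small Monte Carlo sketch I put together (plain C99; the constants and variable names are my own, and I'm assuming X is uniform on 0..32 as above, not anything from a real library for this). It samples (X, r) pairs, tabulates the joint and marginal frequencies, and plugs them into I(X; r) = sum over (x, r) of p(x,r) log2( p(x,r) / (p(x) p(r)) ):

/* Illustrative sketch only, not the original code from the question. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <math.h>

#define NX 33          /* X takes values 0..32 */
#define NR 66          /* r takes values 0..65 */
#define TRIALS 1000000L

int main(void) {
    static long joint[NX][NR];              /* counts of (X, r) pairs */
    long count_x[NX] = {0}, count_r[NR] = {0};

    srand((unsigned) time(NULL));
    for (long t = 0; t < TRIALS; t++) {
        int x   = rand() % NX;              /* X uniform on 0..32 */
        int bit = rand() % 2;               /* the extra random bit */
        int r   = bit * 33 + x;             /* same construction as in the question */
        joint[x][r]++;
        count_x[x]++;
        count_r[r]++;
    }

    /* I(X; r) = sum over (x, r) of p(x,r) * log2( p(x,r) / (p(x) * p(r)) ) */
    double mi = 0.0;
    for (int x = 0; x < NX; x++) {
        for (int r = 0; r < NR; r++) {
            if (joint[x][r] == 0) continue;
            double pxr = (double) joint[x][r] / TRIALS;
            double px  = (double) count_x[x]  / TRIALS;
            double pr  = (double) count_r[r]  / TRIALS;
            mi += pxr * log2(pxr / (px * pr));
        }
    }

    printf("estimated I(X; r) = %.4f bits, log2(33) = %.4f bits\n",
           mi, log2(33.0));
    return 0;
}

If I understand the construction correctly, X can be recovered from r as X = r mod 33, so I would expect H(X|r) = 0 and therefore I(X; r) = H(X) = log2(33) ≈ 5.04 bits, which is what the estimate above should converge to, but I'd like someone to confirm that reasoning.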
Thanks