Mutual Information / Entropy Calculation Help
- by Fillip
Hi,
I'm hoping someone can give me some pointers on this entropy problem.
Say X is chosen uniformly at random from the integers 0 to 32 (inclusive), so there are 33 equally likely values.
I calculate the entropy as H(X) = log2(33) ≈ 5.04 bits, since each value of X occurs with probability 1/33.
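(Just to show how I'm getting that number, here's a tiny sketch that sums -p*log2(p) over the 33 equally likely values; the loop is only there to make the calculation explicit:)

#include <math.h>
#include <stdio.h>

int main(void) {
    const int N = 33;                        // the values 0..32 inclusive
    double H = 0.0;
    for (int i = 0; i < N; i++)
        H += -(1.0 / N) * log2(1.0 / N);     // Shannon entropy in bits
    printf("H(X) = %.4f bits\n", H);         // prints 5.0444, i.e. log2(33)
    return 0;
}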
Now, say the following pseudocode executes.
int r = rand(0,1); // a random integer 0 or 1
r = r * 33 + X; // r is now an integer in the range 0-65
How would I work out the mutual information between the two variables r and X?
Mutual information is defined as I(X; Y) = H(X) - H(X|Y), but I don't really understand how to apply the conditional entropy H(X|Y) to this problem.
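In case it helps to pin down what I'm asking, here's a brute-force sketch I put together (plain C, compiled with -lm; the enumeration approach is just my own attempt, using the equivalent double-sum form of mutual information rather than the H(X) - H(X|Y) form). It builds the joint distribution of X and r and evaluates sum over x, r of p(x,r) * log2( p(x,r) / (p(x)*p(r)) ):

#include <math.h>
#include <stdio.h>

#define NX 33   // X takes the values 0..32
#define NR 66   // r = coin*33 + X takes the values 0..65

int main(void) {
    double joint[NX][NR] = {{0}};
    double px[NX] = {0}, pr[NR] = {0};

    // Enumerate every (X, coin) pair; each pair has probability 1/66.
    for (int x = 0; x < NX; x++) {
        for (int coin = 0; coin <= 1; coin++) {
            int r = coin * 33 + x;
            double p = 1.0 / (NX * 2);
            joint[x][r] += p;
            px[x] += p;
            pr[r] += p;
        }
    }

    // I(X; r) = sum over x, r of p(x,r) * log2( p(x,r) / (p(x)*p(r)) )
    double I = 0.0;
    for (int x = 0; x < NX; x++)
        for (int r = 0; r < NR; r++)
            if (joint[x][r] > 0.0)
                I += joint[x][r] * log2(joint[x][r] / (px[x] * pr[r]));

    printf("I(X; r) = %.4f bits\n", I);
    return 0;
}

That at least gives me a number to check against, but I'd still like to understand how to work out H(X|r) by hand.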
Thanks