I have a very basic doubt. Sorry if this irritates a few of you. I know that the mutual information value should be greater than or equal to 0, but should it be less than 1? Is it bounded by any upper value?
Thanks, Amit.
Yes, it does have an upper bound, but not 1.
The mutual information (in bits) is 1 when two parties (statistically) share one bit of information. However, they can share arbitrarily much information: in particular, if they share 2 bits, the mutual information is 2.
The mutual information is bounded from above by the Shannon entropies of the single-party (marginal) distributions, i.e. $I(X,Y) \leq \min \left[ H(X), H(Y) \right]$.
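If you want to check this bound numerically, here is a minimal NumPy sketch (the helper names `entropy` and `mutual_information` are just mine, for illustration):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zeros ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    """I(X,Y) in bits from a joint probability table pxy, via
    I(X,Y) = H(X) + H(Y) - H(X,Y)."""
    hx = entropy(pxy.sum(axis=1))   # marginal entropy H(X)
    hy = entropy(pxy.sum(axis=0))   # marginal entropy H(Y)
    hxy = entropy(pxy.ravel())      # joint entropy H(X,Y)
    return hx + hy - hxy, hx, hy

# A random joint distribution over a 4 x 6 alphabet.
rng = np.random.default_rng(0)
pxy = rng.random((4, 6))
pxy /= pxy.sum()

i_xy, hx, hy = mutual_information(pxy)
print(f"I(X,Y) = {i_xy:.4f} bits, min(H(X), H(Y)) = {min(hx, hy):.4f} bits")
assert i_xy <= min(hx, hy) + 1e-12  # the bound stated above
```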
It depends on whether the alphabet of interest is finite with a known cardinality $K$, finite with an unknown cardinality, or countably infinite. If you are talking about mutual information proper (the names are easy to confuse: mutual information, information gain, information gain ratio, etc.), then the answer is YES if $K$ is known, since $I(X,Y) \leq \min(H(X),H(Y)) \leq \log K$, and NO if $K$ is unknown or infinite - mutual information is unbounded on a countable alphabet!
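To see the known-$K$ bound being attained (and growing with $K$), here is a small sketch: take $X = Y$ uniform over $K$ symbols, so the joint table is $1/K$ on the diagonal and $I(X,Y) = \log_2 K$ exactly.

```python
import numpy as np

# X = Y uniform over K symbols: joint table is 1/K on the diagonal,
# marginals are uniform, and I(X,Y) = H(X) = log2(K).
for K in (2, 4, 16, 1024):
    pxy = np.eye(K) / K
    px = pxy.sum(axis=1)               # uniform marginal of X
    hx = -np.sum(px * np.log2(px))     # H(X) = H(Y) = log2(K)
    nz = pxy[pxy > 0]
    hxy = -np.sum(nz * np.log2(nz))    # joint entropy H(X,Y)
    i_xy = 2 * hx - hxy                # I = H(X) + H(Y) - H(X,Y)
    print(f"K = {K:5d}: I(X,Y) = {i_xy:9.4f} bits, log2(K) = {np.log2(K):9.4f}")
```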
A caveat on the answer above: the inequality $I(X,Y)\leq \min(H(X),H(Y))$ is correct, but it does not give a fixed upper value, since $H(X)$ and $H(Y)$ may themselves be arbitrarily large. That said, the author of that answer may have been thinking of $K$ being a known integer, in which case $\log K$ is a genuine finite bound.
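To make the unboundedness concrete with the standard example: take $X = Y$ uniform on $\{1,\dots,K\}$. Then
$$ I(X,Y) = H(X) - H(X \mid Y) = \log_2 K - 0 = \log_2 K, $$
which can be made as large as you like by growing $K$.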