
I have a very basic doubt; sorry if it irritates a few. I know that the mutual information value should be greater than or equal to 0, but should it be less than 1? Is it bounded above by any value?

Thanks, Amit.


2 Answers


Yes, it does have an upper bound, but not 1.

The mutual information (in bits) is 1 when two parties (statistically) share one bit of information. However, they can share arbitrarily large amounts of data; in particular, if they share 2 bits, the mutual information is 2.

The mutual information is bounded from above by the Shannon entropy of each single party's distribution, i.e. $I(X,Y) \leq \min \left[ H(X), H(Y) \right]$.
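As a sanity check, here is a minimal sketch in Python with numpy (the function names and the toy joint distribution are my own illustration, not from the original answer). It computes $I(X,Y)$ for two parties sharing two fair bits and verifies the entropy bound:

```python
import numpy as np

def entropy(p):
    # Shannon entropy in bits; zero-probability entries contribute nothing.
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    # I(X,Y) = H(X) + H(Y) - H(X,Y), computed from the joint table p(x, y).
    px = joint.sum(axis=1)  # marginal distribution of X
    py = joint.sum(axis=0)  # marginal distribution of Y
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Two parties sharing two fair bits: X = Y, uniform on {00, 01, 10, 11}.
joint = np.eye(4) / 4.0
I = mutual_information(joint)
print(I)  # 2.0 bits -- exceeds 1
print(I <= min(entropy(joint.sum(axis=1)),
               entropy(joint.sum(axis=0))))  # True: the entropy bound holds
```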

Piotr Migdal
  • If the two parties $X, Y$ are binary variables, i.e. each has only two possible outcomes $\{0,1\}$, then the entropies $H(X), H(Y)$ max out at $1$ when $P(X=1)=0.5$ and $P(Y=1)=0.5$. Thus, the maximum mutual information for two binary variables is $1$. – Akseli Palén Jul 04 '19 at 14:43
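A quick check of the comment's claim, for the extreme case $Y = X$ with a fair bit (my own worked example): $H(X) = H(Y) = H(X,Y) = 1$ bit, so $I(X,Y) = H(X) + H(Y) - H(X,Y) = 1 + 1 - 1 = 1$.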

It depends on whether the alphabet of interest is finite with a known cardinality $K$, finite with an unknown cardinality $K$, or countably infinite. If you are talking about mutual information (the names are often confused: mutual information, information gain, information gain ratio, etc.), then the answer is YES if $K$ is known, and NO if $K$ is unknown or infinite: mutual information is unbounded on a countable alphabet!
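To see the unbounded case concretely, here is a small sketch in Python with numpy (the setup where $Y$ is an exact copy of $X$ is my own illustration): when $X$ is uniform on an alphabet of size $K$ and $Y = X$, we get $I(X,Y) = H(X) = \log_2 K$, which grows without limit as $K$ grows.

```python
import numpy as np

# Y is an exact copy of X, with X uniform on an alphabet of size K.
# Then I(X,Y) = H(X) = log2(K): no fixed upper bound as K grows.
for K in (2, 16, 1024, 2**20):
    px = np.full(K, 1.0 / K)       # uniform marginal of X
    h = -np.sum(px * np.log2(px))  # H(X) in bits
    print(K, h)                    # 1.0, 4.0, 10.0, 20.0 bits
```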

The inequality quoted above, $I(X,Y)\leq \min(H(X),H(Y))$, does not by itself give a finite upper bound, since $H(X)$ and $H(Y)$ may themselves be arbitrarily large. That said, the author of the answer above may be assuming that $K$ is a known integer, in which case the bound is finite.