
If I have a distribution $f(x)$ whose support is the whole real line, does the Shannon entropy uniquely characterise $f$? That is, does $H(f) = H(f^*)$ imply $f = f^*$? (The converse is obviously true.)

M.cadirci
  • You have a good answer provided; if you're OK with it, please accept and/or upvote it. And, just a marginal example: let your RV be a constant, e.g. $X=2$ or $X=3$. The distributions are different, but for both the entropy is $0$. – gunes Feb 22 '20 at 21:37
  • Thank you so much for reminding me to upvote. I am happy with both answers. – M.cadirci Feb 22 '20 at 22:41
  • You can also click the gray tick under the arrow in the answer to accept it. – gunes Feb 22 '20 at 22:51

1 Answer


The answer is in the negative. For any real number $a$ define the function

$$f_a(x) = f(x-a).$$

It is clear that when $f$ is a distribution, so is $f_a$; that when $f$ is supported on the whole real line, so is $f_a$; and that $f$ and $f_a$ have equal entropy, because the substitution $u = x-a$ in $H(f_a) = -\int f(x-a)\log f(x-a)\,dx$ yields $H(f)$. For $a\ne 0$, however, $f = f_a$ is impossible: otherwise $f$ would be periodic with period $a$, and its total probability would then be either zero or infinite, which no probability distribution allows.
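
As a quick numerical check (not part of the argument above): the sketch below estimates the differential entropy of a density and of its translate by integrating $-f\log f$, using a standard normal for $f$ and the shift $a = 3$ purely as illustrative choices, and confirms the two values agree even though the densities differ.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def differential_entropy(pdf):
    """Estimate H(f) = -∫ f(x) log f(x) dx over the real line."""
    integrand = lambda x: -pdf(x) * np.log(pdf(x)) if pdf(x) > 0 else 0.0
    value, _ = quad(integrand, -np.inf, np.inf)
    return value

a = 3.0                           # any nonzero shift works
f = norm(loc=0.0, scale=1.0).pdf  # illustrative f, supported on all of R
f_a = lambda x: f(x - a)          # the translate f_a(x) = f(x - a)

print(differential_entropy(f))    # ≈ 0.5 * log(2πe) ≈ 1.4189
print(differential_entropy(f_a))  # same entropy, yet f_a ≠ f
```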

whuber