
For a random variable $X$, entropy is calculated as $H(X) = -\sum_i p(x_i) \ln p(x_i)$. Differential entropy (not shown) is its continuous counterpart. Both use log probabilities, which map the original probabilities from the range $[0,1]$ onto the log scale, with range $(-\infty, 0]$.

It is not uncommon to see exponential entropy used instead, $\exp(H(X))$, I think to circumvent a singularity at $H(X)=0$. As we know, taking the exponential of a log undoes the log.

Given that the raw probabilities have already been converted to log probabilities inside the entropy measure, what is the intuition behind transforming the entropy further by undoing those logs with $\exp$?
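To make the two quantities concrete, here is a minimal sketch (my own illustration, not part of the question) computing entropy in nats and its exponential for a small discrete distribution:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in nats; zero-probability terms contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * ln 0 = 0
    return -np.sum(p * np.log(p))

p = [0.5, 0.25, 0.25]
H = entropy(p)
print(H)          # ≈ 1.0397 nats (= 1.5 ln 2)
print(np.exp(H))  # ≈ 2.8284 (= 2**1.5), the exponential entropy
```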

    $\exp(H)$ has an easy interpretation in terms of the (approximate) number of equally common categories. Suppose $k = 1, \dots, K$ categories are equally common; then $H = \sum^K_{k=1} p_k \ln (1/p_k) = \sum^K_{k=1} (1/K) \ln K = \ln K$ and exponentiating that gets you back to $K$. In some fields this is called a numbers equivalent. The interpretation is useful even if probabilities do vary somewhat. More broadly, whenever logarithms appear, it is never surprising that exponentials also have a role. – Nick Cox Aug 20 '20 at 19:21
  • "Given that the raw probabilities have been upgraded to log probabilities" is not a helpful characterisation. I don't know what would be better for your purposes. – Nick Cox Aug 20 '20 at 19:27
  • your example shows how exponential entropy is handy for the equally-weighted probabilities case. what does it do for lower entropy cases (not equally weighted)? – develarist Aug 20 '20 at 19:32
  • 1
    @NickCox this seems to be an answer, isn't it? – Tim Aug 20 '20 at 19:33
  • I'm not challenging the answer, I upvoted it, since the best answers are examples anyway. Just trying to explore the intuition more, in case there are more examples (answers) to throw at me. – develarist Aug 20 '20 at 19:36
  • @Tim Thanks for the compliment, but as now explicit this turns out to be a duplicate -- of a question asked by the OP! – Nick Cox Aug 21 '20 at 11:20
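The "numbers equivalent" interpretation from the comments can be checked numerically. The sketch below (my own, using the same nats convention as above) shows that $\exp(H)$ equals $K$ for a uniform distribution over $K$ categories, and shrinks toward 1 as the distribution becomes more concentrated, which addresses the non-uniform case asked about:

```python
import numpy as np

def exp_entropy(p):
    """Exponential entropy: the 'effective number' of equally common categories."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return np.exp(-np.sum(p * np.log(p)))

print(exp_entropy([0.25] * 4))          # 4.0 — uniform: all 4 categories count fully
print(exp_entropy([0.7, 0.1, 0.1, 0.1]))  # ≈ 2.56 — skewed: fewer effective categories
print(exp_entropy([1.0]))                # 1.0 — degenerate: one effective category
```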

0 Answers