
I am building (many years later!) on this SO question: https://stackoverflow.com/questions/22461241/tsallis-entropy-for-continuous-variable-in-r

I should make clear that I'm a stats newbie, exploring it out of curiosity, so forgive me if what follows is a question with an obvious answer.

It is my understanding that Tsallis entropy is always non-negative (i.e. $S_q(\rho) \ge 0$).

If so, how can the Tsallis entropy of a continuous variable, computed as `1/(q-1) * (1 - integrate.xy(PDF$x, PDF$y^q))`, be negative?
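For concreteness, here is the same computation sketched in Python (the question's code is R, using `sfsmisc::integrate.xy`; a hand-rolled trapezoid rule stands in for it, and the narrow normal density is just an illustrative choice, not from the original post):

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoid-rule integral of samples y over grid x
    (stands in for R's sfsmisc::integrate.xy)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def tsallis_entropy(x, fx, q):
    """Continuous Tsallis entropy S_q(f) = (1 - int f^q dx) / (q - 1),
    mirroring 1/(q-1) * (1 - integrate.xy(PDF$x, PDF$y^q))."""
    return (1.0 - trapezoid(fx ** q, x)) / (q - 1.0)

# A narrow normal density N(0, 0.1): its values exceed 1 near the mode.
sigma = 0.1
x = np.linspace(-1.0, 1.0, 20001)
fx = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

print(tsallis_entropy(x, fx, q=2))  # comes out negative
```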

Little Code

1 Answer


Why do you expect it to be non-negative in the continuous case? The same question arises for the usual (Shannon) entropy: it is non-negative for discrete variables, but its generalization to continuous variables, differential entropy, does not share this property. The underlying reason is that while probabilities cannot be larger than one, probability densities can be! For an example, take the uniform distribution on the interval $[0, 1/2]$, with probability density $f(x) = 2\cdot \mathbb{1}_{[0,1/2]}(x)$, and calculate its differential entropy: $h(f) = -\int_0^{1/2} 2 \log 2 \, dx = -\log 2 < 0$.

Then calculate its Tsallis entropy with, say, $q=2$: $$ S_2(f)= \frac1{2-1}\left( 1 - \int_0^{1/2} 2^2\; dx\right) = 1-2=-1 $$
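A quick numerical check of both quantities (a Python sketch, not the R of the question):

```python
import numpy as np

def trapezoid(y, x):
    # Simple trapezoid rule; exact here since the integrands are constant.
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# Uniform density f(x) = 2 on [0, 1/2] -- a density larger than 1.
x = np.linspace(0.0, 0.5, 1001)
fx = np.full_like(x, 2.0)

# Differential entropy h(f) = -int f log f dx = -log 2, which is negative.
h = -trapezoid(fx * np.log(fx), x)
print(h)   # ≈ -0.6931

# Tsallis entropy S_2(f) = (1 - int f^2 dx) / (2 - 1) = 1 - 2 = -1.
S2 = (1.0 - trapezoid(fx ** 2, x)) / (2.0 - 1.0)
print(S2)  # ≈ -1.0
```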

kjetil b halvorsen
    I had never heard it put that way: “probability density can exceed $1$”. Could that be playing a role in my two copula entropy questions from last year? https://stats.stackexchange.com/questions/511088/mutual-information-relationship-to-copula-entropy-is-borked https://stats.stackexchange.com/questions/510992/copula-entropy-calculation-is-borked – Dave Jan 01 '22 at 16:35