
I know that $\sum_x P(x) \log \left( \frac{P(x)}{Q(x)} \right)$ is the KL divergence. I'd like to know if there is a name for $\sum_x P(x) \frac{P(x)}{Q(x)}$ (no log), but I couldn't find one.

Any pointers?

Thanks!

Tal Galili
  • Rényi divergence (https://en.wikipedia.org/wiki/R%C3%A9nyi_entropy#R%C3%A9nyi_divergence) for $\alpha = 2$? – Simone Feb 05 '21 at 07:01
  • Thanks, it still seems to have a log in it. – Tal Galili Feb 05 '21 at 12:23
  • Sure, true, and the chi-squared answer below is a better one. Note, though, that the logarithm is outside the summation, and the logarithm is a monotonic function, so you could say that your formula is $F = \exp(D_2)$, where $D_2$ is the Rényi divergence of order 2 (see the sketch below). In any case, when you set $\alpha$ to 2 you get things related to chi-squared and Gini, for example. – Simone Feb 05 '21 at 14:00
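
For concreteness, here is a quick numerical sketch of Simone's point. It assumes NumPy, natural logs for $D_2$, and two made-up example distributions `p` and `q` (neither comes from the thread):

```python
# Sketch: the no-log sum F equals exp(D_2), where D_2 is the
# Renyi divergence of order 2 (natural log, so D_2 is in nats).
# p and q are made-up example distributions.
import numpy as np

p = np.array([0.2, 0.5, 0.3])  # example P
q = np.array([0.4, 0.4, 0.2])  # example Q

F = np.sum(p**2 / q)           # the sum from the question, sum_x P^2(x)/Q(x)
D2 = np.log(np.sum(p**2 / q))  # Renyi divergence of order alpha = 2

print(F, np.exp(D2))           # both print 1.175
```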

1 Answer


It is basically $\chi^2(P,Q)+1$, where the chi-squared divergence between two distributions is defined as $\chi^2(P,Q)=\sum_x \frac{(P(x)-Q(x))^2}{Q(x)}=\sum_x \frac{P^2(x)}{Q(x)}-1$; the second equality follows by expanding the square and using $\sum_x P(x)=\sum_x Q(x)=1$.

Note that it is not zero when $P=Q$ (it equals 1), so unlike KL, where $\log 1 = 0$ makes the divergence vanish at $P=Q$, it can't quite be called a distance.
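
A minimal numerical check of the identity and of the $P=Q$ behaviour, assuming NumPy and two arbitrary example distributions (not from the original post):

```python
# Check: sum_x P^2(x)/Q(x) = chi^2(P, Q) + 1, and the sum is 1 (not 0)
# when P = Q. p and q are arbitrary example distributions.
import numpy as np

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])

lhs = np.sum(p**2 / q)         # the no-log sum from the question
chi2 = np.sum((p - q)**2 / q)  # chi-squared divergence chi^2(P, Q)
print(lhs, chi2 + 1.0)         # equal (1.175 and 1.175)

print(np.sum(p**2 / p))        # P = Q gives 1.0, while chi^2(P, P) = 0
```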

user43170
  • Thanks, that's very helpful. I agree it won't be 0 if $P=Q$, but the original chi-squared divergence will be, since $\sum_x P(x) = 1$. I'm not sure whether it satisfies the other conditions of a distance, but that's not critical for my needs. Thanks again! – Tal Galili Feb 07 '21 at 10:03