
I am trying to compare two one-dimensional distributions. I am using the Kullback-Leibler divergence for this, but it requires both distributions to be of equal length. I am not sure how to make them the same length without distorting the original distributions (e.g., if I pad the smaller distribution with zeros, the probability of the value 0 in that distribution becomes very high).

The probability densities of both distributions are shown in the figure below ("Probability density functions").

Note that N=235 refers to the size of the larger distribution and does not imply that both distributions have 235 values.

Please suggest a way to use Kullback-Leibler divergence for this problem. Input on other methods/tests that can be used for the comparison would be appreciated as well.
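To make the issue concrete, here is a rough sketch in Python of the zero-padding idea and why it distorts the comparison (the arrays, sample sizes, and bin count below are placeholders, not my actual data):

```python
import numpy as np
from scipy.stats import entropy  # entropy(p, q) computes KL(p || q)

rng = np.random.default_rng(0)
sample_a = rng.normal(0.0, 1.0, size=235)   # placeholder for the larger sample
sample_b = rng.normal(0.5, 1.0, size=150)   # placeholder for the smaller sample

# Naive idea: pad the smaller sample with zeros so both have the same length.
padded_b = np.concatenate([sample_b, np.zeros(sample_a.size - sample_b.size)])

# Histogram both on a common set of bins to get comparable probability vectors.
bins = np.histogram_bin_edges(np.concatenate([sample_a, padded_b]), bins=30)
p, _ = np.histogram(sample_a, bins=bins)
q, _ = np.histogram(padded_b, bins=bins)

# The bin containing 0 in q is inflated by the padding, so this KL value
# mostly reflects the artificial zeros rather than the real distributions.
kl = entropy(p + 1e-12, q + 1e-12)
print(kl)
```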

  • Your description is a bit confusing. A [distribution](https://en.wikipedia.org/wiki/Probability_distribution) doesn't have an $n$. Are you using the word distribution to refer to a [sample](https://en.wikipedia.org/wiki/Sample_%28statistics%29), or to the length of a sequence of values where the density has been estimated, or something else? – Glen_b Mar 26 '16 at 00:59
  • @Glen_b Yes, my N represents the length of the sequence of values where the density has been estimated. – Kunal Parmar Mar 26 '16 at 21:35

1 Answer


Why do you want to use KL divergence for this problem? You basically have a two-sample problem and want to test whether the distributions are equal. Depending on the specifics of your problem, which you did not tell us, there are many methods, and many posts on this site discussing them.
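For instance, a nonparametric two-sample test works directly on samples of unequal size, so no padding or length-matching is needed. A minimal sketch in Python (the data below are placeholders; substitute your own samples):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sample_a = rng.normal(0.0, 1.0, size=235)   # placeholder samples of
sample_b = rng.normal(0.3, 1.0, size=150)   # unequal size

# Two-sample Kolmogorov-Smirnov test: compares the empirical CDFs,
# with no equal-length requirement.
ks_stat, ks_p = stats.ks_2samp(sample_a, sample_b)
print(f"KS: D = {ks_stat:.3f}, p = {ks_p:.3g}")

# Mann-Whitney U test: a rank-based alternative sensitive to location shifts.
u_stat, u_p = stats.mannwhitneyu(sample_a, sample_b, alternative="two-sided")
print(f"Mann-Whitney: U = {u_stat:.1f}, p = {u_p:.3g}")
```

Which test is appropriate depends on what kind of difference (location, spread, overall shape) you actually care about.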

kjetil b halvorsen