I want to sample from a distribution with prescribed mean (= 0), standard deviation (= 1), skewness (= 0) and kurtosis. I also want this distribution to be as general as possible, i.e. to have the smallest possible Kullback-Leibler divergence from the uniform distribution (this condition is equivalent to the principle of maximum entropy), just as the normal distribution would be for kurtosis = 3.
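To state the target a bit more formally (writing $f$ for the unknown density and $k$ for the prescribed kurtosis, which is my notation and not part of the original statement), I am after something like

$$\max_f \; -\int f(x)\log f(x)\,dx \quad \text{s.t.} \quad \int f(x)\,dx = 1,\; \int x f(x)\,dx = 0,\; \int x^2 f(x)\,dx = 1,\; \int x^3 f(x)\,dx = 0,\; \int x^4 f(x)\,dx = k.$$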
I know that most probably there is no hope of a closed form for such a distribution in the general case. I am only interested in sampling from it, and I accept reasonable numerical approximations.
I am not overly concerned with efficiency - I can wait about two days to get just 2000 samples.
Years ago I wrote a sort of genetic algorithm to address this problem (a rough sketch in code follows the list):
- Start with a random sample of 2000 values drawn from a uniform distribution; this sample will be called the population below.
- Create many (about 100) variants of the population by first removing a random subsample of - say - 200 values from the population, and then reinserting the same number of random values drawn from the uniform distribution.
- Find the variant whose mean, sd, skewness and kurtosis are as close as possible to the target parameters (the choice of metric should not be critical to this algorithm, since all the conditions are independent of one another).
- Return to step 2 with the population set to the best variant chosen in step 3.
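For concreteness, here is a minimal NumPy sketch of the procedure above; the sampling range of the uniform distribution, the number of iterations and the squared-error metric are assumptions on my part, not fixed parts of the algorithm.

```python
import numpy as np

def moment_error(x, target_kurtosis):
    # Squared distance of the sample's first four standardized moments
    # from the targets (mean 0, sd 1, skewness 0, given kurtosis).
    m, s = x.mean(), x.std()
    z = (x - m) / s
    return m**2 + (s - 1.0)**2 + np.mean(z**3)**2 + (np.mean(z**4) - target_kurtosis)**2

def evolve(target_kurtosis, n=2000, n_variants=100, n_replace=200,
           n_iter=2000, low=-5.0, high=5.0, seed=0):
    rng = np.random.default_rng(seed)
    population = rng.uniform(low, high, size=n)        # step 1: initial population
    for _ in range(n_iter):
        variants = []
        for _ in range(n_variants):                    # step 2: build ~100 variants
            keep = rng.choice(n, size=n - n_replace, replace=False)
            variants.append(np.concatenate([population[keep],
                                            rng.uniform(low, high, size=n_replace)]))
        # step 3: keep the variant whose moments are closest to the targets
        population = min(variants, key=lambda v: moment_error(v, target_kurtosis))
    return population                                  # step 4 loops back to step 2

samples = evolve(target_kurtosis=5.0)
```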
The algorithm is slow, but in the end it yields what I understand to be a good approximation of the maximum entropy distribution.
Is this algorithm correct? Or is there a better method for getting general leptokurtic distributions?