Is there an efficient, numerically stable algorithm to sample from a distribution for which we know only a certain (large) number of moments?
- How would that even be possible? You can't sample from an under-determined distribution. You can, of course, create distributions which, e.g., have a given mean and variance, but there is no guarantee that the resulting samples would be from the distribution you want. – John Coleman Nov 08 '17 at 21:42
- This paper seems relevant: http://fks.sk/~juro/docs/paper_spie_2.pdf – John Coleman Nov 08 '17 at 21:54
- What if you have 20 moments, rather than 2? 100? 1000? – Frank Nov 08 '17 at 23:37
- Also, Wikipedia says "for the case where X has a continuous probability density function $f(x)$, $M_X(-t)$ is the two-sided Laplace transform of $f(x)$", which gave me hope that this Laplace transform could be inverted. – Frank Nov 08 '17 at 23:39
- Also, "if two distributions have the same moment-generating function, then they are identical at almost all points", which I took to mean that the moments contain about as much information as the original distribution - but I could be wrong here. – Frank Nov 08 '17 at 23:43
- There exist sets of distinctly different distributions having identical moments - of all orders. See https://stats.stackexchange.com/questions/25010/identity-of-moment-generating-functions/25017#25017. – whuber Nov 08 '17 at 23:51
- Just an idea for the univariate case: if we know the moments up to the $n$-th, we may approximate the characteristic function by the Taylor expansion $$\sum_{k=0}^n\frac{(it)^k}{k!}E[X^k],$$ provided the expansion is valid, of course. Then we apply the Gil-Pelaez formula to recover the CDF, and finally do the inverse transform. – Francis Nov 08 '17 at 23:57
- Just realized Luc Devroye's classic *Non-Uniform Random Variate Generation* has an entire section devoted to this, starting at p. 682. – Francis Nov 09 '17 at 00:17
- @whuber - there seems to be quite a good amount of subtlety here :-) – Frank Nov 09 '17 at 02:00
- Low-order moments - even quite a few of them - don't generally pin down a distribution very well. – Glen_b Nov 09 '17 at 02:38
- And high-order moments can be so large that they're numerically very annoying :-( It feels a bit hopeless. – Frank Nov 09 '17 at 04:10
- Interesting question (and discussion)! Is there a specific application or background, or is this just curiosity on your part? – Stephan Kolassa Nov 09 '17 at 07:35
- @StephanKolassa - at work, we need to classify distributions according to shape (for signal detection in security). I wanted to see how much information about a distribution's shape its moments contain, and for that I wanted to write a small program that would sample and plot a distribution given its moments. I would tell it: here are the first 10 moments, show me what the distribution looks like. Not really an application, more curiosity :-) – Frank Nov 09 '17 at 15:11
- Hm. How about simply plotting histograms, or kernel density estimates/beanplots? If you have enough data to reliably estimate high moments, you should have enough data for such beanplots, shouldn't you? – Stephan Kolassa Nov 09 '17 at 15:36
- @StephanKolassa - yes - I'm going to give up on "generating" from the moments, and simply compute the moments from the different distributions we need to classify. We do have tons of data. – Frank Nov 09 '17 at 16:17
- Related: https://stats.stackexchange.com/questions/141652/constructing-a-continuous-distribution-to-match-m-moments?rq=1 – kjetil b halvorsen Jan 02 '21 at 15:38
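Francis's Taylor-plus-Gil-Pelaez idea above can be sketched numerically. A minimal, assumption-laden illustration: the "known" moments below are those of a standard normal (purely for checking the result), and the truncation point `t_max` is a hand-picked guess, needed because the truncated Taylor series for the characteristic function diverges for large $|t|$:

```python
import math
import numpy as np

def taylor_cf(moments, t):
    """Taylor approximation of the characteristic function:
    phi(t) ~= sum_{k=0}^{n} (it)^k / k! * E[X^k], with moments[k] = E[X^k]."""
    return sum((1j * t) ** k / math.factorial(k) * m
               for k, m in enumerate(moments))

def gil_pelaez_cdf(x, moments, t_max=2.0, n_grid=2000):
    """Recover F(x) via the Gil-Pelaez inversion formula:
    F(x) = 1/2 - (1/pi) * int_0^inf Im[e^{-itx} phi(t)] / t dt.
    The integral is truncated at t_max (a hand-tuned assumption here),
    since the truncated Taylor series for phi blows up for large |t|."""
    t = np.linspace(1e-6, t_max, n_grid)          # avoid the t = 0 singularity
    phi = np.array([taylor_cf(moments, ti) for ti in t])
    integrand = np.imag(np.exp(-1j * t * x) * phi) / t
    # simple trapezoidal rule
    integral = np.sum((integrand[1:] + integrand[:-1]) / 2 * np.diff(t))
    return 0.5 - integral / np.pi

# Moments of the standard normal: E[X^k] = 0 for odd k, (k-1)!! for even k
moments = [1.0, 0.0]
for k in range(2, 21):
    moments.append((k - 1) * moments[k - 2])

print(gil_pelaez_cdf(0.0, moments))  # ~0.5 by symmetry
print(gil_pelaez_cdf(1.0, moments))  # roughly Phi(1), i.e. near 0.84
```

Even in this friendly case the truncation of both the Taylor series and the integral limits the accuracy, which illustrates the numerical-stability concern in the question: with high-order moments the terms $(it)^k E[X^k]/k!$ can be enormous before they cancel.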
1 Answer
If you have $k$ moments, one reasonable approach is to solve for the metalog distribution that has $k$ coefficients and those moments. (See https://en.wikipedia.org/wiki/Metalog_distribution#Moments for some formulas). Then the quantile function of that distribution has a particularly simple form in terms of those coefficients, and you can use that to sample the distribution.
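The inverse-transform step can be sketched as follows. The coefficients `a` below are made up for illustration; in practice you would solve for them from the $k$ given moments using the formulas on the Wikipedia page. Only the standard first four metalog terms are shown:

```python
import numpy as np

def metalog_quantile(p, a):
    """Quantile function of a metalog with up to 4 coefficients:
    Q(p) = a1 + a2*L + a3*(p - 1/2)*L + a4*(p - 1/2),  L = ln(p / (1 - p))."""
    L = np.log(p / (1.0 - p))
    terms = [np.ones_like(L), L, (p - 0.5) * L, (p - 0.5)]
    return sum(ai * ti for ai, ti in zip(a, terms))

# Hypothetical coefficients -- in practice, solved from the target moments.
a = [0.0, 1.0, 0.2, 0.1]

# Inverse-transform sampling: feed uniforms through the quantile function.
rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)
samples = metalog_quantile(u, a)
```

Note that not every coefficient vector yields a valid (monotone) quantile function, so the fitted coefficients should be checked for feasibility before sampling.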

– Matt F.