I am trying to grapple with the following problem.
I have an application that develops empirical distributions. In essence, I end up with a histogram of equally spaced $x$ values, with both a $max$ and a $min$, and the probability for each bucket. What I need to do is 'compress' it for storage (there may be 1,000 to 10,000 buckets) so that the distribution can be recreated with reasonable accuracy.
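To make the setup concrete, here is a toy sketch (NumPy; the sample data and all names are invented for illustration) of the kind of object involved: equally spaced buckets between a min and a max, with one probability per bucket.

```python
import numpy as np

# Toy stand-in for the application's output: an empirical distribution
# stored as equally spaced buckets on [min_x, max_x] with a probability
# per bucket. The normal samples here are purely illustrative.
rng = np.random.default_rng(42)
samples = rng.normal(loc=5.0, scale=2.0, size=50_000)

n_buckets = 1000
min_x, max_x = samples.min(), samples.max()
counts, edges = np.histogram(samples, bins=n_buckets, range=(min_x, max_x))
probs = counts / counts.sum()        # probability mass per bucket

# The full representation is (min_x, max_x, probs): 1,000-10,000 floats,
# which is what needs to be compressed.
print(len(probs), probs.sum())
```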
The obvious choices are:
1. find a Taylor series approximation
2. find a Fourier series approximation
3. find a moment-generating-function approximation
Obviously, with #1 one only needs convergence on the interval between $min$ and $max$. With #2, one would model the distribution as periodic, with one period spanning $[\min(x), \max(x)]$. In both cases it is clear how to recover the distribution, directly from the definitions of the Taylor series and the Fourier series.
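For option #2, a minimal sketch (using NumPy's real FFT; the Gaussian test data, the number of kept coefficients, and all names are illustrative assumptions) of compressing the bucket probabilities to a handful of Fourier coefficients and reconstructing:

```python
import numpy as np

# Hypothetical empirical histogram: 1,000 equally spaced buckets.
rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, 100_000)
counts, edges = np.histogram(samples, bins=1000)
probs = counts / counts.sum()        # bucket probabilities, summing to 1

# Treat the histogram as one period and keep the first K complex
# Fourier coefficients (K is an arbitrary illustrative choice).
K = 20
coeffs = np.fft.rfft(probs)          # full one-sided spectrum (501 values)
compressed = coeffs[:K]              # store only K coefficients (+ min/max)

# Reconstruction: zero-pad the kept coefficients and invert.
padded = np.zeros_like(coeffs)
padded[:K] = compressed
recon = np.fft.irfft(padded, n=len(probs))

# Keeping the DC coefficient preserves total mass exactly; for a smooth
# distribution the pointwise error is small.
print(np.abs(probs - recon).max())
```

Storage drops from 1,000 floats to 20 complex coefficients plus the endpoints; how fast the coefficients decay depends on how smooth the distribution is.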
My question is this: if one went the route of #3, what is the 'inversion' formula that recovers a general functional representation of a distribution when only its first $n$ integral moments are known?
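For context on the storage side of #3, computing the first $n$ raw moments from the bucketed distribution is straightforward; a hedged sketch (the Beta test data, the choice of $n$, and all names are illustrative):

```python
import numpy as np

# Compute the first n raw moments of a bucketed distribution, i.e. the
# quantities one would store under option #3 (purely illustrative setup).
rng = np.random.default_rng(1)
samples = rng.beta(2.0, 5.0, 100_000)
counts, edges = np.histogram(samples, bins=1000)
probs = counts / counts.sum()
centers = 0.5 * (edges[:-1] + edges[1:])   # bucket midpoints

n = 8
moments = np.array([np.sum(probs * centers**k) for k in range(n + 1)])
# moments[0] is 1, moments[1] is the mean, and so on.
print(moments[:3])
```

The open part of the question is the reverse direction: going from `moments` back to a density.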