I am trying to understand “higher-order asymptotics”. I have found several texts on likelihood asymptotics, but none of them is easy to read; if you have any nice pointers in this direction, I would be interested. However, my main question follows. The following “roadmap” to higher-order asymptotics seems natural to me, and seems like something to do and understand before looking at likelihood theory, but I can't find anything along these lines:
Consider $T = f(X)$ where $X$ is a random vector with known distribution; say, for simplicity, that $X$ is normal or multinomial (or a concatenation of several independent multinomials). The delta method tells me that $T$ is asymptotically normal, and how to compute its mean and variance (using a linear approximation of $f$). I think one can find a better approximation of the distribution of $T$ (a small numerical sketch follows the list below) by
- computing the first moments of $T$ (using a linear, quadratic, or higher-order approximation of $f$);
- “finding” a distribution with the same first moments, plus some other “parsimony” criterion to ensure uniqueness (“finding” is not well defined: at a minimum, it should be possible to evaluate the distribution numerically).
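Here is a minimal sketch of the first step, on a toy example of my own choosing, $T = \exp(X)$ with $X \sim N(\mu, \sigma^2)$: the second-order Taylor expansion of $f$ already corrects the mean and variance given by the plain (first-order) delta method.

```python
# Sketch of "step 1": approximate the first two moments of T = f(X) by a
# second-order Taylor expansion of f around the mean of X, and compare with
# Monte Carlo. The choice f(x) = exp(x), X ~ N(mu, sigma^2) is only illustrative.
import numpy as np

mu, sigma = 0.5, 0.2          # parameters of X ~ N(mu, sigma^2)
f   = np.exp                  # the transformation T = f(X)
df  = np.exp                  # f'  (here f' = f'' = exp)
d2f = np.exp                  # f''

# First-order (classical delta method): T approx N(f(mu), f'(mu)^2 sigma^2)
mean_1 = f(mu)
var_1  = df(mu) ** 2 * sigma ** 2

# Second-order correction, using E[(X-mu)^2] = sigma^2, E[(X-mu)^3] = 0 and
# E[(X-mu)^4] = 3 sigma^4 for a normal X:
#   E[T]   ~ f(mu) + 0.5 f''(mu) sigma^2
#   Var[T] ~ f'(mu)^2 sigma^2 + 0.5 f''(mu)^2 sigma^4
mean_2 = f(mu) + 0.5 * d2f(mu) * sigma ** 2
var_2  = df(mu) ** 2 * sigma ** 2 + 0.5 * d2f(mu) ** 2 * sigma ** 4

# Monte Carlo check
rng = np.random.default_rng(0)
t = f(rng.normal(mu, sigma, size=1_000_000))
print("mean:", mean_1, mean_2, t.mean())
print("var :", var_1, var_2, t.var())
```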
Is that possible? Do you know any textbooks/lectures/articles going in that direction?
Edit: fg nu gave me some pointers for the second step, which led me to Edgeworth series. A few references (a small sketch follows them):
- Rothenberg (1984), “Approximating the distributions of econometric estimators and test statistics”, Handbook of Econometrics, Vol. 2 (given by fg nu)
- Blinnikov & Moessner (1998), “Expansions for nearly Gaussian distributions” (from the Wikipedia article on the Edgeworth series)
- Chapter XVI of Feller's An Introduction to Probability Theory and Its Applications, Vol. 2
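For the second step, here is a minimal sketch of a one-term Edgeworth correction, continuing my toy example $T = \exp(X)$ from above. The cumulants are estimated from a Monte Carlo sample just for simplicity; they could equally come from the Taylor expansions of the first step, and the skewness-only correction is only the simplest case of the series.

```python
# Sketch of "step 2": given mean, variance and skewness of T, build a density
# with a one-term Edgeworth (skewness) correction and compare it with the
# plain normal approximation on a grid of points.
import numpy as np
from scipy.stats import norm

mu, sigma = 0.5, 0.2
rng = np.random.default_rng(0)
t = np.exp(rng.normal(mu, sigma, size=1_000_000))

# cumulants of T, here estimated from the sample
m, s = t.mean(), t.std()
skew = np.mean(((t - m) / s) ** 3)

def edgeworth_pdf(x):
    """Normal density with a one-term Edgeworth (skewness) correction."""
    z = (x - m) / s
    he3 = z ** 3 - 3 * z                      # third Hermite polynomial He_3
    return norm.pdf(z) / s * (1 + skew / 6 * he3)

# compare the plain normal approximation with the corrected density
x = np.linspace(m - 4 * s, m + 4 * s, 9)
print(np.c_[x, norm.pdf(x, m, s), edgeworth_pdf(x)])
```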
The first step is rather elementary; however, any good pointer is still appreciated.