
What is an asymptotic, as $n\rightarrow\infty$, of the expectation $e(r:n)$ of the $r$'th order statistic of the standard normal distribution, which is approximated by $$e(r:n) \approx \Phi^{-1}\Big(\frac{r-\alpha}{n-2\alpha+1}\Big)?$$

By an asymptotic, it is usually understood that the sought-after function is "simpler" and more "elementary" than the function in question: typically an algebraic function, or a transcendental function such as the exponential, logarithm, or a trigonometric function.

It is clear that any such asymptotic has to approach $-\infty$ more slowly than $-\sqrt{\ln m}$, where $m:=\frac{n-2\alpha+1}{r-\alpha}$ and $m\rightarrow\infty$ (this upper bound is itself a composite of the "elementary" logarithm and square-root functions).
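As a quick numerical sanity check (not part of the original question), one can tabulate the quantile approximation above against the standard normal tail-quantile rate $-\sqrt{2\ln m}$. The choices $\alpha=3/8$ (Blom's constant) and $r=1$ below are illustrative assumptions, and only the Python standard library is used:

```python
# Hedged sketch: compare e(r:n) ~ Phi^{-1}((r - a)/(n - 2a + 1)) with the
# rate -sqrt(2 ln m), m = (n - 2a + 1)/(r - a).  a = 3/8 (Blom) and r = 1
# (smallest order statistic) are illustrative choices, not from the question.
from math import log, sqrt
from statistics import NormalDist

inv_phi = NormalDist().inv_cdf          # standard normal quantile function

a, r = 3 / 8, 1
ratios = []
for n in (10**2, 10**4, 10**6, 10**8):
    m = (n - 2 * a + 1) / (r - a)
    e_rn = inv_phi((r - a) / (n - 2 * a + 1))   # = Phi^{-1}(1/m) < 0
    ratios.append(e_rn / -sqrt(2 * log(m)))
    print(f"n={n:>9}: e(1:n) ~ {e_rn:8.4f}, ratio to -sqrt(2 ln m) = {ratios[-1]:.4f}")
```

The ratios creep toward $1$ very slowly, which is consistent with the logarithmic corrections derived in the answer below.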

Hans
  • Since the empirical distribution function converges uniformly with probability 1 to the underlying distribution, it follows that the expectations of the order statistics converge to the true quantiles of the distribution. https://en.wikipedia.org/wiki/Empirical_distribution_function helps out here. – jbowman Mar 22 '18 at 20:06
  • @jbowman: I do not understand how what you are saying relates to my question. I am searching for an asymptotic formula for $e(r:n)$ as a function of $n$. – Hans Mar 22 '18 at 20:17
  • Did you try looking at this question https://stats.stackexchange.com/questions/9001/approximate-order-statistics-for-normal-random-variables?noredirect=1&lq=1, which seems essentially identical? ... also, you should explicitly ask for a formula that's a function of $n$, just to clarify, otherwise people like me might interpret "asymptotics" as meaning "what does it converge to and how (in probability...)". – jbowman Mar 22 '18 at 20:29
  • @jbowman: That link you cited is in my question in the first place to explain the motivation of the question. I do not see how the asymptotics could be confused when there is no probability mentioned. Probabilistic convergence is but one special subset of convergence. – Hans Mar 22 '18 at 20:36
  • I believe you might not have formulated the question you would like to ask, because it has a trivial answer: if you fix $r$ and let $n$ grow, as your notation indicates, then (obviously) $e$ diverges to $-\infty$ because its argument approaches $0$ from above. – whuber Mar 22 '18 at 21:09
  • @whuber: Well, I am not asking for the LIMIT but an ASYMPTOTICS. An asymptotics is defined as in this wikipedia article https://en.wikipedia.org/wiki/Asymptotic_analysis as a function the ratio of which over the original function approaches $1$ when the variable of concern approaches some value. So my question is well defined. – Hans Mar 22 '18 at 21:17
  • Yes, it's well defined--but even in the sense of asymptotics it's trivial. The argument is well approximated by $r/n$, whence $e(r:n)$ behaves like the Normal percentage point function $\Phi^{-1}$. (For a rigorous treatment, apply Taylor's Theorem with remainder with an expansion at $-\infty.$) If you're looking for some approximation of a particular form, then you need to specify it. – whuber Mar 22 '18 at 22:21
  • @whuber: Yes, the definition allows for the function to be itself $\Phi^{-1}$ --- your function is just that --- and the asymptotics is not unique either. I am well aware of these and that is why I write "an" instead of "the" asymptotics. It is usually understood that when asked for an asymptotics, it is some function that is "simpler" or more "elementary" than the original function. The Taylor theorem does not work here, since the $\Phi$ has an essential singularity around $\infty$. Try it. I have already added an upper bound for the speed of the function approaching $-\infty$. – Hans Mar 22 '18 at 22:44
  • @whuber: By "simple" and "elementary", I mean some algebraic or "simple" transcendental functions like exponential, logarithmic and trigonometric functions. I can add these descriptions in the question. – Hans Mar 22 '18 at 22:47
  • Yes, of course you're right: $\Phi$ has an essential singularity. But as you suggest earlier, its *logarithm* does not. After all, Mills' ratio suggests its log ought to be quadratic. Perhaps, then, your question is answered at https://stats.stackexchange.com/a/7206/919? – whuber Mar 23 '18 at 13:15
  • @whuber: Thank you for the link. It does not answer my question, but does provide an ingredient. In fact, I myself have derived that relationship by integration by parts without knowing the quoted result. That was how I obtained the upper bound in my question in the first place. In fact, I have already solved the problem, with this question which I solved myself yesterday https://math.stackexchange.com/a/2704405/64809 which confirmed my upper bound and provided finer bounds. I just have not had time to put the two pieces together and write up the complete answer here yet. I will do so soon. – Hans Mar 23 '18 at 16:24
  • @whuber: I have finished the answer. Check it out if you would like. – Hans Mar 26 '18 at 07:25

1 Answer


For convenience's sake, we make everything positive, so we look at the $r$'th largest random variable and work with the tail $1-\Phi$ instead of $\Phi$. Integration by parts gives $$1-\Phi(x)=\frac1{\sqrt{2\pi}}\int_x^\infty e^{-\frac{t^2}2}\,dt=\frac1{\sqrt{2\pi}}\bigg(\frac{e^{-\frac{x^2}2}}x-\int_x^\infty \frac{e^{-\frac{t^2}2}}{t^3}\,d\Big(\frac{t^2}2\Big)\bigg) \tag1$$ as $x\rightarrow\infty$.

A second integration by parts shows $$0<\int_x^\infty \frac{e^{-\frac{t^2}2}}{t^3}\,d\Big(\frac{t^2}2\Big)<\frac{e^{-\frac{x^2}2}}{x^3},$$ so $$\frac1{\sqrt{2\pi}\,x}\Big(1-\frac1{x^2}\Big)<e^{\frac{x^2}2}(1-\Phi(x))<\frac1{\sqrt{2\pi}\,x}. \tag2$$ Fix a small positive $\delta$; for large enough $x$ we have $\frac1{x^2}<\delta$, and therefore $$\frac{1-\delta}{\sqrt{2\pi}\,x}<e^{\frac{x^2}2}(1-\Phi(x))<\frac1{\sqrt{2\pi}\,x}. \tag3$$ Now set $\frac1y:=1-\Phi(x)$; as $y\rightarrow\infty$ we have $x\rightarrow\infty$, and in particular $x>1$ for large enough $y$.

Taking logarithms of (3) and solving for $x$, $$\sqrt{2\Big(\ln y-\ln x-\tfrac12\ln(2\pi)+\ln(1-\delta)\Big)}<x<\sqrt{2\Big(\ln y-\ln x-\tfrac12\ln(2\pi)\Big)}<\sqrt{2\ln y}. \tag4$$ Taking the logarithm of the rightmost bound $x<\sqrt{2\ln y}$ gives $$\ln x<\frac12(\ln\ln y+\ln 2).$$ Substituting this back into the left inequality of (4) yields $$\sqrt{2\ln y-\ln\ln y+2\ln\frac{1-\delta}{2\sqrt\pi}}<x<\sqrt{2\ln y}.$$ Then substitute the left inequality above back into the right inequality of (4).
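The two-sided bound produced by the integration by parts is the classical Mills-ratio bound $\frac{\phi(x)}{x}\big(1-\frac1{x^2}\big)<1-\Phi(x)<\frac{\phi(x)}{x}$, with $\phi$ the standard normal density. A quick numerical check (a sketch using only the Python standard library, not part of the original answer):

```python
# Sketch: verify the Mills-ratio bounds
#   phi(x)/x * (1 - 1/x^2)  <  1 - Phi(x)  <  phi(x)/x
# at a few points.  x is kept <= 6 so that 1 - cdf(x) is still computed
# accurately in double precision.
from math import exp, pi, sqrt
from statistics import NormalDist

nd = NormalDist()
for x in (1.5, 3.0, 4.5, 6.0):
    tail = 1 - nd.cdf(x)                      # 1 - Phi(x)
    phi = exp(-x * x / 2) / sqrt(2 * pi)      # standard normal density
    lower, upper = phi / x * (1 - 1 / x**2), phi / x
    assert lower < tail < upper
    print(f"x={x:4.1f}: {lower:.3e} < {tail:.3e} < {upper:.3e}")
```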

We can continue this alternating substitution of the inequalities ad infinitum.
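As an illustration (not from the original answer), the alternating substitution can be run numerically. With $\delta\rightarrow0$ it amounts to iterating $x\mapsto\sqrt{2\big(\ln y-\ln x-\tfrac12\ln(2\pi)\big)}$, the fixed-point form of the Mills-ratio approximation $\frac{e^{-x^2/2}}{\sqrt{2\pi}\,x}=\frac1y$, starting from the crude bound $x_0=\sqrt{2\ln y}$:

```python
# Sketch of the alternating substitution as a fixed-point iteration for the
# solution of exp(-x^2/2)/(sqrt(2 pi) x) = 1/y, starting at x_0 = sqrt(2 ln y).
from math import log, pi, sqrt
from statistics import NormalDist

y = 1e12                      # tail level: 1 - Phi(x) = 1/y
x = sqrt(2 * log(y))          # crude starting bound from (4)
for k in range(6):
    x = sqrt(2 * (log(y) - log(x) - 0.5 * log(2 * pi)))
    print(f"iteration {k}: x = {x:.6f}")

x_exact = NormalDist().inv_cdf(1 - 1 / y)   # true quantile, for comparison
print(f"exact Phi^(-1)(1 - 1/y) = {x_exact:.6f}")
```

A handful of iterations already agrees with the exact quantile to a few parts in a thousand; the residual gap comes from the $\big(1-\frac1{x^2}\big)$ factor dropped in the approximation.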


One can also use the Lambert W function to give a more direct answer.
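One way to make the Lambert-W route concrete (a sketch, not necessarily the original author's derivation): squaring the Mills-ratio relation $\frac{e^{-x^2/2}}{\sqrt{2\pi}\,x}=\frac1y$ gives $x^2e^{x^2}=\frac{y^2}{2\pi}$, hence $x=\sqrt{W_0\big(\frac{y^2}{2\pi}\big)}$ on the principal branch. Below, a small Newton iteration stands in for `scipy.special.lambertw` to keep the sketch dependency-free:

```python
# Sketch: x = sqrt(W_0(y^2 / (2 pi))) from the squared Mills-ratio relation
# x^2 exp(x^2) = y^2 / (2 pi).  W_0 is computed by Newton's method on the
# equivalent equation w + ln w = ln z, which is well behaved for large z.
from math import log, pi, sqrt
from statistics import NormalDist

def lambert_w0(z: float, iters: int = 20) -> float:
    """Principal branch W_0(z) for large z > 0, via Newton on w + ln w = ln z."""
    w = log(z) - log(log(z))              # standard asymptotic initial guess
    for _ in range(iters):
        w -= (w + log(w) - log(z)) / (1 + 1 / w)
    return w

y = 1e12
x_w = sqrt(lambert_w0(y**2 / (2 * pi)))      # Lambert-W expression for x
x_exact = NormalDist().inv_cdf(1 - 1 / y)    # true quantile, for comparison
print(f"Lambert-W value: {x_w:.6f}, exact quantile: {x_exact:.6f}")
```

Expanding $W_0(z)=\ln z-\ln\ln z+o(1)$ recovers the same $\sqrt{2\ln y-\ln\ln y+\cdots}$ bounds as the alternating substitution above.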

Hans