10

Suppose $X_1,...,X_n$ are iid from $N(\mu,\sigma^2)$ and let $X_{(i)}$ denote the $i$th smallest of $X_1,...,X_n$. How can one upper bound the expected maximum ratio of two consecutive order statistics? That is, how can one compute an upper bound on

$$E\left[\max\limits_{i=1,...,n-1}\left(\frac{X_{(i+1)}}{X_{(i)}}\right)\right]$$

The literature I have found mostly concerns the ratio of two random variables, which leads to a ratio distribution; the pdf for the ratio of two uncorrelated normal variables is given here: https://en.wikipedia.org/wiki/Ratio_distribution#Gaussian_ratio_distribution . While that would let me upper bound the expected average ratio of $n$ variables, I cannot see how to generalize the concept to the expected maximum ratio of $n$ variables.
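For intuition, here is a quick Monte Carlo sketch (my own illustration, assuming standard normals $\mu = 0$, $\sigma = 1$ and $n = 10$) of how erratic this maximum is in practice:

```python
# Monte Carlo sketch of the max consecutive-order-statistic ratio
# (illustration only; assumes mu = 0, sigma = 1, n = 10)
import numpy as np

rng = np.random.default_rng(0)
n, reps = 10, 100_000
x = np.sort(rng.normal(size=(reps, n)), axis=1)   # each row: X_(1) <= ... <= X_(n)
max_ratio = (x[:, 1:] / x[:, :-1]).max(axis=1)

# the upper quantiles explode, hinting that the expectation is problematic
print(np.quantile(max_ratio, [0.5, 0.9, 0.99]))
```

The extreme quantiles are orders of magnitude above the median, because an order statistic just above $0$ can make a single ratio enormous.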

Max
  • 103
  • 3
  • As whuber has noted below, the expectation of the ratio of two consecutive order stats does not converge. But if it did, or if you are interested in their difference, say $$E\left[\max\limits_{i=1,...,n-1}\left(X_{(i+1)} - X_{(i)} \right)\right]$$ ... the problem should in fact simplify to finding the ratio (or difference, as the case may be) of the two LARGEST order statistics i.e. $$E[X_{(n)} -X_{(n-1)}]$$ ... just from the shape of the Normal tails. – wolfies Jun 03 '16 at 17:02

1 Answer

7

The expectation is undefined.

Let the $X_i$ be iid according to any distribution $F$ with the following property: there exists a positive number $h$ and a positive $\epsilon$ such that

$$F(x) - F(0) \ge h x\tag{1}$$

for all $0 \lt x \lt \epsilon$. This property is true of any continuous distribution, such as a Normal distribution, whose density $f$ is continuous and nonzero at $0$, for then $F(x) - F(0) = f(0)x + o(x)$, allowing us to take for $h$ any fixed value between $0$ and $f(0)$.

To simplify the analysis I will also assume $F(0) \gt 0$ and $1-F(1) \gt 0$, both of which are true for all Normal distributions. (The latter can be assured by rescaling $F$ if necessary. The former is used only to permit a simple underestimate of a probability.)

Let $t \gt 1$ and let us underestimate the survival function of the ratio as

$$\eqalign{ \Pr\left(\frac{X_{(i+1)}}{X_{(i)}} \gt t\right) &= \Pr(X_{(i+1)} \gt t X_{(i)},\ X_{(i)} \gt 0) \\ &\ge \Pr(X_{(i+1)}\gt 1,\ 0 \lt X_{(i)} \le 1/t) \\ &\ge \Pr(X_{(i+1)}\gt 1,\ 1/t \ge X_{(i)} \gt 0,\ 0 \ge X_{(i-1)}).}$$

That latter probability is the chance that exactly $n-i$ of the $X_j$ exceed $1$, exactly one lies in the interval $(0,1/t]$, and the remaining $i-1$ (if any) are nonpositive. In terms of $F$ that chance is given by the multinomial expression

$$\binom{n}{n-i,1,i-1}(1-F(1))^{n-i}(F(1/t)-F(0))F(0)^{i-1}.$$

When $t \gt 1/\epsilon$, inequality $(1)$ provides a lower bound for this that is proportional to $1/t$, showing that

The survival function $S(t)$ of $X_{(i+1)}/X_{(i)}$ has a tail behaving asymptotically as $1/t$: that is, $S(t) = a/t + o(1/t)$ for some positive number $a$.
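This $1/t$ tail is easy to check numerically. The sketch below (my addition; it assumes standard normals with $n = 5$ and $i = 2$, so the ratio is $X_{(3)}/X_{(2)}$) estimates $t\,\Pr(X_{(i+1)}/X_{(i)} \gt t)$ at increasing $t$; if $S(t) = a/t + o(1/t)$, these products should level off near the constant $a$:

```python
# Empirical check that t * Pr(X_(i+1)/X_(i) > t) stabilizes for large t
# (assumes standard normals, n = 5, i = 2; not part of the answer's proof)
import numpy as np

rng = np.random.default_rng(1)
n, reps = 5, 2_000_000
x = np.sort(rng.normal(size=(reps, n)), axis=1)
r = x[:, 2] / x[:, 1]                 # X_(3) / X_(2) in 0-based indexing

for t in (10.0, 100.0, 1000.0):
    print(t, t * np.mean(r > t))      # should hover near a fixed a > 0
```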

By definition, the expectation of any random variable is the expectation of its positive part $\max(X,0)$ plus the expectation of its negative part $-\max(-X,0)$. Since the positive part of the expectation--if it exists--is the integral of the survival function (from $0$ to $\infty$) and

$$\int_1^x S(t)\, dt = \int_1^x \left(\frac{a}{t} + o\!\left(\frac{1}{t}\right)\right)dt\; \sim\; a\log(x),$$

the positive part of the expectation of $X_{(i+1)}/X_{(i)}$ diverges.

The same argument applied to the variables $-X_i$ shows the negative part of the expectation diverges. Thus, the expectation of the ratio isn't even infinite: it is undefined.
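The divergence is visible in simulation. In the sketch below (my addition, assuming standard normals with $n = 5$ and the ratio $X_{(3)}/X_{(2)}$), the running mean of the ratio jumps around without settling, as expected when both tails of the distribution decay like $1/t$:

```python
# Running mean of X_(3)/X_(2) over repeated samples of 5 standard normals;
# with 1/t tails on both sides it lurches instead of converging
import numpy as np

rng = np.random.default_rng(2)
n, reps = 5, 1_000_000
x = np.sort(rng.normal(size=(reps, n)), axis=1)
r = x[:, 2] / x[:, 1]                      # X_(3) / X_(2)

running_mean = np.cumsum(r) / np.arange(1, reps + 1)
for k in (10_000, 100_000, 1_000_000):
    print(k, running_mean[k - 1])          # no sign of settling down
```

Occasional huge positive and huge negative ratios keep dragging the average in both directions, which is exactly the behavior of an undefined expectation.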

whuber
  • 281,159
  • 54
  • 637
  • 1,101
  • 2
    +1 I was just trying a 'simple' $n = 3$ case myself, and tried evaluating the expectations ... and came to the same conclusion: that the expectation integral does not converge. Perhaps the OP will re-cast the question in a different form, such as differences rather than ratios – wolfies Jun 03 '16 at 16:43