9

I am wondering how to show that $$ \lim_{x \to \infty} x\overline{F}(x) = 0, $$ where $\overline{F}(x) = 1 - F(x)$ is the tail distribution function and $F$ is the cumulative distribution function.

As $x \to \infty$, $\overline{F}(x) \to 0$, so we have an indeterminate form. I rewrite it as $$ \lim_{x \to \infty} \frac{\overline{F}(x)}{1/x} $$ and apply L'Hôpital's rule: $$ \lim_{x \to \infty} \frac{f(x)}{1/x^2}, $$ but this requires knowledge of the behavior of $f$ as $x \to \infty$, which I don't have.

How do I evaluate this limit?
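For intuition, here is a quick numerical look at two examples with closed-form CDFs (a sketch, not a proof): the Exponential(1) distribution, whose mean is finite, and the Cauchy distribution, whose mean does not exist.

```python
import math

# x * tail(x) for two distributions with closed-form CDFs:
# Exponential(1): F̄(x) = exp(-x), E[X] = 1 < ∞
# Cauchy:         F̄(x) = 1/2 - arctan(x)/π, E[X] undefined

def tail_exp(x):
    return x * math.exp(-x)

def tail_cauchy(x):
    return x * (0.5 - math.atan(x) / math.pi)

for x in [10, 100, 1000]:
    print(f"x={x:5d}  exp: {tail_exp(x):.3e}   cauchy: {tail_cauchy(x):.6f}")
# The exponential product vanishes; the Cauchy one approaches 1/π ≈ 0.3183.
```

So the claimed limit cannot hold without some assumption on the distribution.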

JohnK
WeakLearner
    You should clarify your assumptions: the claimed result is not true in general (e.g. for Pareto), but holds when $X$ is positive and $\mathbb{E}[X] < \infty$. Hint: use $x \text{Pr}\{ X > x \} \leq \mathbb{E}[X 1_{ \{X > x\} }]$. – Yves Nov 10 '15 at 13:24
  • @Solitary nitpicking a little bit, but the condition is actually slightly weaker requiring integrability. For example, one can show $x^p \Pr\{|X| > x\} \to 0$ implies $E[|X|^q] < \infty$ for all $q$ *strictly* less than $p$. But it is not true for $q = p$ in general. Off the top of my head, I think the density proportional to $1 / [x^{p+1} \log x]$ for $x > 2$ gives the counterexample, but I confess that I haven't done the math. – guy Nov 10 '15 at 15:30
  • This is proven in a [paper with a silly name, the darth vader rule](https://www.sav.sk/journals/uploads/1030150905-M-O-W.pdf) on page 2. This paper isn't about your question exactly, but they do answer your question in it. – RayVelcoro Nov 10 '15 at 21:25

2 Answers

12

Assuming that the expectation exists and for convenience that the random variable has a density (equivalently that it is absolutely continuous with respect to the Lebesgue measure), we are going to show that

$$\lim_{x\to\infty} x \left [1-F(x)\right]=0$$

The existence of the expectation implies that the distribution is not very fat-tailed, unlike the Cauchy distribution for instance.

Since the expectation exists, we have that

$$E(X)=\lim_{u\to \infty} \int_{-\infty}^u x f(x) \,\mathrm{d}x = \int_{-\infty}^{\infty} x f(x) \,\mathrm{d}x < \infty$$

and this is always well-defined. Now note that for $u \geq 0$,

$$\int_{u}^{\infty} x f(x) \,\mathrm{d}x \geq u \int_{u}^{\infty} f(x) \,\mathrm{d}x = u \left[1-F(u) \right]$$

and from these two it follows that

$$\lim_{u \to \infty} \left[ E(X) - \int_{-\infty}^u x f(x) \,\mathrm{d}x \right] = \lim_{u\to \infty} \int_{u}^{\infty} x f(x) \,\mathrm{d}x=0$$

as in the limit the term $\int_{-\infty}^u x f(x) \,\mathrm{d}x$ approaches the expectation. By our inequality and the nonnegativity of the integrand, the result follows.
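A quick numerical sanity check of the inequality above, using the Exponential(1) distribution where both sides have closed forms (an illustration only, not part of the proof):

```python
import math

# For Exponential(1): ∫_u^∞ x e^{-x} dx = (u+1) e^{-u}  (by parts),
# and u[1 - F(u)] = u e^{-u}.  The integral dominates the bound,
# and both tend to 0 as u → ∞.

def tail_integral(u):   # ∫_u^∞ x f(x) dx
    return (u + 1) * math.exp(-u)

def tail_bound(u):      # u [1 - F(u)]
    return u * math.exp(-u)

for u in [1.0, 5.0, 20.0]:
    assert tail_integral(u) >= tail_bound(u)
    print(f"u={u:4.0f}  integral={tail_integral(u):.3e}  bound={tail_bound(u):.3e}")
```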

Hope this helps.

JohnK
    Thank you (+1). Re relaxing the assumption: when, for instance, $F$ is a Cauchy distribution, then the limiting value of $x(1-F(x))$ is $1/\pi$, not zero. For Student $t$ distributions with parameter less than $1$ ($1$ denotes the Cauchy), this limit is infinite. – whuber Nov 10 '15 at 14:23
5

For any nonnegative random variable $Y$ , we have (see (21.9) of Billingsley's Probability and measure): $$E[Y] = \int Y dP = \int_0^\infty P[Y > t] dt. \tag{$*$}$$

For $M > 0$, substituting $XI_{[X > M]}$ for $Y$ in $(*)$ gives \begin{align} \int XI_{[X > M]} dP &= \int_0^\infty P[XI_{[X > M]} > t]dt \\ &= \int_0^M P[XI_{[X > M]} > t]dt + \int_M^\infty P[XI_{[X > M]} > t]dt \\ &= MP[X > M] + \int_M^\infty P[X > t] dt \geq MP[X > M]. \tag{$**$} \end{align}

The third equality holds because for every $t \in [0, M]$, $\{XI_{[X > M]} > t\} = \{X > M\}$, and for every $t > M$, $\{XI_{[X > M]} > t\} = \{X > t\}$. To show the first of these identities, note that for $t \in [0, M]$, \begin{align} \{XI_{[X > M]} > t\} &= (\{XI_{[X > M]} > t\} \cap \{X > M\}) \cup (\{XI_{[X > M]} > t\} \cap \{X \leq M\}) \\ &= (\{X > t\} \cap \{X > M\}) \cup (\{0 > t\} \cap \{X \leq M\}) \\ &= \{X > M\} \cup \varnothing = \{X > M\}. \end{align}

Similarly it is easy to show for every $t > M$, $\{XI_{[X > M]} > t\} = \{X > t\}$.

Assume that $X$ is integrable (i.e., $E[|X|]< \infty$), then the left hand side of $(**)$ converges to $0$ as $M \to \infty$, by the dominated convergence theorem. It then follows that $$0 \geq \limsup_{M \to \infty} MP[X > M] \geq \liminf_{M \to \infty} MP[X > M] \geq 0.$$ Hence the result follows.

Remark: This proof uses some measure theory, which I think is worthwhile, as the proof assuming the existence of densities doesn't cover a large class of random variables, for example, discrete random variables such as binomial and Poisson.
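As a numerical illustration of both the tail-sum identity $(*)$ and the bound $MP[X > M] \to 0$ for a discrete case, consider a Geometric($p$) variable on $\{1, 2, \dots\}$, where $P[X > t] = (1-p)^{\lfloor t \rfloor}$ and $E[X] = 1/p$ (a sketch with closed forms, not part of the proof):

```python
# Geometric(p) on {1, 2, ...}: P[X > t] = (1-p)^floor(t), E[X] = 1/p.
# The identity (*) reduces to the geometric series Σ_{k≥0} q^k = 1/p,
# and the bound M · P[X > M] = M q^M vanishes as M grows.

p = 0.3
q = 1 - p

# ∫_0^∞ P[X > t] dt = Σ_{k≥0} q^k, truncated far into the tail:
tail_sum = sum(q ** k for k in range(10_000))
print(tail_sum, 1 / p)   # both ≈ 1/p

for M in [10, 50, 100]:
    print(M, M * q ** M)  # M · P[X > M] → 0
```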

Zhanxiong
    The proof does not really require that $X$ is integrable, but only that $X 1_{ \{X > x_0\}}$ be integrable for some finite $x_0$; hence $X$ can have a heavy left tail. The identity from Billingsley's book is not really needed either, since $X 1_{ \{X > x\}}$ tends to $0$ as $x \to \infty$ with probability one. – Yves Nov 10 '15 at 15:58
  • @Yves@guy Yes, good point. Integrability is just one sufficient condition but never a necessary one. However, it might be the most succinct and normal condition imposed to derive the relation asked by OP. – Zhanxiong Nov 10 '15 at 16:13
  • OK. Succinct alternative: $\mathbb{E}(X_+) < \infty$. – Yves Nov 10 '15 at 16:27
  • @Yves Of course :) – Zhanxiong Nov 10 '15 at 16:45
  • @Zhanxiong Could you explain the first equality on line $(**)$ in a bit more detail? I don't see how it follows from $(*)$. (However, it doesn't seem like you need this equality, as the inequality on line $(**)$ is clear without the middle term.) – zxmkn Dec 07 '21 at 16:44
  • @zxmkn You're absolutely right. The equality should be just an inequality. I added more details. – Zhanxiong Dec 07 '21 at 19:47
  • @Zhanxiong Ahh, now it makes sense. Thanks! I think you could simplify this proof, though, as follows: $\int X I_{[X>M]} dP = \int_{\{X > M\}} X I_{[X>M]} dP + \int_{\{X \leq M\}} X I_{[X>M]} dP$. Therefore, $\int X I_{[X>M]} dP \geq M \int I_{[X>M]} \, dP = M \cdot P[X>M]$. (I think it's a bit clearer and you wouldn't need to use Billinglsey's lemma. Just an alternative, though.) – zxmkn Dec 08 '21 at 08:53
  • @zxmkn Sorry for the confusion --- after verifying the equation in Billingsley's book (page 275, Eq (21.10)), I found it is still a strict equality. And I added more details to prove it. To prove the inequality, your alternative is of course easier. – Zhanxiong Dec 08 '21 at 13:47