
Let $X_1,X_2,X_3,...$ be $i.i.d.$ with finite mean $\mu$.

Then let $Y_i=X_i1_{\{|X_i|<i\}}$

Almost surely, there are only finitely many indices $i$ such that $Y_i\neq X_i$.
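A quick simulation makes the claim plausible. The distribution below (standard normal) is my choice, not from the notes; any i.i.d. sequence with finite mean would do:

```python
import random

# Illustrative sketch (distribution chosen for convenience): with X_i i.i.d.
# standard normal, which has finite mean, the events {|X_i| >= i} -- i.e.
# {Y_i != X_i} -- should occur only finitely often, and only at small i.
random.seed(0)
n = 100_000
mismatches = [i for i in range(1, n + 1) if abs(random.gauss(0, 1)) >= i]
print("indices with Y_i != X_i:", mismatches)
# For the normal, P(|X_i| >= i) decays roughly like exp(-i^2/2),
# so every mismatch index is tiny.
```

Running this for any seed produces at most a handful of indices, all in the single digits.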

While the lecture notes did not provide a proof, they hinted that it is related to the Borel–Cantelli lemma.

What is the proof of this statement?

Preston Lui
  • What is $k$? After all, when $k=0$ (or, more generally, when $\Pr(|X_i|\lt k)=0$) the statement evidently can be false, so there must be some kind of restriction on $k.$ – whuber Jun 09 '20 at 13:53
  • Sorry, that is a mistype, there is no k, there is only i – Preston Lui Jun 09 '20 at 14:08
  • 1
    You can find the proof in any strong law proofs of which there’s a ton in literature and textbooks, e.g. see p 57 here http://math.mit.edu/~sheffield/2016175/Lecture6.pdf – Aksakal Jun 09 '20 at 14:21
  • 1
    @Aksakal Indeed, and this is the first lemma used for the general proof, which I don't know how to prove – Preston Lui Jun 09 '20 at 15:19

1 Answer


If you are going to apply the Borel–Cantelli lemma, that means you expect to show that the sum of the probabilities of the events $Y_i\ne X_i$ is finite. Somehow this must be derivable from the assumptions, and about the only useful assumption available is that the common distribution has finite mean. What is the connection between these statements?

Notice that $Y_i \ne X_i$ is equivalent to $|X_i| \ge i.$ There's no problem working with $|X_i|$ instead of $X_i$ because

  1. The independence of the $X_i$ implies the independence of the $|X_i|.$

  2. Because the $X_i$ are identically distributed, the $|X_i|$ are identically distributed.

  3. The mean of $X_i$ is defined and finite if and only if the mean of $|X_i|$ is defined and finite.

What's the connection between the mean of a variable and the chances that it's large? The answer lies in the "tail probability expectation formula" (see the reference or consult Expectation of a function of a random variable from CDF),

$$\mathbb{E}(|X_j|) = \int_0^\infty \Pr(|X_j| \gt x)\,\mathrm{d}x$$

for any $j.$
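A numerical sanity check of this formula is easy with a distribution whose tail is available in closed form. The choice of Exponential(1) below is mine, for convenience; there $\Pr(X > x) = e^{-x}$ and $\mathbb{E}(X) = 1$:

```python
import math

# Sketch verifying E|X| = ∫_0^∞ P(|X| > x) dx for X ~ Exponential(1),
# a distribution chosen here because P(X > x) = exp(-x) and E[X] = 1.
dx = 1e-4
integral = sum(math.exp(-k * dx) * dx for k in range(int(50 / dx)))
print(round(integral, 3))  # close to E[X] = 1
```

The Riemann sum over $[0, 50]$ agrees with the expectation to within the discretization error.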

To relate this to $\Pr(|X_i|\ge i),$ break the integral into pieces at the integers $i=1,2,3,\ldots$ and underestimate the integrand a little on each piece. Because the $X_i$ are identically distributed, for any $x\lt i,$

$$\Pr(|X_j| \gt x) \ge \Pr(|X_j| \ge i) = \Pr(|X_i| \ge i).$$

This is the key step in the following derivation, which begins with observation $(3)$ that $|X_j|$ has finite expectation:

$$\eqalign{ \infty \gt \int_0^\infty \Pr(|X_j| \gt x)\,\mathrm{d}x &= \sum_{i=1}^\infty \int_{i-1}^i \Pr(|X_j| \gt x)\,\mathrm{d}x \\ &\ge\sum_{i=1}^\infty \int_{i-1}^i \Pr(|X_i| \ge i)\,\mathrm{d}x \\ &= \sum_{i=1}^\infty \Pr(|X_i| \ge i)\,\int_{i-1}^i \mathrm{d}x \\ &= \sum_{i=1}^\infty \Pr(|X_i| \ge i). }$$

The sum of the probabilities of the events $\{Y_i \ne X_i\} = \{|X_i| \ge i\}$ is therefore finite, so Borel–Cantelli implies that almost surely only finitely many of these events occur, QED.
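The finiteness of the tail sum can also be checked concretely. Again taking Exponential(1) as an illustrative choice (not from the answer), the sum in the last line of the derivation has a closed form:

```python
import math

# Sketch: for X ~ Exponential(1) (our choice of example), the tail sum is
#   sum_{i>=1} P(X >= i) = sum_{i>=1} e^{-i} = 1/(e - 1) ≈ 0.582,
# which is finite and bounded above by E[X] = 1, as the derivation requires.
tail_sum = sum(math.exp(-i) for i in range(1, 200))
print(tail_sum)
assert tail_sum <= 1.0  # matches the bound sum_i P(|X_i| >= i) <= E|X|
```

This is exactly the finite quantity that Borel–Cantelli needs.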

Reference

Ambrose Lo, Demystifying the Integrated Tail Probability Expectation Formula. The American Statistician Volume 73, Number 4, November 2019, pp 367-374.

whuber
  • Thanks. To be precise, so that it also works for discrete RVs, the inequality can be written as $\geq \sum_{i=1}^{\infty}\Pr(|X_i|\geq i)$ – Preston Lui Jun 09 '20 at 15:57
  • Thank you. I have taken care of that in an edited version of the derivation. – whuber Jun 09 '20 at 16:28