
I am having trouble figuring out how to work with convergence-in-probability questions. I will give a self-study example from Casella and Berger's Statistical Inference below.

Let $X_{1},\dots,X_{n}$ be a sequence of random variables that converges in probability to a constant $a$. Assume that $\mathbb{P}(X_i > 0) = 1$ for all $i$. Verify that the sequence defined by $Y_{i}^{'} = a/X_{i}$ converges in probability.

I will give the solution to this problem below and then point out where I am having trouble.

$\mathbb{P}(|a/X_{i} - 1| \leq \epsilon)$ $=$ $\mathbb{P}({{a}\over{1+\epsilon}} \leq X_{i} \leq {{a}\over{1 - \epsilon}})$ = $\mathbb{P}(a-{{a\epsilon}\over{1 + \epsilon}} \leq X_{i} \leq a + {{a\epsilon}\over{1 - \epsilon}})$

( ** In the step above, why is $1$ chosen in the expression $|a/X_{i} - 1|$? Also, why is $a - {{a\epsilon}\over{1 + \epsilon}}$ used? I know that since $X_{n} \longrightarrow a$ in probability, we're probably trying to rewrite the expression in a way that uses that fact... but I'm not sure how to just "pull it out of my hat", i.e., in a testing situation. ** )

Use $(1 + \epsilon)^{-1} < (1 - \epsilon)^{-1}$ and continue:

$\mathbb{P}(|Y_{i} - a^{1/2}| \leq \epsilon)$ $\geq$ $\mathbb{P}\left(a-{{a\epsilon}\over{1 + \epsilon}} \leq X_{i} \leq a + {{a\epsilon}\over{1 + \epsilon}}\right) = \mathbb{P}\left(|X_{i} - a| \leq {{a\epsilon}\over{1 + \epsilon}}\right) \longrightarrow 1$ as $i \longrightarrow \infty$

( ** In the above, why is the constant being used now $a^{1/2}$? ** )

Is there some better, more systematic way to approach these problems? I want to be able to encounter any convergence problem and know immediately which steps to employ, rather than trying to rewrite the problem in a tricky way.

LotsofQuestions
    In the chapter of this book, look at the definition of convergence in probability... there are two forms of the definition (this helps explain why 1 is chosen in that step). I would caution against using the solution manual for this book (there are errors, and sometimes it is hard to follow). – Lauren Goodwin Sep 14 '15 at 18:50
  • According to the definition from the book, a sequence of random variables $X_{1},\dots,X_{n}$ converges in probability to a random variable $X$ if, for every $\epsilon > 0$, $\lim_{n\to\infty} \mathbb{P}(|X_{n} - X| < \epsilon) = 1$. So, using this definition, it still isn't quite clear why 1 is chosen... is it that the sequence $a/X_{n}$ itself gets close to 1 as $n$ goes to infinity? Not sure how to establish that. – LotsofQuestions Sep 14 '15 at 19:08
  • Since $X_{i}$ converges to $a$, then $(a/X_{i})$ converges to $1$? Ie, ${{a}\over{a}}=1$? – LotsofQuestions Sep 14 '15 at 19:10
  • Alright. That was less of a shot in the dark than I was making it out to be... So, to check my understanding: suppose that $Y_{i}=X_{i}^{1/2}$. Then since $X_{n}$ converges to $a$, I'd have to write $\mathbb{P}(|X_{i}^{1/2} - a^{1/2}| < \epsilon) = 1$ to get the convergence constant right? I.e., putting it sort of in the same terms. – LotsofQuestions Sep 14 '15 at 19:17
  • I understand that it's because of $(1 + \epsilon)^{-1} < (1 - \epsilon)^{-1}$, but why is rewriting it in the form $\mathbb{P}\left(a-{{a\epsilon}\over{1 + \epsilon}} \leq X_{i} \leq a + {{a\epsilon}\over{1 - \epsilon}}\right)\ge\mathbb{P}\left(a-{{a\epsilon}\over{1 + \epsilon}} \leq X_{i} \leq a + {{a\epsilon}\over{1 + \epsilon}}\right)$ by removing the $1-\epsilon$ important to the proof? Because it fixes the term $\epsilon$, making it a single value that $X$ is within? So then you can say that $X_{n}$ converges and is within that bound involving $\epsilon$, ${{a\epsilon}\over{1+\epsilon}}$. – LotsofQuestions Sep 14 '15 at 19:29
  • By the way - thanks so much! This is clearer. I think I am understanding it better. – LotsofQuestions Sep 14 '15 at 19:36
  • Are there two questions muddled together here? (i.e. $Y_i=\sqrt{X_i} \to \sqrt{a}$ and $Y'_i = a/X_i \to 1$?) – P.Windridge Sep 14 '15 at 20:53

1 Answer


To sum up, now that you have gone through the steps:

  1. As $(X_i)$ converges in probability to $a$, $(X_i/a)$ converges in probability to $1$, since$$\mathbb{P}(|X_i/a-1|<\epsilon)=\mathbb{P}(|X_i-a|<\epsilon\times a)=\mathbb{P}(|X_i-a|<\epsilon^\prime),$$meaning that the first term goes to $1$ as $i$ grows to infinity. Hence,$$\mathbb{P}\left(|X_{i}/a - 1| < \frac{\epsilon}{1+\epsilon}\right)=\mathbb{P}(|X_i-a|<\epsilon^{\prime\prime})$$ goes to $1$ as $i$ grows to infinity for every $\epsilon>0$.
  2. It thus makes sense to check whether or not $(a/X_i)$ converges in probability to $1$. If the sequence converges to $b$, it can only be $b=1$.
  3. By definition, $(a/X_i)$ converges in probability to $1$ if $$\mathbb{P}(|a/X_{i} - 1| < \epsilon)$$ goes to $1$ for every $\epsilon>0$.
  4. The equation$$\mathbb{P}(|a/X_{i} - 1| < \epsilon)=\mathbb{P}\left(a-{{a\epsilon}\over{1 + \epsilon}} < X_{i} < a + {{a\epsilon}\over{1 - \epsilon}}\right)$$holds for every $0<\epsilon<1$ (small $\epsilon$'s are all that matter for the limit, and $\epsilon<1$ keeps $a/(1-\epsilon)$ positive).
  5. Since $$\mathbb{P}\left(a-{{a\epsilon}\over{1 + \epsilon}} < X_{i} < a + {{a\epsilon}\over{1 - \epsilon}}\right)\ge\mathbb{P}\left(a-{{a\epsilon}\over{1 + \epsilon}} < X_{i} < a + {{a\epsilon}\over{1 + \epsilon}}\right)$$and$$\mathbb{P}\left(a-{{a\epsilon}\over{1 + \epsilon}} < X_{i} < a + {{a\epsilon}\over{1 + \epsilon}}\right)=\mathbb{P}\left(\left|X_{i}- a \right|< {{a\epsilon}\over{1 + \epsilon}}\right),$$we are back at the convergence in probability of $X_i$ to $a$: the last term goes to $1$ as $i$ grows to infinity for all $\epsilon$'s and hence the larger term$$\mathbb{P}(|a/X_{i} - 1| < \epsilon)$$also goes to $1$ as $i$ grows to infinity for all $\epsilon$'s.
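The chain of bounds above can also be checked numerically. Here is a small simulation sketch; the distribution $X_i \sim |\mathcal{N}(a,\,1/i)|$, the constant $a=2$, and the tolerance $\epsilon=0.1$ are all illustrative choices of mine, not part of the exercise. The empirical frequency of the event $\{|a/X_i - 1| < \epsilon\}$ should climb toward $1$ as $i$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
a, eps = 2.0, 0.1  # illustrative limiting constant and tolerance

for i in (10, 100, 10_000):
    # X_i clustered around a with spread shrinking in i; abs() keeps the
    # samples positive, mirroring the assumption P(X_i > 0) = 1.
    x = np.abs(rng.normal(loc=a, scale=1.0 / np.sqrt(i), size=100_000))
    freq = np.mean(np.abs(a / x - 1.0) < eps)
    print(f"i = {i:6d}:  P(|a/X_i - 1| < eps) ~ {freq:.3f}")
```

The printed frequencies increasing toward $1$ is exactly the statement $\mathbb{P}(|a/X_i - 1| < \epsilon) \to 1$ for this particular $\epsilon$; repeating the run for smaller $\epsilon$'s (with larger $i$'s) illustrates the "for every $\epsilon>0$" part of the definition.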
Xi'an
  • Not to beat a dead horse... but after seeing this, I'm still a little confused in one area. For the problem, it is specified that we're trying to see if $a/X_{i}$ converges in probability. I think you mistakenly wrote $(X_i/a)$ at the beginning of the proof. Wouldn't this change the flow of things? For example, $\mathbb{P}(|X_i/a-1| – LotsofQuestions Sep 15 '15 at 20:16
  • Well, I worked the problem out step by step this way... $\mathbb{P}(|a/X_{i} - 1| < \epsilon) = \mathbb{P}(|X_{i} - 1/a| < \epsilon/a)$ $=$ $\mathbb{P}(a/(\epsilon+a) < X_{i} < a/(\epsilon-1)$ Then since we know that $X$ converges to $a$ in probability, and that $|X-a| – LotsofQuestions Sep 15 '15 at 20:51
  • 1. I did not mistakenly write $X_i/a$ for $a/X_i$; my first entry was to explain why you would consider $a/X_i$ converging to 1. I added an extra sentence to this entry.$${ }$$ 2. Your starting equation $$ℙ(|a/X_i−1| – Xi'an Sep 16 '15 at 09:09
  • Well, I understand that it's just another $\epsilon$, but I'm not sure why you'd even make the statement in step 1 that $\mathbb{P}\left(|X_{i}/a - 1| < \frac{a\epsilon}{1+\epsilon}\right)=\mathbb{P}(|X_i-a| – LotsofQuestions Sep 16 '15 at 17:54
  • The $a\epsilon/(1+\epsilon)$ that I have now corrected to $\epsilon/(1+\epsilon)$ comes from part 4, not out of nowhere, and I put this extra explanation in part 1 because it is a direct consequence of the convergence in probability of $X_i$ to $a$. I am clearly running out of explanations for this problem..! – Xi'an Sep 16 '15 at 18:42