I am having trouble figuring out how to work with convergence-in-probability questions. I will give a self-study example from Casella and Berger's Statistical Inference below.
Let $X_{1}, X_{2}, \ldots$ be a sequence of random variables that converges in probability to a constant $a$. Assume that $\mathbb{P}(X_i > 0) = 1$ for all $i$. Verify that the sequence defined by $Y_{i}' = a/X_{i}$ converges in probability.
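Before getting to the solution: not part of the exercise, but to build intuition I ran a quick simulation. The sequence $X_n = |a + Z/\sqrt{n}|$ (with $Z$ standard normal) is my own assumed example, chosen only so that $X_n \to a$ in probability with $\mathbb{P}(X_n > 0) = 1$; it does suggest $a/X_n$ concentrates around $1$:

```python
import random

def prob_within(a, n, eps, trials=20000, seed=0):
    """Estimate P(|a/X_n - 1| <= eps) for the assumed example
    sequence X_n = |a + Z/sqrt(n)|, Z ~ N(0, 1). The absolute
    value (plus a tiny offset) forces X_n > 0."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = abs(a + rng.gauss(0.0, 1.0) / n ** 0.5) + 1e-12
        if abs(a / x - 1.0) <= eps:
            hits += 1
    return hits / trials

# The estimated probability climbs toward 1 as n grows.
print(prob_within(a=2.0, n=10, eps=0.1))
print(prob_within(a=2.0, n=1000, eps=0.1))
```

So the claim is numerically plausible; my trouble is with the formal argument below.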
I will give the solution to this problem below and then point out where I am having trouble.
$\mathbb{P}(|a/X_{i} - 1| \leq \epsilon) = \mathbb{P}\left(\frac{a}{1+\epsilon} \leq X_{i} \leq \frac{a}{1 - \epsilon}\right) = \mathbb{P}\left(a-\frac{a\epsilon}{1 + \epsilon} \leq X_{i} \leq a + \frac{a\epsilon}{1 - \epsilon}\right)$
( ** In the step above, why is $1$ chosen as the constant in $|a/X_{i} - 1|$? And why is the endpoint rewritten as $a - \frac{a\epsilon}{1 + \epsilon}$? Since we know $X_{n} \longrightarrow a$ in probability, I assume we're rewriting the expression so that we can use that fact, but I'm not sure how I would produce this manipulation myself rather than "pull it out of my hat" — say, in a testing situation. ** )
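To show which part I can verify myself (this derivation is my own attempt, assuming $0 < \epsilon < 1$ and using $\mathbb{P}(X_i > 0) = 1$ so the inversion doesn't flip the inequalities):

```latex
\begin{align*}
|a/X_i - 1| \le \epsilon
  &\iff 1 - \epsilon \le \frac{a}{X_i} \le 1 + \epsilon \\
  &\iff \frac{a}{1+\epsilon} \le X_i \le \frac{a}{1-\epsilon} \\
  &\iff a - \frac{a\epsilon}{1+\epsilon} \le X_i \le a + \frac{a\epsilon}{1-\epsilon},
\end{align*}
```

where the last line just rewrites the endpoints ($\frac{a}{1+\epsilon} = a - \frac{a\epsilon}{1+\epsilon}$ and $\frac{a}{1-\epsilon} = a + \frac{a\epsilon}{1-\epsilon}$) so the interval reads as a neighborhood of $a$. My guess is that $1$ is the candidate limit because $X_i \approx a$ suggests $a/X_i \approx 1$ — but I can only verify this after the fact, not generate it.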
Use $(1 + \epsilon)^{-1} < (1 - \epsilon)^{-1}$ (valid for $0 < \epsilon < 1$) and continue:
$\mathbb{P}(|Y_{i} - a^{1/2}| \leq \epsilon) \geq \mathbb{P}\left(a-\frac{a\epsilon}{1 + \epsilon} \leq X_{i} \leq a + \frac{a\epsilon}{1 + \epsilon}\right) = \mathbb{P}\left(|X_{i} - a| \leq \frac{a\epsilon}{1 + \epsilon}\right) \longrightarrow 1$ as $i \longrightarrow \infty$

( ** In the above, why is the constant being used now $a^{1/2}$? Up to this point the solution compared $a/X_{i}$ to $1$, so I don't see where $a^{1/2}$ comes from. ** )
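One piece I did manage to check numerically is the "$\geq$" step itself: the event $\{|X_i - a| \le a\epsilon/(1+\epsilon)\}$ is contained in $\{|a/X_i - 1| \le \epsilon\}$, so its probability can only be smaller. This sketch is my own (the distribution of $X_i$ is an arbitrary assumed positive example; the containment itself is deterministic algebra):

```python
import random

def compare_events(a=2.0, eps=0.3, sd=0.5, trials=50000, seed=1):
    """Check that {|X - a| <= a*eps/(1+eps)} is a subset of
    {|a/X - 1| <= eps} on simulated positive X, and estimate
    both probabilities."""
    rng = random.Random(seed)
    inner = outer = 0
    for _ in range(trials):
        x = abs(a + rng.gauss(0.0, sd)) + 1e-12     # force X > 0
        in_inner = abs(x - a) <= a * eps / (1 + eps)
        in_outer = abs(a / x - 1.0) <= eps
        assert not in_inner or in_outer             # containment holds samplewise
        inner += in_inner
        outer += in_outer
    return inner / trials, outer / trials

p_inner, p_outer = compare_events()
print(p_inner, p_outer)   # p_inner <= p_outer, matching the ">=" step
```

So I follow the mechanics of the bound; it's the choice of the left-hand side that loses me.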
Is there some better, more systematic way to approach these problems? I want to be able to encounter any convergence problem and know which steps to employ, rather than having to rewrite the expression in some tricky, ad hoc way.