
I am looking at a proof of asymptotic normality of M-estimators in which a matrix $V$ is known to be invertible and a random matrix $X_n$ satisfies $X_n = V + o_p(1)$ (that is, the difference converges to zero in probability as $n \to \infty$). I want to take the inverse of $X_n$. The proof asserts that $X_n^{-1} = V^{-1} + o_p(1)$ by a continuous mapping theorem for convergence in probability.

This step is fine, but my question is how to justify that $X_n$ is invertible for large $n$. I know that a matrix sufficiently close to an invertible matrix is itself invertible, because the determinant is a continuous function, but this only gives invertibility with arbitrarily high probability (not probability 1), and I am unable to come up with an extension of the continuous mapping theorem that allows $X_n$ to lie in the domain of the map only with probability tending to one. I am not sure whether that is enough for the rest of the proof.
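For concreteness, this is the determinant argument I have in mind (a sketch only, using continuity of $\det$ and $\det V \neq 0$): there is a $\delta > 0$ such that $\lVert A - V \rVert < \delta$ implies $\det A \neq 0$, and hence
$$
P(X_n \text{ is invertible}) \;\geq\; P\big(\lVert X_n - V \rVert < \delta\big) \;\longrightarrow\; 1,
$$
so $X_n$ is invertible with probability tending to one, but not necessarily with probability $1$ for any fixed $n$.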

Here is an image of the statement and proof; my issue is at the end.

