
I wonder if there is a complete list of regularity conditions for MLE asymptotic normality.

I read this post and found a list of 6 conditions but the answer does not include any reference. I read the book "Approximation theorems of mathematical statistics" and found only 3 conditions. I read other books as well but no book has a complete list so far.

EDIT: I am interested in the multidimensional parametric case.

I am hoping to find a complete list. Thank you very much for your help!

wut

1 Answer


The post you reference lists conditions for deriving the asymptotic distribution of a one-parameter likelihood ratio. The difficulty with compiling a definitive list of conditions is that we often start from primitive conditions under which a result holds and gradually generalize to weaker conditions that together imply those primitives. So you may encounter several lists that at first glance look different but ultimately boil down to the same primitives. Another issue with the MLE is the level of generality at which you want to work. Are you concerned only with parametric MLE? Do you want to cover all M-estimators? What about nonparametric MLE, such as sieve estimators, which nest traditional MLE? Different approaches lead to different assumptions.

Going back to your question, I think the most natural route to conditions for asymptotic normality of the MLE is through M-estimators. I wrote an answer covering these assumptions: I give six primitive conditions and later discuss other conditions that imply them. The conditions break down into two groups: assumptions on the criterion function and assumptions on the estimator.
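For reference, the conclusion such assumptions deliver is the usual sandwich form (notation here is schematic: $\dot\ell_\theta$ is the score of the criterion and $V_{\theta_0}$ the expected Hessian at the truth):

$$\sqrt{n}\,\big(\hat\theta_n - \theta_0\big) \;\rightsquigarrow\; N\!\left(0,\; V_{\theta_0}^{-1}\,\mathrm{E}\!\left[\dot\ell_{\theta_0}\dot\ell_{\theta_0}^{\top}\right] V_{\theta_0}^{-1}\right),$$

and for a correctly specified parametric MLE the information equality collapses the sandwich to $N\big(0,\, I(\theta_0)^{-1}\big)$.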

I suggest reading Van der Vaart, Chapter 5, for details and proofs related to M-estimators.
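If it helps to see the phenomenon numerically, here is a minimal simulation sketch (the distribution and all parameter values are my own illustrative choices, not from the question): for i.i.d. Exponential data the MLE of the rate is one over the sample mean, the Fisher information is $1/\text{rate}^2$, so $\sqrt{n}(\hat\theta_n - \theta_0)$ should be approximately $N(0, \text{rate}^2)$.

```python
import numpy as np

# Illustrative check of MLE asymptotic normality for Exponential(rate) data.
# The MLE of the rate is 1 / sample mean; the Fisher information is 1/rate^2,
# so sqrt(n) * (mle - rate) should be approximately N(0, rate^2) for large n.
rng = np.random.default_rng(0)
rate, n, reps = 2.0, 5000, 2000

samples = rng.exponential(scale=1.0 / rate, size=(reps, n))
mles = 1.0 / samples.mean(axis=1)      # MLE of the rate in each replication
z = np.sqrt(n) * (mles - rate)         # centered-and-scaled estimates

# The empirical mean of z should be near 0 and its standard deviation near
# rate = 2.0 (the square root of the inverse Fisher information).
print(z.mean(), z.std())
```

Plotting a histogram of `z` against the $N(0, \text{rate}^2)$ density makes the convergence visually obvious.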

Ariel
  • Thank you very much for your answer. I have some thoughts and responses. First, I noticed that the answer you provided mentions only 4 conditions instead of 6. Second, to be honest, I am not familiar with M-estimators. I read the Van der Vaart book, but there is no definition of an extremum estimator as you wrote in the referenced answer; I wonder if there is a reference for that. Third, I am interested in the multivariate parametric case. I really hope you can help. Thank you! – wut Jun 18 '21 at 03:34
  • Please read my answer carefully; there are indeed six assumptions given. The first four refer to the criterion function; the next two are not numbered but are on the estimator. I also give a definition of M-estimators at the very beginning of my answer, and Van der Vaart provides an equivalent definition on pg. 45. Finally, parametric MLE is nicely subsumed by M-estimators; my answer shows how this is done, and Example 5.3 in Van der Vaart gives the same procedure. – Ariel Jun 18 '21 at 03:51
  • Thank you very much for your comment. I am sorry, I only skimmed the book and your comment. I still have another problem: I am confused by your discussion of Assumption 5. You wrote "Indeed, it should be clear that asymptotic normality implies consistency", but then you use this assumption in the proof of asymptotic normality. Finally, could you explain a bit more about Assumption 4? – wut Jun 18 '21 at 04:03
  • Asymptotic normality implies consistency, but consistency does not imply asymptotic normality. So we need consistency of our estimator if we hope to prove that it is asymptotically normal; consistency is often established in this setting with a law of large numbers. Assumption 4 gives us a way to estimate the second derivative of the criterion. The proof of asymptotic normality requires a second-order Taylor expansion, and this assumption gives us control of the final term. – Ariel Jun 18 '21 at 04:10
  • Thank you. But I am still confused about why you go from "asymptotic normality implies consistency" to requiring consistency in the proof of asymptotic normality. And for Assumption 4, could you please explain why $H_0$ needs to be invertible and $H(\theta)$ needs to be continuous? – wut Jun 18 '21 at 04:25
  • This is a basic fact of asymptotic theory. Here is a proof: https://math.stackexchange.com/q/409429/606804 – Ariel Jun 18 '21 at 04:29
  • The additional assumptions you mention can be used to imply Condition 4. The need for invertibility should be clear from the asymptotic variance. I would recommend working through a proof of asymptotic normality of M-estimators to make the need for these assumptions concrete. You should be able to find several proofs online; these notes in particular might be helpful: http://web.stanford.edu/~doubleh/lecturenotes/lecture2.pdf – Ariel Jun 18 '21 at 04:31
  • It might also be worth pointing out more explicitly that these are *sufficient* conditions, and are not trying to be *necessary* conditions (except in the sense that they're necessary for a particular proof approach). For example, the MLE for the Laplace distribution is consistent and asymptotically normal despite the loglikelihood not being differentiable at $\theta_0$. (I don't know of any useful set of necessary conditions.) – Thomas Lumley Jun 18 '21 at 05:31
  • Thank you very much @ThomasLumley, yes, it makes better sense to think about _sufficient_ conditions for the mentioned assumption of consistency. Ariel, thank you for your help. Though I am still quite confused, I will try to read up on M-estimators more carefully. – wut Jun 18 '21 at 12:23
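To make the Laplace example in the comments concrete, here is a small simulation sketch (the parameter values are mine, purely illustrative): the MLE of the Laplace location is the sample median, the log-likelihood has a kink there, and yet $\sqrt{n}(\hat\theta_n - \theta_0)$ still looks Gaussian, with variance $1/(4 f(\theta_0)^2) = \text{scale}^2$.

```python
import numpy as np

# Illustrative simulation: for Laplace(loc, scale=1) data the MLE of loc is
# the sample median, even though the log-likelihood is not differentiable
# there. The asymptotic variance of sqrt(n)*(median - loc) is
# 1/(4 f(loc)^2) = scale^2 = 1 here, so the centered-and-scaled estimates
# should look like N(0, 1).
rng = np.random.default_rng(1)
loc, n, reps = 0.0, 5001, 2000         # odd n: the median is a data point

samples = rng.laplace(loc=loc, scale=1.0, size=(reps, n))
z = np.sqrt(n) * (np.median(samples, axis=1) - loc)

print(z.mean(), z.std())               # should be roughly 0 and 1
```

So differentiability of the criterion is an artifact of the particular proof strategy, not of the result itself; weaker smoothness conditions (e.g. Lipschitz criteria, as in Van der Vaart's Theorem 5.23) cover this case.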