
This is an understanding question regarding the method of moments. Its basic idea is that the sample (empirical) $k^{\text{th}}$ moment $m_{k}$ converges in probability to the corresponding population moment, as follows:

$$m_{k} = \frac{1}{n} \sum_{i=1}^{n} X_{i}^k \rightarrow^{p} E(X_{i}^k) = \mu_{k}(\theta)$$

where the $X_{i}$ are the random variables in the sample. (How can I display the convergence better? I would edit it if someone has a suggestion.)
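As an aside on the display question: one common option, assuming the amsmath package is loaded, is `\xrightarrow{p}`, which places the $p$ above the arrow:

```latex
% Requires \usepackage{amsmath}
\frac{1}{n}\sum_{i=1}^{n} X_{i}^{k} \xrightarrow{\,p\,} \mathrm{E}\!\left(X_{i}^{k}\right) = \mu_{k}(\theta)
% An alternative is \overset{p}{\longrightarrow}
```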

Now my question is simple: Why is that true? Can someone prove this?

My best guess was to use the continuous mapping theorem, but what continuous function $g(x)$ would describe this mapping? My idea was a function that "squares every value $X_i$" (or, for general $k$, raises it to the $k$-th power), but I am not sure whether that is a continuous function, nor do I know whether the claim is true in general.
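For concreteness, here is a minimal numerical sketch of the claimed convergence, assuming i.i.d. draws from an Exp(1) distribution (whose $k$-th population moment is $k!$):

```python
import numpy as np

# Sketch: sample k-th moment of Exp(1) draws versus the population value.
# For X ~ Exp(1), E(X^k) = k!, so the 3rd population moment is 6.
rng = np.random.default_rng(0)
k = 3
population_moment = 6.0  # 3! for Exp(1)

for n in (10**2, 10**4, 10**6):
    x = rng.exponential(scale=1.0, size=n)
    sample_moment = np.mean(x**k)  # (1/n) * sum of X_i^k
    print(f"n = {n:>7}: sample m_3 = {sample_moment:.4f} (population mu_3 = {population_moment})")
```

As the comments below point out, what makes the sample moment settle near $3! = 6$ as $n$ grows is the law of large numbers applied to the variables $X_i^k$, not the CLT.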

  • I have changed your question because it is $X^k_i$ that converges to $E(X^k_i)$. Note that it is not stated but the $X_i$ must be identically distributed and the kth moment must exist. Also it is the sample estimate of the kth moment that converges to the kth moment of the population distribution and not the other way around. – Michael R. Chernick Jan 16 '17 at 06:40
  • Doesn't this follow from Central Limit Theorem? i.e. as the sample-size increases, sample-parameters get closer and closer to population-parameters? – Ujjwal Kumar Jan 16 '17 at 07:26
  • Possible duplicate of [What is the logic behind method of moments?](http://stats.stackexchange.com/questions/129183/what-is-the-logic-behind-method-of-moments) – Xi'an Jan 16 '17 at 11:33
  • @UjjwalKumar: what you describe is the Law of Large Numbers. The CLT is not needed here. – Xi'an Jan 16 '17 at 11:34
  • @Xi'an I don't think so. It doesn't really explain why the theoretical and the so-called sample moments are approximately the same when n goes to infinity. – Peter Series Jan 16 '17 at 11:51

0 Answers