This is an understanding question about the method of moments. Its basic idea is that the sample $k^{th}$ moment $m_{k}$ converges in probability to the corresponding population moment:
$$m_{k} = \frac{1}{n} \sum_{i=1}^{n} X_{i}^k \xrightarrow{p} E(X_{1}^k) = \mu_{k}(\theta)$$
where $X_{1}, \dots, X_{n}$ are i.i.d. random variables.
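As a numerical sanity check, here is a minimal simulation sketch (assuming NumPy; the exponential distribution and the choice $k = 2$ are arbitrary illustrations on my part, using that $E(X_{1}^k) = k!$ for a rate-one exponential):

```python
import numpy as np

# Empirical check: for X ~ Exponential(1), E[X^k] = k!, so E[X^2] = 2.
rng = np.random.default_rng(seed=0)
k = 2
true_moment = 2.0

for n in (10**2, 10**4, 10**6):
    x = rng.exponential(scale=1.0, size=n)
    m_k = (x**k).mean()  # sample k-th moment: (1/n) * sum_i X_i^k
    print(f"n = {n:>8}: m_k = {m_k:.4f}, |m_k - mu_k| = {abs(m_k - true_moment):.4f}")
```

The gap between the sample moment and the population moment shrinks as $n$ grows, which is consistent with the claimed convergence in probability, but a simulation is of course not a proof.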
Now my question is simple: Why is that true? Can someone prove this?
My best guess was to use the continuous mapping theorem, but which continuous function $g(x)$ would describe this mapping? My idea was the map $g(x) = x^k$, which raises every value $X_{i}$ to the $k^{th}$ power (squaring when $k = 2$), but I am not sure whether this is a continuous function, nor do I know whether the statement holds in general.