Questions tagged [method-of-moments]

A method of parameter estimation by equating sample and population moments then solving the equations for the unknown parameters.

Method of Moments estimation (MoM or MME) is an approach to parameter estimation which equates sample and population moments to solve for unknown parameters.

Loosely, the Law of Large Numbers suggests that as sample sizes become sufficiently large, the sample moments will approach the corresponding population moments; this is sometimes used to justify the MoM as a general approach.

While less commonly used than maximum likelihood estimation (MLE) – for example, it is often less efficient – it can still be quite useful in many situations.
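As a concrete illustration of the approach described above, here is a minimal sketch of method-of-moments estimation for a gamma distribution (a worked example added for illustration, not taken from the tag wiki). Matching the first two sample moments to the population moments $E[X] = k\theta$ and $\mathrm{Var}[X] = k\theta^2$ gives closed-form estimators for the shape $k$ and scale $\theta$:

```python
# Method-of-moments estimation for the gamma distribution (shape k, scale theta).
# Population moments: E[X] = k*theta, Var[X] = k*theta^2, so solving the two
# moment equations gives k_hat = xbar^2 / s^2 and theta_hat = s^2 / xbar.
import numpy as np

def gamma_mom(x):
    """Estimate (shape, scale) of a gamma distribution by equating
    the first two sample moments to their population counterparts."""
    xbar = x.mean()
    s2 = x.var()  # 1/n variance; either variance convention works asymptotically
    return xbar**2 / s2, s2 / xbar  # (k_hat, theta_hat)

# Simulate data with known parameters and recover them via MoM.
rng = np.random.default_rng(0)
sample = rng.gamma(shape=3.0, scale=2.0, size=100_000)
k_hat, theta_hat = gamma_mom(sample)
```

No numerical optimization is needed, which is one practical appeal of MoM over MLE here: the gamma MLE for the shape parameter has no closed form and must be solved iteratively.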

174 questions
58
votes
8 answers

Examples where method of moments can beat maximum likelihood in small samples?

Maximum likelihood estimators (MLE) are asymptotically efficient; we see the practical upshot in that they often do better than method of moments (MoM) estimates (when they differ), even at small sample sizes. Here 'better than' means in the sense…
Glen_b
26
votes
1 answer

Link between moment-generating function and characteristic function

I am trying to understand the link between the moment-generating function and characteristic function. The moment-generating function is defined as: $$ M_X(t) = E(\exp(tX)) = 1 + \frac{t E(X)}{1} + \frac{t^2 E(X^2)}{2!} + \dots + \frac{t^n…
26
votes
5 answers

Maximum Likelihood Estimation -- why it is used despite being biased in many cases

Maximum likelihood estimation often results in biased estimators (e.g., its estimate for the sample variance is biased for the Gaussian distribution). What then makes it so popular? Why exactly is it used so much? Also, what in particular makes it…
Minaj
23
votes
3 answers

What is the logic behind method of moments?

Why, in the "Method of Moments", do we equate sample moments to population moments to find a point estimator? What is the logic behind this?
user31466
20
votes
4 answers

What exactly are moments? How are they derived?

We are typically introduced to method of moments estimators by "equating population moments to their sample counterpart" until we have estimated all of the population's parameters; so that, in the case of a normal distribution, we would only need…
16
votes
1 answer

When do maximum likelihood and method of moments produce the same estimators?

I was asked this question the other day and had never considered it before. My intuition comes from the advantages of each estimator. Maximum likelihood is preferable when we are confident in the data generating process because, unlike the method of…
16
votes
3 answers

What is the Method of Moments and how is it different from MLE?

In general it seems like the method of moments is just matching the observed sample mean, or variance to the theoretical moments to get parameter estimates. This is often the same as MLE for exponential families, I gather. However, it's hard to…
frelk
15
votes
1 answer

What is the difference/relationship between method of moments and GMM?

Can someone explain to me the difference between the method of moments and GMM (generalized method of moments), their relationship, and when one or the other should be used?
Vivi
14
votes
1 answer

Is ANOVA relying on the method of moments and not on the maximum likelihood?

I see mentioned in various places that ANOVA does its estimation using the method of moments. I am confused by that assertion because, even though I am not familiar with the method of moments, my understanding is that it is something different from…
amoeba
11
votes
2 answers

Explaining generalized method of moments to a non-statistician

How do I explain the Generalized Method of Moments and how it is used to a non-statistician? So far I am going with: it is something we use to estimate conditions such as averages and variation based on samples we have collected. How do I explain the…
user3084006
11
votes
2 answers

How do I know which method of parameter estimation to choose?

There are quite a few methods for parameter estimation out there. MLE, UMVUE, MoM, decision-theoretic, and others all seem like they have a fairly logical case for why they are useful for parameter estimation. Is any one method better than the…
9
votes
2 answers

Parameter estimates for the triangular distribution

A question was posted here (now deleted) in relation to estimating the parameters of the triangular distribution, which has density $$f(x;a,b,c)=\begin{cases} \quad 0 & \text{for } x < a, \\ \frac{2(x-a)}{(b-a)(c-a)} & \text{for } a \le x \le…
8
votes
2 answers

Derivation of the Satterthwaite approximation

Using the method of moments, one can try to approximate a sum of $\chi_{r}^{2}$ variables, $\sum a_{i}Y_{i}$, by equating the $n$-th moments of the sample with the $n$-th moments of the population, and "solving" for the parameters this way. However…
Bombyx mori
8
votes
2 answers

Real life uses of Moment generating functions

In most basic probability theory courses you're told that moment generating functions (m.g.f.) are useful for calculating the moments of a random variable, in particular the expectation and variance. Now in most courses the examples they provide for…
8
votes
1 answer

Gaussian Mixture and Method of Moments

Given solely the first $n$ moments $m_1,\dots,m_n$ of a random variable $X\in\mathbb{R}$, I was wondering whether there exists a direct methodology to approximate $X$ with a Gaussian mixture?