
I see that when we talk about an estimator, we always have an "of something", where "something" refers to a fixed parameter. For example, we say that the sample mean is an estimator of the population mean (a fixed parameter), or that the sample variance is an estimator of the population variance (again, a fixed parameter).

However, it seems to me that the "of something" disappears in the definition of M-estimators.

In 1964, Peter J. Huber proposed generalizing maximum likelihood estimation to the minimization of $$\sum_{i=1}^{n}\rho(x_{i},\theta),$$ where $\rho$ is a function with certain properties. The solutions $$\hat{\theta}=\arg\min_{\theta}\sum_{i=1}^{n}\rho(x_{i},\theta)$$ are called M-estimators ("M" for "maximum likelihood-type"; Huber, 1981, p. 43).

Here, $\theta$ is a parameter, but not a fixed one (otherwise the argmin would make no sense). This confuses me, because I do not see an "of something" in the definition. Specifically, what is the $\hat{\theta}$ above an M-estimator of?
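For concreteness, here is a minimal sketch of my own (assuming Huber's $\rho$ for the location case, with the conventional tuning constant $k = 1.345$): I can compute $\hat{\theta}$ numerically without ever naming an estimand.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def huber_rho(u, k=1.345):
    # Huber's rho: quadratic near zero, linear in the tails
    return np.where(np.abs(u) <= k, 0.5 * u**2, k * np.abs(u) - 0.5 * k**2)

def m_estimate(x, k=1.345):
    # theta-hat = argmin over theta of sum_i rho(x_i - theta)
    return minimize_scalar(lambda theta: np.sum(huber_rho(x - theta, k))).x

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95),
                    rng.normal(20.0, 1.0, 5)])  # 5% gross outliers
print(m_estimate(x))  # close to 0, unlike the sample mean np.mean(x) (about 1.0)
```

The code returns a number, but nothing in it says what that number is an estimate *of*.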

Could you explain why we do not have an "of something" here, or what might be wrong with my understanding?

TrungDung
  • If, for each allowable $\theta,$ there exists a probability limit of $\hat\theta$ as the sample size $n$ grows large, then that limit is a property of the distribution denoted by $\theta$ and you could consider it to be the estimand. But an estimator need not have a designated estimand or even a unique estimand. For instance (fixing $n$), the mean of a sample from a distribution of mean $\theta$ also estimates $\theta + 1/n$ (and does so very well for large $n$; see the sketch after these comments). What it all comes down to, then, is: how do *you* define an "estimator"? – whuber Feb 09 '22 at 21:18
  • @whuber: Yes, it is not easy to define an "estimator". In textbooks, a function of the observed data is used to define an "estimator of ...". However, to me, an estimand, i.e. a fixed parameter, must exist first before we can talk about estimation, an estimate, or an estimator. We can only say "we estimate something"; we cannot say "we estimate" (full stop). Therefore, I think that when we talk about an estimator, it should be an "estimator of something" (although it could be a good or a bad estimator). As for "that limit ... consider it to be the estimand": in this case the estimand appears after the M-estimator. Is that logical? – TrungDung Feb 09 '22 at 21:43
  • Although that's the standard usage, the definition of an estimator is broader than that. Estimators are ultimately evaluated by assessing risks (or expected risks using a Bayes prior). Thus, the loss function implicitly determines the estimand, if you must have one: but in some applications, such as when what matters is how the estimator is used for making a decision or taking some action, one might not care what the estimand is. You still haven't offered a definition of "estimator": you have given only an intuitive characterization. What *definition* are you working with? – whuber Feb 09 '22 at 21:47
  • I took it from here: "Any statistic (known function of observable random variables that is itself a random variable) whose values are used to estimate $\tau(\theta)$, where $\tau(\cdot)$ is some function of the parameter $\theta$, is defined to be an estimator **of** $\tau(\theta)$." Source: https://www.fulviofrisone.com/attachments/article/446/Introduction%20to%20the%20theory%20of%20statistics%20by%20MOOD.pdf, page 273. We can also find this definition on Wikipedia. – TrungDung Feb 10 '22 at 08:25
  • That's a good start. So, given a putative estimator, what is to prevent someone from claiming it estimates the median of distribution $\theta,$ for instance? Nothing at all, because it has no effect on the *estimator.* In other words, it appears that $\tau(\theta)$ in this quotation has no actual role to play! – whuber Feb 10 '22 at 14:53
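
To illustrate whuber's point numerically, here is a small simulation sketch (assuming samples from $N(\theta, 1)$): the sample mean is essentially as good an "estimator of $\theta + 1/n$" as it is an "estimator of $\theta$", and the distinction vanishes as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 5.0
for n in [10, 100, 1000]:
    # 2000 replications of the sample mean of n draws from N(theta, 1)
    means = rng.normal(theta, 1.0, size=(2000, n)).mean(axis=1)
    err_theta = np.mean(np.abs(means - theta))            # error as an estimator of theta
    err_shift = np.mean(np.abs(means - (theta + 1 / n)))  # error as an estimator of theta + 1/n
    print(n, err_theta, err_shift)
```

Both error columns shrink toward zero at the same $1/\sqrt{n}$ rate, so the data alone cannot tell the two candidate estimands apart.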

0 Answers