I see that when talking about estimators, we have "of something", where "something" refers to a fixed parameter. For example, we say that the sample mean is an estimator of the population mean (a fixed parameter), or that the sample variance is an estimator of the population variance (again, a fixed parameter).
However, it seems to me that "of something" disappears in the definition of M-estimators.
In 1964, Peter J. Huber proposed generalizing maximum likelihood estimation to the minimization of $$\sum\limits_{i=1}^{n}\rho(x_{i},\theta)$$ where $\rho$ is a function with certain properties (see below). The solutions $$\hat{\theta}=\arg\min_{\theta}\sum _{i=1}^{n}\rho (x_{i},\theta)$$ are called M-estimators ("M" for "maximum likelihood-type" (Huber, 1981, page 43)).
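To make the definition concrete, here is a minimal numerical sketch (my own illustration, not from Huber): with $\rho(x,\theta)=(x-\theta)^2$ the minimizer is the sample mean, and with Huber's $\rho$ (quadratic near zero, linear in the tails) we get a robust alternative. The data, the tuning constant $k=1.345$, and the helper names are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=1.0, size=200)  # toy sample

def m_estimate(rho, data):
    """Return argmin_theta sum_i rho(x_i, theta)."""
    return minimize_scalar(lambda theta: np.sum(rho(data, theta))).x

# Squared loss: the M-estimator reduces to the sample mean.
rho_sq = lambda xs, t: (xs - t) ** 2

# Huber's rho with tuning constant k = 1.345 (a common choice).
def rho_huber(xs, t, k=1.345):
    r = np.abs(xs - t)
    return np.where(r <= k, 0.5 * r**2, k * (r - 0.5 * k))

theta_sq = m_estimate(rho_sq, x)
theta_hub = m_estimate(rho_huber, x)

print(theta_sq, x.mean())  # numerically (almost) identical
print(theta_hub)           # close to the mean for clean Gaussian data
```

In both cases the minimizer targets the same location parameter of the underlying distribution, which is where my question about the missing "of something" comes in.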
Here, $\theta$ is a parameter, but it is not fixed (otherwise the argmin would make no sense). This confuses me, because I do not see the "of something" in the definition. Specifically, what is $\widehat{\theta}$ in the definition above an M-estimator of?
Could you explain why there is no "of something" here, or point out what might be wrong with my understanding?