
The following is a homework question from an undergrad intro stats class.

"Suppose we have a random sample $Y_1, \dots, Y_n$ from the shifted exponential distribution $$ f(y|\theta) = \begin{cases} e^{-(y-\theta)} & y \geq \theta \\ 0 & y < \theta. \end{cases} $$ Find the MLE for $\text{Var}(Y)$, where $Y \sim f$."

Now I was under the impression that MLEs are only defined for the explicit parameters $\theta$ of the distribution, whereas here $\text{Var}(Y) = 1$ is not a parameter of $f$ and indeed doesn't even depend on $\theta$. So in my view asking for the MLE of $\text{Var}(Y)$ is not a well-posed question.

On the other hand, my friend argued that the MLE for $\text{Var}(Y)$ does exist and is equal to $1$. His reasoning was that the likelihood function is not defined except when $\text{Var}(Y) = 1$, and so $1$ is trivially the maximum likelihood estimate.

Whose view is more correct? And as a side question, do you think this question is too focused on a technical detail of maximum likelihood estimation to be included on a first homework about MLEs?

tddevlin

1 Answer


Well, it does look like a strange question for a first homework set on MLEs. But, technically, the answer your friend gave, $1$ (the constant function $1$), is correct. Maximum likelihood is not only defined for "explicit parameters", whatever that means. We can define a (finite-dimensional) parameter of a distribution as a functional on the distributions in the given family, $\theta = \theta(F)$. In your example the parameter is $\sigma^2 = \sigma^2(F)$, the functional that assigns to each distribution function $F$ the value $\DeclareMathOperator{\E}{\mathbb{E}} \sigma^2(F) = \E \left( X - \E(X) \right)^2$, where $X$ is a random variable with distribution $F$. Since $\sigma^2(F_\theta) = 1$ for every $\theta$ in this family, the MLE of $\sigma^2$ is the constant $1$, whatever the MLE $\hat\theta$ turns out to be.
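As a quick numerical sanity check, here is a minimal sketch (the true shift $\theta = 2$ and sample size are assumed values, not from the question): the MLE of $\theta$ is the sample minimum, while the variance functional evaluated at the fitted distribution is $1$ regardless of $\hat\theta$.

```python
import numpy as np

# Shifted exponential: f(y|theta) = exp(-(y - theta)) for y >= theta.
# Var(Y) = 1 for every theta, so the MLE of Var(Y) is the constant 1.

rng = np.random.default_rng(0)
theta_true = 2.0        # assumed value for illustration
n = 100_000
y = theta_true + rng.exponential(scale=1.0, size=n)  # sample from f(y|theta)

theta_hat = y.min()     # MLE of the shift parameter theta
var_mle = 1.0           # sigma^2(F_theta_hat) = 1, whatever theta_hat is

print(theta_hat)        # close to theta_true = 2
print(np.var(y))        # sample variance, close to 1
```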

For the invariance property of MLEs, which is what is at work here, see Invariance property of maximum likelihood estimator?

kjetil b halvorsen