
Here is an excerpt from one of the stats books I have been reading:

[Image: excerpt from the book stating the invariance property of the maximum likelihood estimator.]

But as a counterexample, suppose we have five possible values for $\theta$, that $\theta_5$ is the ML estimate with likelihood 0.4, and that we have a function $f(\theta)$ like this:

[Image: a table listing the five candidate values $\theta_1, \dots, \theta_5$, their likelihoods, and the corresponding values of $f(\theta)$, with $f(\theta_5) = 0$.]

Clearly $\widehat{f(\theta)} = 1 \ne 0 = f(\hat{\theta})$.

qed
    Please explain how you derive your last statement. It seems to come from some assumption of additivity of likelihood but the context in which that would be true is not at all clear to me. – whuber Dec 26 '13 at 19:56
  • Find the sufficient estimator for the parameter in the Poisson distribution – Sai Brunda Dec 13 '18 at 11:39
  • @SaiBrunda, that doesn't seem to answer anything about the question. Could you please elaborate the answer somewhat? – cherub Dec 13 '18 at 11:58

1 Answer


You seem to be confused about what happens to the likelihood when the parameter is transformed: in general, the values of the likelihood function do not change. To illustrate, let $L(\theta; x)$ be a likelihood function and let $\lambda = g(\theta)$, where $g$ is one-to-one. Then the likelihood function parameterized in terms of $\lambda$ is

$$L^*(\lambda; x) = L(g^{-1}(\lambda) ;x) = L(\theta;x)$$
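
For concreteness, here is a quick numerical sketch (my own illustration, not from the original answer) with an exponential model, where $g(\theta) = 1/\theta$ maps the rate to the mean:

```python
# One-to-one reparameterization: the likelihood takes the same value at
# corresponding parameter points. Here g(theta) = 1/theta sends an
# exponential rate to the mean. Data and parameter values are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=50)   # sample with true mean 2 (rate 0.5)

def loglik_rate(theta):
    # Exponential log-likelihood in the rate parameterization
    return len(x) * np.log(theta) - theta * x.sum()

def loglik_mean(lam):
    # Same model in the mean parameterization: L*(lam; x) = L(g^{-1}(lam); x)
    return loglik_rate(1.0 / lam)

theta = 0.7                    # an arbitrary rate value
print(loglik_rate(theta))      # these two values are identical,
print(loglik_mean(1 / theta))  # since L*(lambda; x) = L(theta; x)
```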

Some trouble occurs when we want to use a function $g$ that isn't one-to-one. In that case we define the likelihood function parameterized in terms of $\lambda$ through the profile likelihood:

$$L^*(\lambda; x) = \sup_{\theta: g(\theta) = \lambda} L(\theta; x)$$
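
As a minimal sketch of this definition (again my own toy setup, not from the answer: $g(\theta) = \theta^2$ with a single observation from $N(\theta, 1)$), the supremum runs over the two preimages $\pm\sqrt{\lambda}$:

```python
# Profile likelihood for the non-one-to-one map g(theta) = theta^2, with
# one observation x ~ N(theta, 1). All numbers here are illustrative.
import numpy as np
from scipy.stats import norm

x = 1.5                                  # a single observed data point

def L(theta):
    # Likelihood of theta given x ~ N(theta, 1)
    return norm.pdf(x, loc=theta, scale=1.0)

def L_star(lam):
    # L*(lam; x) = sup over {theta : theta^2 = lam} of L(theta; x);
    # the preimage of lam is {+sqrt(lam), -sqrt(lam)}, so the sup is a max.
    return max(L(np.sqrt(lam)), L(-np.sqrt(lam)))

lams = np.linspace(0.0, 9.0, 901)        # grid over lambda = theta^2
lam_hat = lams[np.argmax([L_star(l) for l in lams])]
print(lam_hat, x**2)                     # approx. 2.25 for both: the maximizer
                                         # of L* sits at g(theta_hat) = x^2
```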

Using these definitions in your example (with $g = f$): if $L(\theta_5; x) = 0.4$ then $L^*(0; x) = 0.4$ as well, and the actual likelihood values do not change. We also see that $g(\theta_5) = 0$, so there is no contradiction. I'll leave the general proof that the MLE is invariant to parameter transformations to the interested reader.
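
To make that concrete, here is a quick check of the discrete example. Only $L(\theta_5; x) = 0.4$ and $f(\theta_5) = 0$ come from the question; the other four likelihoods and $f$ values are hypothetical fill-ins chosen for illustration:

```python
# Check of the question's discrete example. Only L(theta5) = 0.4 and
# f(theta5) = 0 come from the thread; the other likelihoods (0.15 each)
# and f values (1) are hypothetical, chosen purely for illustration.
L = {"theta1": 0.15, "theta2": 0.15, "theta3": 0.15, "theta4": 0.15, "theta5": 0.40}
f = {"theta1": 1, "theta2": 1, "theta3": 1, "theta4": 1, "theta5": 0}

theta_hat = max(L, key=L.get)          # MLE of theta: theta5

# Profile likelihood: L*(lam) = sup of L(theta) over {theta : f(theta) = lam}.
L_star = {lam: max(L[t] for t in L if f[t] == lam)
          for lam in sorted(set(f.values()))}

lam_hat = max(L_star, key=L_star.get)  # MLE of lam = f(theta)
print(L_star)                          # {0: 0.4, 1: 0.15}
print(lam_hat, f[theta_hat])           # 0 0 -> the maximizer of L* is f(theta_hat)
```

Note the supremum in the definition of $L^*$: likelihood values at different $\theta$'s are compared, never added. Summing them across the $\theta$'s with $f(\theta) = 1$ is what would make 1 look like the ML estimate of $f(\theta)$, but that operation has no likelihood justification.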

Samuel Benidt