
Say I'm interested in estimating the variance $\sigma^2$ of a normal population, which has a Cramér-Rao lower bound of $\frac{2\sigma^4}{n}$ if my calculations are correct. I found that, when the mean $\mu$ is known, the uniformly minimum variance unbiased estimator (UMVUE) of $\sigma^2$ should be $\frac{1}{n} \sum_{i = 1}^n (x_i - \mu)^2$, using the equality condition of the Cauchy-Schwarz inequality. Does this mean that if the population mean is unknown, then I have no UMVUE?
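
As a check on the bound itself, one possible calculation, assuming an i.i.d. sample $X_1, \dots, X_n \sim N(\mu, \sigma^2)$ and writing $\theta = \sigma^2$:

$$
\frac{\partial}{\partial \theta} \log f(x \mid \mu, \theta) = -\frac{1}{2\theta} + \frac{(x - \mu)^2}{2\theta^2},
\qquad
I_1(\theta) = \operatorname{E}\left[\left(\frac{\partial}{\partial \theta} \log f(X \mid \mu, \theta)\right)^{2}\right] = \frac{1}{2\theta^2} = \frac{1}{2\sigma^4},
$$

so the bound for unbiased estimators of $\sigma^2$ based on $n$ observations is $\frac{1}{n I_1(\theta)} = \frac{2\sigma^4}{n}$ (the same bound applies when $\mu$ is unknown, since the Fisher information matrix of $(\mu, \sigma^2)$ is diagonal). When $\mu$ is known, $\sum_{i=1}^n (X_i - \mu)^2 / \sigma^2 \sim \chi^2_n$, so $\operatorname{Var}\left(\frac{1}{n} \sum_{i=1}^n (X_i - \mu)^2\right) = \frac{2\sigma^4}{n}$, and that estimator attains the bound.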

I have verified that $\frac{1}{n-1} \sum_{i = 1}^n (x_i - \bar{X})^2$ does not reach the CRLB stated above (although it achieves the bound asymptotically).
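
A sketch of that verification, under the same i.i.d. normal assumption and writing $S^2 = \frac{1}{n-1} \sum_{i = 1}^n (X_i - \bar{X})^2$:

$$
\frac{(n-1) S^2}{\sigma^2} \sim \chi^2_{n-1}
\quad \Longrightarrow \quad
\operatorname{Var}(S^2) = \frac{\sigma^4}{(n-1)^2} \cdot 2(n-1) = \frac{2\sigma^4}{n-1} > \frac{2\sigma^4}{n},
$$

while $\operatorname{Var}(S^2) \big/ \tfrac{2\sigma^4}{n} = \tfrac{n}{n-1} \to 1$, which is the asymptotic statement in parentheses.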

WavesWashSands
  • Sorry, I mean $\sigma^2$ is also supposed to be unknown... – WavesWashSands Oct 20 '16 at 06:32
  • 2
    When $\mu$ and $\sigma$ are unknown, $\frac{1}{n-1} \sum_{i = 1}^n (x_i - \bar{X})^2$ is the UMVUE, even though it does not reach the Cramer-Rao lower bound. Which is a lower bound, not a minimal value that some estimator should reach. – Xi'an Oct 20 '16 at 06:36
  • 2
    *Addendum:* Using the Cramer-Rao lower bound is a way to _prove_ an estimator is UMVUE, not a necessary property of the UMVUE. – Xi'an Oct 20 '16 at 06:41
  • I see, thanks. In that case, how can I show that $\frac{1}{n-1} \sum_{i = 1}^n (x_i - \bar{X})^2$ is the UMVUE? (Sorry but I don't seem to have the right to vote up comments in this SE.) – WavesWashSands Oct 20 '16 at 06:45
  • 2
    You can use the [Lehmann-Scheffé theorem.](https://en.wikipedia.org/wiki/Lehmann%E2%80%93Scheff%C3%A9_theorem) – Xi'an Oct 20 '16 at 10:14
  • https://stats.stackexchange.com/q/250917/119261 – StubbornAtom Nov 13 '21 at 14:59
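
For the record, a sketch of the Lehmann–Scheffé argument suggested in the comments, assuming an i.i.d. sample $X_1, \dots, X_n \sim N(\mu, \sigma^2)$ with both parameters unknown. The statistic

$$
T = \Bigl( \sum_{i=1}^n X_i, \; \sum_{i=1}^n X_i^2 \Bigr)
$$

is complete and sufficient for $(\mu, \sigma^2)$, because the normal family is a full-rank two-parameter exponential family. Since $S^2 = \frac{1}{n-1} \sum_{i=1}^n (X_i - \bar{X})^2$ is unbiased for $\sigma^2$ and is a function of $T$ alone, the Lehmann–Scheffé theorem gives that $S^2$ is the (essentially unique) UMVUE of $\sigma^2$, even though its variance $\frac{2\sigma^4}{n-1}$ lies strictly above the Cramér-Rao bound $\frac{2\sigma^4}{n}$.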

0 Answers