
Let $(X_1,...,X_n)$ be a random sample from a normal distribution $N(\mu,1)$.

$T=\frac{1}{n}\sum_{i=1}^{n}X_{i}^2-1$ is an unbiased estimator of $\mu^2$, since: $$E(T)=E\left(\frac{1}{n}\sum_{i=1}^{n}X_{i}^2-1\right)$$ $$=\frac{1}{n}\sum_{i=1}^{n}E(X_{i}^2)-1$$ $$=\frac{1}{n}\sum_{i=1}^{n}\left[Var(X_{i})+E(X_{i})^2\right]-1$$ $$=\frac{1}{n}\sum_{i=1}^{n}[1+\mu^2]-1=\mu^2$$
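As a sanity check on the unbiasedness claim, a small Monte Carlo simulation (with arbitrary illustrative choices $\mu=2$, $n=50$) gives a sample mean of $T$ close to $\mu^2=4$:

```python
import numpy as np

# Monte Carlo sanity check that E[T] is close to mu^2.
# mu, n, and the number of replications are arbitrary illustrative choices.
rng = np.random.default_rng(0)
mu, n, reps = 2.0, 50, 100_000
X = rng.normal(mu, 1.0, size=(reps, n))  # reps independent samples of size n from N(mu, 1)
T = (X ** 2).mean(axis=1) - 1            # T = (1/n) * sum_i X_i^2 - 1, one value per sample
print(T.mean())                          # close to mu^2 = 4
```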

My question is: can we prove that $T$ is not the MVUE for $\mu^2$ based only on comparing its variance with the CRLB for $\mu^2$? In other words, does the MVUE attain the CRLB (or at least in this case, where $\sigma^2$ is known)?

I have tried to find the variance of $T$ and compare it with the CRLB, but something is wrong. Here is my attempt:

The CRLB for $q(\theta)= \mu^2$ is $\frac{4\mu^2}{n}$
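For reference, here is a short derivation of that bound, using the standard CRLB formula for a differentiable function $q(\mu)$ with $\sigma^2=1$ known:

```latex
% Per-observation Fisher information for mu in N(mu,1): I_1(mu) = 1,
% so the sample information is n. With q(mu) = mu^2, q'(mu) = 2mu, hence:
\[
\mathrm{CRLB}\bigl(q(\mu)\bigr)
  = \frac{\bigl[q'(\mu)\bigr]^2}{n\,I_1(\mu)}
  = \frac{(2\mu)^2}{n\cdot 1}
  = \frac{4\mu^2}{n}.
\]
```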

$$Var(T)=Var\left(\frac{1}{n}\sum_{i=1}^{n}X_{i}^2-1\right)=Var\left(\frac{1}{n}\sum_{i=1}^{n}X_{i}^2\right)$$ $$=\frac{1}{n^2}\sum_{i=1}^{n} Var(X_{i}^2)$$

Now $Var(X_{i}^2)$ can be calculated as follows:

$$Var(X_{i}^2) = E[(X_{i}^2 - E(X_{i}^2))^2]= E[(X_{i}^2 - (1+\mu^2))^2]$$ $$=\int_{-\infty}^{+\infty}(x_{i}^2 - (1+\mu^2))^2f(x_{i},\mu)\,dx_{i}$$ $$=\int_{-\infty}^{+\infty}x_{i}^4f(x_{i},\mu)\,dx_{i} -2 (1+\mu^2)\int_{-\infty}^{+\infty}x_{i}^2f(x_{i},\mu)\,dx_{i}+(1+\mu^2)^2\int_{-\infty}^{+\infty}f(x_{i},\mu)\,dx_{i}$$ $$=\int_{-\infty}^{+\infty}x_{i}^4f(x_{i},\mu)\,dx_{i}-(1+\mu^2)^2$$ If we standardize the first term, we get $Var(X_{i}^2) = 3-(1+\mu^2)^2$, and then $Var(T)=\frac{3-(1+\mu^2)^2}{n}$, which could be negative! Any help, please.
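A quick Monte Carlo check (with arbitrary illustrative choices $\mu=2$, $n=50$) confirms that the closed form cannot be right: the simulated variance of $T$ is positive, while the formula gives a negative number for $\mu=2$:

```python
import numpy as np

# Monte Carlo estimate of Var(T) versus the closed form derived above.
# mu, n, and the number of replications are arbitrary illustrative choices.
rng = np.random.default_rng(1)
mu, n, reps = 2.0, 50, 100_000
X = rng.normal(mu, 1.0, size=(reps, n))  # reps independent samples of size n from N(mu, 1)
T = (X ** 2).mean(axis=1) - 1            # one realization of T per sample
claimed = (3 - (1 + mu ** 2) ** 2) / n   # the formula above: negative for mu = 2
print(T.var(), claimed)                  # the simulated variance is positive
```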

Bahgat Nassour
  • (1) $T$ is *explicitly* a function of $S(X)$! (2) Your calculations have numerous errors: please double-check them. (3) You can compute the variance much more easily. Start by computing the second and fourth moments of a Normal distribution and write the variance of $X^2$ in terms of them. – whuber Jun 03 '17 at 15:15
  • In light of the helpful edit that focuses this question on the issue of computing the variance of $X_i^2$, I have provided links to two threads that describe how to do that. To complete the answer you will need to find the moments of a normal distribution, as described at https://stats.stackexchange.com/questions/176702 or https://stats.stackexchange.com/questions/191902 – whuber Jun 03 '17 at 15:41
  • @whuber Thanks, but now my question is: based on Lehmann-Scheffé, is $T$ the MVUE, since it is unbiased and a function of the complete sufficient statistic?! I edited my calculations a little bit. – Bahgat Nassour Jun 03 '17 at 15:43
  • OK. I think it would help quite a bit to get the calculations right, though, for otherwise those errors are going to distract your readers from the question you really want answered. Indeed, after you compute the variance of $X_i^2$, you might easily be able to determine whether $T$ is an MVUE or not. – whuber Jun 03 '17 at 15:45
  • @whuber OK, I will try to get the calculations right, but this is now a fundamental question: if an estimator is unbiased and a function of the complete sufficient statistic, do I need to check its variance? I mean, is that enough to prove that the estimator is the MVUE, is that right? – Bahgat Nassour Jun 03 '17 at 15:50
  • 1
    Don't you answer that question yourself at https://stats.stackexchange.com/questions/280354/uniqueness-of-mvue? Please also visit the very closely related questions (maybe they're the same?) at https://stats.stackexchange.com/questions/152402 and https://stats.stackexchange.com/questions/188556. Also see the analysis of question (c) in the problem statement at https://stats.stackexchange.com/questions/164519. – whuber Jun 03 '17 at 15:59

0 Answers