4

Suppose $X_1, ..., X_4$ are i.i.d $\mathsf N(\mu, \sigma^2)$ random variables. Give the UMVUE of $\frac{\mu^2}{\sigma}$ expressed in terms of $\bar{X}$, $S$, integers, and $\pi$.

Here is a relevant question.

I first note that if $X_1,...,X_n$ are i.i.d $\mathsf N(\mu,\sigma^2)$ random variables having pdf

$$\begin{align*} f(x\mid\mu,\sigma^2) &=\frac{1}{\sqrt{2\pi\sigma^2}}\text{exp}\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)\\\\ &=\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{\mu^2}{2\sigma^2}}\text{exp}\left(-\frac{1}{2\sigma^2}x^2+\frac{\mu}{\sigma^2}x\right) \end{align*}$$

where $\mu\in\mathbb{R}$ and $\sigma^2\gt0$, then

$$T(\vec{X})=\left(\sum_{i=1}^n X_i^2, \sum_{i=1}^n X_i\right)$$

is a sufficient statistic, and it is also complete since the natural parameter space $$\left\{\left(-\frac{1}{2\sigma^2},\frac{\mu}{\sigma^2}\right):\mu\in\mathbb{R},\ \sigma^2\gt0\right\}=(-\infty,0)\times(-\infty,\infty)$$

contains an open set in $\mathbb{R}^2$.

I also note that the sample mean and sample variance are stochastically independent and so letting

$$\overline{X^2}=\frac{1}{n}\sum_{i=1}^n X_i^2$$

$$\overline{X}=\frac{1}{n}\sum_{i=1}^n X_i$$

we have

$$\mathsf E\left(\frac{\overline{X^2}}{S}\right)=\mathsf E\left(\overline{X^2}\right)\cdot\mathsf E\left(\frac{1}{S}\right)$$

It remains only to find $\mathsf E\left(\frac{1}{S}\right)$

We know that $$(n-1)\frac{S^2}{\sigma^2}\sim\chi_{n-1}^2$$

Hence

$$\begin{align*} \mathsf E\left(\frac{\sigma}{S\sqrt{3}}\right) &=\int_0^{\infty} \frac{1}{\sqrt{x}} \cdot\frac{1}{\Gamma(3/2)\,2^{3/2}}\cdot\sqrt{x}\cdot e^{-x/2}\,dx\\\\ &=\frac{1}{\Gamma(3/2)\,2^{3/2}}\int_0^{\infty} e^{-x/2}\,dx\\\\ &=\frac{2}{(\sqrt{\pi}/2)\cdot 2^{3/2}}=\frac{4}{\sqrt{\pi}\cdot2^{3/2}} \end{align*}$$

So $$\mathsf E\left(\frac{1}{S}\right)=\frac{4\sqrt{3}}{\sqrt{\pi}\cdot 2^{1.5}\cdot \sigma}$$
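As a quick sanity check of this value (a small simulation sketch in R; with $\sigma=1$ and $n=4$, the claim is $\mathsf E\left(\frac{1}{S}\right)=\sqrt{6/\pi}\approx 1.382$):

```r
# Monte Carlo check of E(1/S) for n = 4 and sigma = 1:
# theory says E(1/S) = 4*sqrt(3)/(sqrt(pi)*2^1.5) = sqrt(6/pi)
set.seed(42)
sims <- replicate(10^5, sd(rnorm(4, mean = 0, sd = 1)))  # 10^5 sample sds
mean(1 / sims)   # should be close to sqrt(6/pi) = 1.382
```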

But since $\mathsf E(S)\neq\sigma$ I don't think I can just plug in $S$ for $\sigma$ here.

I also have $\mathsf E\left(\overline{X^2}\right)=\mathsf{Var}\left(\overline{X}\right)+\mathsf E\left(\bar{X}\right)^2=\frac{\sigma^2}{4}+\mu^2$

Hence

$$\sigma=\sqrt{4\left(E\left(\overline{X^2}\right)-E\left(\overline{X}\right)^2\right)}=\sqrt{4\left(\overline{X^2}-\overline{X}^2\right)}$$

Hence the UMVUE of $\frac{\mu^2}{\sigma}$ is

$$\frac{4\sqrt{3}\cdot\overline{X^2}}{\sqrt{\pi}\cdot 2^{1.5}\cdot \sqrt{4\left(\overline{X^2}-\overline{X}^2\right)}}=\frac{\sqrt{\frac{3}{2\pi}}\left(\frac{S^2}{4}+\bar{X}^2\right)}{\sqrt{\frac{S^2}{4}}}$$

Is this a valid solution?

Remy
  • On what additional thing do you think the final result depends? It seems to me you can construe the constant multiple of $4\sqrt{3}/(\sqrt{\pi}2^{3/2}\sqrt{4})= \sqrt{3/(2\pi)}$ as "depending" only on $\pi$ and the integers $2$ and $3.$ The other part of the answer is a function of $\bar X$ and $S$ only. – whuber Oct 26 '18 at 20:37
  • That is a good point. The professor does say it's okay to have square roots. $\overline{X^2}$ does not depend on $\bar{X}$ though. – Remy Oct 26 '18 at 20:40
  • No, but as you noted earlier, it is a function of $S$ and $\bar X.$ – whuber Oct 26 '18 at 20:41
  • Would that function be $\overline{X^2}=\frac{S^2}{4}+\bar{X}^2$ since $S^2$ is an unbiased estimator of $\sigma^2$? – Remy Oct 26 '18 at 21:07
  • I don't understand the expression of $\sigma$ you wrote in your second to last line. And I think it is $\overline X^2$ ( $\overline X$ squared), not $\overline{X^2}$ as you have written. – StubbornAtom Oct 27 '18 at 07:21
  • I mistakenly used $\overline{X^2}$ (the average of the $X_i^2$) as the estimate of $\mu^2$. – Remy Oct 27 '18 at 07:30
  • Please correct your question, as the notation $\bar{X^2}$ is confusing. It would help if you write $\bar{X}=...$ and $S=...$ in your question. – Xi'an Oct 27 '18 at 07:48

2 Answers


I have skipped some details in the following calculations and would ask you to verify them.

As usual, we have the statistics $$\overline X=\frac{1}{4}\sum_{i=1}^4 X_i\qquad,\qquad S^2=\frac{1}{3}\sum_{i=1}^4(X_i-\overline X)^2$$

Assuming both $\mu$ and $\sigma$ are unknown, we know that $(\overline X,S^2)$ is a complete sufficient statistic for $(\mu,\sigma^2)$. We also know that $\overline X$ and $S$ are independently distributed.

As you say,

\begin{align} E\left(\overline X^2\right)&=\operatorname{Var}(\overline X)+\left(E(\overline X)\right)^2 \\&=\frac{\sigma^2}{4}+\mu^2 \end{align}

Since we are estimating $\mu^2/\sigma$, it is reasonable to assume that a part of our UMVUE is of the form $\overline X^2/S$. And for evaluating $E\left(\frac{\overline X^2}{S}\right)=E(\overline X^2)E\left(\frac{1}{S}\right)$, we have

\begin{align} E\left(\frac{1}{S}\right)&=\frac{\sqrt{3}}{\sigma}\, E\left(\sqrt\frac{\sigma^2}{3\,S^2}\right) \\\\&=\frac{\sqrt{3}}{\sigma}\, E\left(\frac{1}{\sqrt Z}\right)\qquad\qquad,\,\text{ where }Z\sim\chi^2_{3} \\\\&=\frac{\sqrt{3}}{\sigma}\int_0^\infty \frac{1}{\sqrt z}\,\frac{e^{-z/2}z^{3/2-1}}{2^{3/2}\,\Gamma(3/2)}\,dz \\\\&=\frac{1}{\sigma}\sqrt\frac{3}{2\pi}\int_0^\infty e^{-z/2}\,dz \\\\&=\frac{1}{\sigma}\sqrt\frac{6}{\pi} \end{align}

Again, for an unbiased estimator of $\sigma$, $$E\left(\frac{1}{2}\sqrt\frac{3\pi}{2}S\right)=\sigma$$
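To verify this constant (a short computation of $\mathsf E(S)$, one of the skipped details, using the same $Z\sim\chi^2_3$ as above):

$$\mathsf E(S)=\frac{\sigma}{\sqrt{3}}\,\mathsf E\left(\sqrt{Z}\right)=\frac{\sigma}{\sqrt{3}}\int_0^\infty \sqrt{z}\,\frac{e^{-z/2}z^{3/2-1}}{2^{3/2}\,\Gamma(3/2)}\,dz=\frac{\sigma}{\sqrt{3}}\cdot\frac{\Gamma(2)\,2^2}{2^{3/2}\,\Gamma(3/2)}=2\sigma\sqrt{\frac{2}{3\pi}}$$

so that $\frac{1}{2}\sqrt{\frac{3\pi}{2}}\,\mathsf E(S)=\sqrt{\frac{3\pi}{2}}\cdot\sigma\sqrt{\frac{2}{3\pi}}=\sigma$, as claimed.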

So,

\begin{align} E\left(\frac{\overline X^2}{S}\right)&=E\left(\overline X^2\right)E\left(\frac{1}{S}\right) \\&=\left(\mu^2+\frac{\sigma^2}{4}\right)\frac{1}{\sigma}\sqrt\frac{6}{\pi} \\&=\sqrt\frac{6}{\pi}\left(\frac{\mu^2}{\sigma}+\frac{\sigma}{4}\right) \end{align}

Or, $$E\left(\sqrt{\frac{\pi}{6}}\,\frac{\overline X^2}{S}-\frac{\frac{1}{2}\sqrt\frac{3\pi}{2}S}{4}\right)=\frac{\mu^2}{\sigma}$$

Hence our unbiased estimator based on the complete sufficient statistic $(\overline X,S^2)$ is

\begin{align} T(X_1,X_2,X_3,X_4)&=\sqrt{\frac{\pi}{6}}\,\frac{\overline X^2}{S}-\frac{1}{8}\sqrt\frac{3\pi}{2}S \end{align}

By the Lehmann–Scheffé theorem, $T$ is the UMVUE of $\mu^2/\sigma$.

StubbornAtom
  • Thank you! I will verify your calculations tomorrow. Is it necessary to show that $\bar{X}$ and $S^2$ are complete sufficient statistics since they are functions of $\sum X_i$ and $\sum X_i^2$? – Remy Oct 27 '18 at 07:33
  • @Remy If you have shown that $(\sum X_i,\sum X_i^2)$ is complete sufficient, then it follows that $(\bar X,S^2)$ is also complete sufficient as the latter is a one-to-one function of the former. – StubbornAtom Oct 27 '18 at 07:40

An R simulation to verify StubbornAtom's well-explained answer:

In the case of $\mu=3$ and $\sigma=7$ we have $$\frac{\mu^2}{\sigma}=\frac{9}{7}=1.285714$$

The simulation with $10^7$ trials gives $\widehat{\theta}=1.286482$

# Monte Carlo estimate of T = sqrt(pi/6)*xbar^2/s - (1/8)*sqrt(3*pi/2)*s
y <- 0
for (i in 1:10^7) {
  x <- rnorm(4, mean = 3, sd = 7)
  y <- y + sqrt(pi/6) * mean(x)^2 / sd(x) - (1/8) * sqrt(3*pi/2) * sd(x)
}
y / 10^7

1.286482

This, however, took a while to run, since for loops aren't very fast in R. If anyone has an alternative method to simulate this in R, I would be very interested.
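One vectorized alternative (a sketch) is to draw all the samples at once as a matrix with one sample of size 4 per column, then use column-wise summaries; here with $10^6$ trials, since more trials cost proportionally more memory:

```r
# Vectorized Monte Carlo: each column of x is one sample of size 4
set.seed(2021)
n.sims <- 10^6
x <- matrix(rnorm(4 * n.sims, mean = 3, sd = 7), nrow = 4)
xbar <- colMeans(x)                             # per-sample means
s <- sqrt((colSums(x^2) - 4 * xbar^2) / 3)      # per-sample standard deviations
theta.hat <- mean(sqrt(pi/6) * xbar^2 / s - (1/8) * sqrt(3*pi/2) * s)
theta.hat   # close to 9/7 = 1.285714
```

This trades memory for speed: the loop body is replaced by `colMeans`/`colSums`, which run in compiled code.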

Remy