
Given are uniformly distributed samples $$x_n \overset{\text{iid}}{\sim} \mathcal{U}\left(\mu-\frac{w}{2}, \mu+\frac{w}{2}\right)$$ for $n = 1, \ldots, N$. Then the UMVUE estimates of $\mu$ and $w$ are \begin{align} \hat\mu = \frac{1}{2} \Big( \max \{ x_n \} + \min \{ x_n \} \Big) \end{align} and \begin{align} \hat{w} = \frac{N+1}{N-1} \Big( \max \{ x_n \} - \min \{ x_n \} \Big). \end{align} I know that this is correct, but even with an extensive search I could not find a specific source for this statement or a short proof of it. I have found or derived a lot of related information, e.g., that $(\min \{ x_n \}, \max \{ x_n \})$ constitutes a complete sufficient statistic, the ML estimates, and the relation to the German tank problem, but not this specific result.

If you could provide me with a source or a short proof, that'd be amazing. Thank you.

Dattelheyn

2 Answers


By the Lehmann–Scheffé theorem, an unbiased estimator that is a function of a complete sufficient statistic is the UMVUE. So it suffices to check that $\hat{\mu}$ and $\hat{w}$ are unbiased. This can be done by writing $X_i = w\,(U_i - 1/2) + \mu$ where $U_i \stackrel{\text{iid}}{\sim} \mathrm{Unif}(0,1)$, and noting that the order statistics satisfy $U_{(i)} \sim \mathrm{Beta}(i,\, n-i+1)$.
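As a quick empirical sanity check (not a substitute for the proof), here is a minimal NumPy simulation sketch; the parameter values $\mu$, $w$, $n$, and the number of replications are arbitrary choices of mine, not from the question:

```python
import numpy as np

# Monte Carlo check of unbiasedness of the midrange and rescaled-range
# estimators; mu, w, n, reps are arbitrary illustrative choices.
rng = np.random.default_rng(0)
mu, w, n, reps = 1.5, 2.0, 10, 200_000

x = rng.uniform(mu - w / 2, mu + w / 2, size=(reps, n))
lo, hi = x.min(axis=1), x.max(axis=1)

mu_hat = (lo + hi) / 2                   # midrange estimator of mu
w_hat = (n + 1) / (n - 1) * (hi - lo)    # rescaled range estimator of w

print(mu_hat.mean(), w_hat.mean())  # both averages should be close to 1.5 and 2.0
```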

Alain
  • Thank you very much. I'll work through it in the near future and comment back. – Dattelheyn Jul 01 '18 at 20:22
  • Thanks, I was not aware of this nice theorem which completely answers my question. Yup, I had used order statistics and the Beta distribution to find the bias before asking my question. – Dattelheyn Aug 24 '18 at 14:12

After finding a complete sufficient statistic, there isn't much to do. The main work has been done.

Suppose we have $X_j\stackrel{\text{i.i.d.}}{\sim}\mathcal U\left(\mu-\frac{\omega}{2},\mu+\frac{\omega}{2}\right)$ for $j=1,2,\ldots,n$.

Denote $\displaystyle X_{(1)}=\min_{1\le k\le n} X_k$ and $\displaystyle X_{(n)}=\max_{1\le k\le n} X_k$.

Now, $(X_{(1)},X_{(n)})$ is complete sufficient for $(\mu,\omega)$, so we only need to find unbiased estimators of $\mu$ and $\omega$ that are functions of this complete sufficient statistic. By the Lehmann–Scheffé theorem, such unbiased estimators are necessarily the UMVUEs.

Define $Y_j=\frac{X_j-(\mu-\omega/2)}{\mu+\omega/2-(\mu-\omega/2)}=\frac{X_j-\mu+\omega/2}{\omega}$, so that $Y_j\stackrel{\text{i.i.d.}}{\sim}\mathcal U(0,1)$ for all $j=1,2,\ldots,n$.

Note that the parameter space, determined by the requirement $\mu-\frac{\omega}{2}<\mu+\frac{\omega}{2}$, is $-\infty<\mu<\infty$ and $\omega>0$.

Knowing that the $r$th order statistic satisfies $Y_{(r)}\sim\mathcal{Be}(r,\,n-r+1)$, and that a $\mathcal{Be}(\alpha,\beta)$ distribution has mean $\frac{\alpha}{\alpha+\beta}$, we immediately have $E(Y_{(r)})=\frac{r}{n+1}$, i.e., $E\left(\frac{X_{(r)}-\mu+\omega/2}{\omega}\right)=\frac{r}{n+1}$. Thus,

$$E(X_{(1)})=\frac{\omega}{n+1}-\frac{\omega}{2}+\mu\tag{1}$$ $$E(X_{(n)})=\frac{n\omega}{n+1}-\frac{\omega}{2}+\mu\tag{2}$$

Since we only need the expectations of $X_{(1)}$ and $X_{(n)}$, we could also have found them directly from the respective densities, but this way is easier.

Adding and subtracting equations $(1)$ and $(2)$, we can solve for $\mu$ and $\omega$.
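In the sum the $\omega$ terms cancel, and in the difference the $\mu$ terms cancel:

$$E(X_{(1)})+E(X_{(n)})=\frac{(n+1)\omega}{n+1}-\omega+2\mu=2\mu$$ $$E(X_{(n)})-E(X_{(1)})=\frac{(n-1)\omega}{n+1}$$

Hence, by linearity of expectation,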

$$E\left[\frac{X_{(1)}+X_{(n)}}{2}\right]=\mu$$ and

$$E\left[\frac{n+1}{n-1}\left(X_{(n)}-X_{(1)}\right)\right]=\omega$$

This proves the claim.

(This agrees with the UMVUE of $\omega$ stated in the question.)

Although it is certainly not required for the problem at hand, one can also apply Lehmann–Scheffé directly via Rao–Blackwellization: since $X_1$ is unbiased for $\mu$, the UMVUE of $\mu$ is $E(X_1\mid X_{(1)},X_{(n)})=\frac{X_{(1)}+X_{(n)}}{2}$.
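One way to see this conditional expectation (a sketch, added here for completeness): by exchangeability, given $(X_{(1)},X_{(n)})$, the variable $X_1$ equals $X_{(1)}$ with probability $\frac1n$, equals $X_{(n)}$ with probability $\frac1n$, and is otherwise conditionally uniform on $(X_{(1)},X_{(n)})$, so

$$E\left(X_1\mid X_{(1)},X_{(n)}\right)=\frac{1}{n}X_{(1)}+\frac{1}{n}X_{(n)}+\frac{n-2}{n}\cdot\frac{X_{(1)}+X_{(n)}}{2}=\frac{X_{(1)}+X_{(n)}}{2}.$$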

StubbornAtom