
Let $X_1, X_2, \dots, X_n$ be a random sample from the distribution with p.d.f. $$f(x)=\frac{1}{\beta -\alpha},\quad \alpha<x<\beta,$$ where $0<\alpha<\beta<\infty$. Obtain the minimum variance unbiased estimators of $\frac{\alpha+\beta}{2}$ and $\beta-\alpha$.

I tried to use the Rao–Blackwell method here, but I was not able to solve it that way. Please help.

StasK
Argha
  • The sample mean is minimum variance unbiased estimate for (α+β)/2. I don't think there is a UMVU estimator for β-α. – Michael R. Chernick Aug 06 '12 at 01:05
  • @MichaelChernick: Reference for your claims? (Both are false.) The sample mean is not even a function of the sufficient statistic! – cardinal Aug 06 '12 at 01:49
  • For what it is worth, if $n=3$ then taking the sample mean has a variance of $(\beta-\alpha)^2/36$ while taking the mean of the largest and smallest value has a slightly smaller variance of $(\beta-\alpha)^2/40$. – Henry Aug 06 '12 at 12:51
  • @cardinal What is an unbiased estimator for β-α? The sample mean is unbiased for (α+β)/2 I did not know that it did not have minimum variance. Also I didn't think about sufficient statistic for the mean of uniform. Thanks for the correction. – Michael R. Chernick Aug 06 '12 at 15:56
  • Never mind. I see it in Henry's answer. – Michael R. Chernick Aug 06 '12 at 16:02
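Henry's $n=3$ comparison in the comments can be checked numerically. Below is a minimal Monte Carlo sketch (not from the thread), assuming the illustrative values $\alpha=0$, $\beta=1$, so that $(\beta-\alpha)^2/36 \approx 0.0278$ and $(\beta-\alpha)^2/40 = 0.025$:

```python
import random

# Monte Carlo comparison of the sample mean vs. the midrange for n = 3
# draws from Uniform(0, 1); alpha = 0, beta = 1 are illustrative choices.
random.seed(0)
alpha, beta, n, trials = 0.0, 1.0, 3, 200_000

mean_vals, midrange_vals = [], []
for _ in range(trials):
    xs = [random.uniform(alpha, beta) for _ in range(n)]
    mean_vals.append(sum(xs) / n)                      # sample mean
    midrange_vals.append((max(xs) + min(xs)) / 2)      # midrange

def var(vals):
    # plain population variance of the simulated estimates
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

print(var(mean_vals))      # close to 1/36 ~ 0.0278
print(var(midrange_vals))  # close to 1/40 = 0.0250
```

Both estimators are unbiased for $(\alpha+\beta)/2$ here, but the midrange, being a function of the sufficient statistic, has the smaller variance.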

1 Answer


The sufficient statistic is $(\min X_i, \max X_i)$ so you might expect these minimum variance unbiased estimators to be something related to $$\frac{\max X_i + \min X_i}{2}$$ and $$\max X_i - \min X_i$$ respectively.

The first of these turns out to be the minimum variance unbiased estimator of $\frac{\alpha+\beta}{2}$. The second is a biased estimator of $\beta-\alpha$, as it is usually too small: its expectation is $(\beta-\alpha)\frac{n-1}{n+1}$, so multiplying it by $\frac{n+1}{n-1}$ gives an unbiased estimator, which turns out to be the minimum variance unbiased estimator.
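As a quick sanity check of these two estimators (not part of the original answer), here is a short simulation using the arbitrary illustrative values $\alpha=2$, $\beta=5$, and $n=10$:

```python
import random

# Check that the midrange is unbiased for (alpha+beta)/2 and that
# (max - min) * (n+1)/(n-1) is unbiased for beta - alpha.
# alpha = 2, beta = 5, n = 10 are illustrative choices.
random.seed(1)
alpha, beta, n, trials = 2.0, 5.0, 10, 200_000

mid_sum, range_sum = 0.0, 0.0
for _ in range(trials):
    xs = [random.uniform(alpha, beta) for _ in range(n)]
    mid_sum += (max(xs) + min(xs)) / 2
    # raw range is biased low; rescale by (n+1)/(n-1) to correct it
    range_sum += (max(xs) - min(xs)) * (n + 1) / (n - 1)

print(mid_sum / trials)    # close to (alpha+beta)/2 = 3.5
print(range_sum / trials)  # close to beta - alpha = 3.0
```

Without the $\frac{n+1}{n-1}$ correction, the average of $\max X_i - \min X_i$ would concentrate near $(\beta-\alpha)\frac{n-1}{n+1} = 3 \cdot \frac{9}{11} \approx 2.45$ instead.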

Henry
  • (+1) And, you've left a little bit for the OP to do as well! :-) (Sufficiency, ironically enough, does not suffice for the conclusion.) – cardinal Aug 06 '12 at 11:37
  • It is not usually too small, it is always too small: $\max X_i < \beta$ with probability 1 and $\min X_i > \alpha$ with probability 1. – Michael R. Chernick Aug 06 '12 at 15:59
  • @Michael: Perhaps *[almost always](http://en.wikipedia.org/wiki/Almost_always)* would be a better way of stating it. It is effectively the maximum likelihood estimate. – Henry Aug 06 '12 at 16:41
  • @Henry I think you have a UMVUE for $\beta-\alpha$ which is also the MLE, except that it inflates $\max X_i - \min X_i$ by the factor $(n+1)/(n-1)$ to make it unbiased. $\beta$ is an upper limit on the uniform, and $\max X_i$ increasingly approaches it from below as $n$ increases but never equals it. Similarly, $\min X_i$ decreases to $\alpha$ but never equals it. – Michael R. Chernick Aug 06 '12 at 18:42