Let $X_1, \dots, X_n$ be i.i.d. $\mathrm{Exp}(\lambda)$ and fix any $a > 0$.

I want to find the UMVUE of $P(X_1 < a) = 1 - e^{-\lambda a}$.

My attempt

By properties of the exponential family we know $\sum_i X_i$ is a complete sufficient statistic for $\lambda$.
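Concretely, the joint density factors as

$$\prod_{i=1}^n \lambda e^{-\lambda x_i} = \lambda^n \exp\Big(-\lambda \sum_{i=1}^n x_i\Big),$$

so $\sum_i X_i$ is sufficient by the factorization theorem, and completeness holds because this is a full-rank one-parameter exponential family.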

Hence we only need to:

  1. Find an unbiased estimator of $P(X_1 < a)$

  2. Condition it on $T(X) = \sum_i X_i$ and apply the Lehmann–Scheffé theorem

An obvious candidate for an unbiased estimator is an indicator function, for example

$$\delta(X) = 1(X_1 < a),$$

which is unbiased since $E[\delta(X)] = P(X_1 < a)$. Hence the estimator

$$\eta(t) = E[\delta(X) \mid T(X) = t]$$

is the UMVUE of $P(X_1 < a)$. However, I am having trouble deriving this. The problem I am having is that I have conditioned on the event $T(X) = t$, which has probability zero since $T(X)$ is a continuous random variable. How can I proceed?
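As a sanity check on the unbiasedness of $\delta(X)$, here is a minimal Monte Carlo sketch (the values of $\lambda$, $a$, and $n$ are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, a, n, reps = 2.0, 0.5, 5, 200_000  # arbitrary assumed values

# numpy parametrizes the exponential by scale = 1/lambda
X = rng.exponential(scale=1 / lam, size=(reps, n))

# delta(X) = 1(X_1 < a): its average over replications estimates E[delta(X)]
print((X[:, 0] < a).mean())    # ~ 0.632
print(1 - np.exp(-lam * a))    # target: P(X_1 < a) = 1 - e^{-lambda a}
```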

  • That you have conditioned on a continuous random variable taking a particular value is not a problem. We are considering the event $\{t-h\le T\le t+h\}$ as $h\to 0$ as is usually the sense for conditional distributions in the continuous case. – StubbornAtom Oct 27 '18 at 18:34
  • I reckon [this](https://stats.stackexchange.com/questions/172215/find-the-joint-distribution-of-x-1-and-sum-i-1n-x-i/7) post should answer your question. I had found the conditional expectation directly (conditional probability) using well-known results involving the Beta and Gamma distributions in my answer. – StubbornAtom Oct 27 '18 at 18:58
  • Wow! That's a really clever work around to avoid the integration. Thanks @StubbornAtom – Xiaomi Oct 28 '18 at 01:07
  • @StubbornAtom apologies for asking the same question twice! I am reworking through Lehmann-Casella and forgot I had asked it. I have given an alternative answer (hence bumping this) as I think it will be useful to people who might stumble across this. It is similar to the method used for normal probabilities. – Xiaomi May 19 '19 at 07:22

1 Answer


Going to post a late answer, as it's a simple but useful trick. The UMVUE is indeed given by

$$P(X_1 < a \mid T)$$

The trick is to use Rényi's representation, which says that

$$\frac{X_1}{\sum_{i=1}^n X_i} =_d U_{(1)},$$

where $U_{(1)}$ is the first order statistic (i.e. the minimum) of a Uniform(0,1) random sample of size $n-1$,

$$U_1, U_2, \dots, U_{n-1}.$$

Now condition on $T(X) = \sum_i X_i$:

$$P(X_1 < a \mid \sum_i X_i = t) = P \left(\frac{X_1}{\sum_i X_i} < \frac{a}{t} \,\middle|\, \sum_i X_i = t \right) = P\left(\frac{X_1}{\sum_i X_i} < \frac{a}{t}\right)$$

The last equality follows from Basu's theorem, which says that every complete sufficient statistic is independent of every ancillary statistic. Since the $X_i$ come from a scale family, the ratio $X_1/\sum_i X_i$ is ancillary, and is therefore independent of the complete sufficient statistic $T(X) = \sum_i X_i$, so the conditioning can be dropped.
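Both claims are easy to check by simulation. A minimal sketch (parameter values arbitrary): the ratio should match the minimum of $n-1$ uniforms in distribution, and should be uncorrelated with $T$:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lam, n, reps = 1.5, 5, 100_000  # arbitrary assumed values

X = rng.exponential(scale=1 / lam, size=(reps, n))
T = X.sum(axis=1)
ratio = X[:, 0] / T                         # X_1 / sum_i X_i

# Renyi: compare against the minimum of n-1 i.i.d. Uniform(0,1) draws
u_min = rng.uniform(size=(reps, n - 1)).min(axis=1)
print(stats.ks_2samp(ratio, u_min).pvalue)  # large p-value: same distribution

# Basu: the ratio should be independent of T (correlation as a crude proxy)
print(np.corrcoef(ratio, T)[0, 1])          # ~ 0
```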

Hence, by Rényi's representation above and the fact that the minimum of a Uniform(0,1) sample of size $n-1$ has CDF $P(U_{(1)} \leq u) = 1 - (1-u)^{n-1}$, the UMVUE is given by

$$P(X_1 < a \mid \sum_i X_i = t) = P\left(U_{(1)} < \frac{a}{t} \right) = 1 - \left(1-\frac{a}{t} \right)^{n-1}$$

for $t > a$; when $t \leq a$ the conditional probability is exactly $1$, since $X_1 < \sum_i X_i = t \leq a$ almost surely.

i.e.

$\delta(X) = 1 - \left(1 - \frac{a}{\sum_i X_i}\right)^{n-1}$ (taken to be $1$ when $\sum_i X_i \leq a$) is the UMVUE.
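And a final minimal Monte Carlo sketch (again with arbitrary parameter values) confirming that this estimator's expectation matches the target $1 - e^{-\lambda a}$:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, a, n, reps = 2.0, 0.5, 5, 500_000  # arbitrary assumed values

X = rng.exponential(scale=1 / lam, size=(reps, n))
T = X.sum(axis=1)

# UMVUE: 1 - (1 - a/T)^(n-1) for T > a, and exactly 1 when T <= a
delta = np.where(T > a, 1 - (1 - a / T) ** (n - 1), 1.0)

print(delta.mean())            # should match ...
print(1 - np.exp(-lam * a))    # ... the target 1 - e^{-lambda a}
```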
