
Let $X_1, \ldots, X_n$ be a sample from an exponential distribution with p.d.f. $f(x; \theta) = \theta e^{-\theta x}$ for $x > 0$ where $\theta > 0$ is an unknown parameter.

I would like to find the UMVUE of $e^{-\theta} - e^{-2\theta}$, but I've been struggling to do so.

I know that $T(X)$ is the UMVUE if it is unbiased and $\text{Var}(T(X)) \leq \text{Var}(U(X))$ for every other unbiased estimator $U(X)$ of the same quantity. I've read through the examples in my textbook, but this is one problem that I am having some difficulty with as I study. I am also aware of the Lehmann–Scheffé theorem, which seemed useful in a couple of the examples I saw; however, my book only has two examples.

I've seen the example that $\sum_i X_i/n$ is the UMVUE for the parameter of a Poisson distribution, and that $(n + 1)X_{(n)}/n$ is the UMVUE for $\theta$ in a $\text{Uniform}(0, \theta)$ distribution, but I'm not quite sure how to approach this problem.

I thought that the exponential family of distributions might be helpful here, but I'm not sure how to apply it.

I would really appreciate any assistance with this problem. I've looked online for more examples but can't really find anything similar to this problem.

1 Answer


Note that $P_{\theta}(X_1>x)=e^{-\theta x}$ for every $x>0$ and for every $\theta >0$, so you have

$$g(\theta)=e^{-\theta}-e^{-2\theta}=P_{\theta}(X_1>1)-P_{\theta}(X_1>2)=P_{\theta}(1<X_1<2)$$

Hence an unbiased estimator of $g(\theta)$ is the indicator variable $I_{1<X_1<2}$, since $E_{\theta}\left[I_{1<X_1<2}\right]=P_{\theta}(1<X_1<2)=g(\theta)$ for every $\theta>0$.

Therefore, by the Lehmann–Scheffé theorem, the UMVUE of $g(\theta)$ is the conditional expectation $E\left[I_{1<X_1<2}\mid T\right]$, where $T=\sum\limits_{i=1}^n X_i$ is a complete sufficient statistic.

Sufficiency of $T$ follows from Factorization theorem. Completeness of $T$ can be shown directly from definition or by arguing that the joint density of $X_1,\ldots,X_n$ is a member of a full-rank (regular) exponential family.
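To spell both claims out (a standard computation): the joint density factors as

$$f(x_1,\ldots,x_n;\theta)=\prod_{i=1}^n\theta e^{-\theta x_i}=\theta^n e^{-\theta t}\cdot 1,\qquad t=\sum_{i=1}^n x_i,$$

a product of a function of $(t,\theta)$ and a (trivial) function of the data alone, so $T$ is sufficient by Factorization. Since the same expression has the exponential family form $\theta^n\exp(-\theta t)$ with natural parameter $-\theta$ ranging over the open interval $(-\infty,0)$, the family is full-rank and $T$ is complete.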

Now $E\left[I_{1<X_1<2}\mid T\right]=P\left(1<X_1<2\mid T\right)$, so it remains to compute this conditional probability.

Using the fact that $\frac{X_1}T\sim \text{Beta}(1,n-1)$ is independent of $T$ (a sketch of this fact follows the computation below), so that the conditioning can be dropped, we find that for any $a>0$,

\begin{align} P(X_1<a\mid T)&=P\left(\frac{X_1}T<\frac aT\right) \\&=(n-1)\int_0^{\min\left\{\frac aT,1\right\}}(1-x)^{n-2}\,\mathrm dx \\&=\begin{cases} 1-\left(1-\frac aT\right)^{n-1} &,\text{ if } T>a \\ 1 &,\text{ if }0\le T\le a \end{cases} \end{align}
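(For the distributional fact itself: $X_1\sim\text{Gamma}(1,\theta)$ and $\sum_{i=2}^n X_i\sim\text{Gamma}(n-1,\theta)$ are independent, and for independent $U\sim\text{Gamma}(\alpha,\theta)$ and $V\sim\text{Gamma}(\beta,\theta)$, the ratio $U/(U+V)\sim\text{Beta}(\alpha,\beta)$ and is independent of $U+V$; here $\alpha=1$, $\beta=n-1$, and the $\text{Beta}(1,n-1)$ density $(n-1)(1-x)^{n-2}$ on $(0,1)$ is exactly the integrand above.)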

Hence $P(1<X_1<2\mid T)=P(X_1<2\mid T)-P(X_1<1\mid T)$ is likewise a piecewise function of $T$, obtained by combining the cases $T\le 1$, $1<T\le 2$, and $T>2$.
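Carrying that through (a sketch of the case analysis; note that $0<X_1\le T$ always, so the probability vanishes when $T\le 1$):

$$P(1<X_1<2\mid T)=\begin{cases} \left(1-\frac1T\right)^{n-1}-\left(1-\frac2T\right)^{n-1} &,\text{ if } T>2 \\ \left(1-\frac1T\right)^{n-1} &,\text{ if } 1<T\le 2 \\ 0 &,\text{ if } 0<T\le 1 \end{cases}$$

If you want to sanity-check this numerically, here is a minimal Monte Carlo sketch (not from the original post; the sample size $n$, rate $\theta$, and replication count are arbitrary choices) comparing the average of the estimator with $g(\theta)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta, reps = 5, 0.8, 200_000

# reps samples of size n from Exp(theta); T = X_1 + ... + X_n for each sample
T = rng.exponential(scale=1/theta, size=(reps, n)).sum(axis=1)

def umvue(t, n):
    """Candidate UMVUE of g(theta) = exp(-theta) - exp(-2*theta), piecewise in t."""
    if t <= 1:
        return 0.0
    if t <= 2:
        return (1 - 1/t) ** (n - 1)
    return (1 - 1/t) ** (n - 1) - (1 - 2/t) ** (n - 1)

est = np.array([umvue(t, n) for t in T])
print("mean of estimator:", est.mean())                       # ~ g(theta)
print("g(theta)         :", np.exp(-theta) - np.exp(-2*theta))
```

The two printed numbers should agree up to Monte Carlo error, consistent with unbiasedness.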

StubbornAtom
  • I read the second answer, and I tried writing $P(1 < X_1 < 2 \mid T) = P(X_1 < 2 \mid T) - P(X_1 < 1 \mid T)$, but I'm still a little confused: they go from conditioning on $T$ to conditioning on an actual value of the sum, when I think the final answer should be a function of $T$. – user7423043 May 18 '21 at 08:44
  • The answer *is* a function of $T$. If $E[I_{1<X_1<2}\mid T=t]=h(t)$ for every value $t$, then the UMVUE is $h(T)$. – StubbornAtom May 18 '21 at 09:49
  • Ok. Following my previous comment and your second link, would the answer just be $\left(1 - \frac{2}{T}\right)^{n - 1}- \left(1 - \frac{1}{T}\right)^{n - 1}?$ – user7423043 May 18 '21 at 12:49
  • Not quite. The UMVUE depends on the range of $T$, if you follow the calculation closely. – StubbornAtom May 18 '21 at 15:03
  • Thanks for your help so far. I've gone through the derivation probably 10 times, and I'm still a little bit confused as to what could be different in this case. – user7423043 May 18 '21 at 18:04
  • What do you mean by depends on the range of $T$? Aren't the ranges the same? – user7423043 May 18 '21 at 18:27