
I searched for "rao blackwell poisson", "umvu poisson", and "umvue poisson" on this website but didn't find anything that specifically answered my question. I also searched on Google for "umvu poisson" and found a bunch of lecture notes, but nothing answering my question.

Setup

I am studying An Introduction to Probability and Statistical Inference, second edition, written by George G. Roussas. I am trying to solve this problem from page 313:

Example 17. Determine the UMVU estimate of $\theta$ on the basis of the random sample $X_1, \dots, X_n$ from the distribution $P(\theta)$.

For context, I quote Theorem 7 from page 312:

Theorem 7 (Rao-Blackwell and Lehmann-Scheffé). Let $X_1, \dots, X_n$ be a random sample with p.d.f. $f(\cdot ; \theta)$, $\theta \in \Omega \subseteq \mathbb{R}$, and let $T = T(X_1, \dots, X_n)$ be a sufficient statistic for $\theta$ (which is complete). Let $U = U(X_1, \dots, X_n)$ be any unbiased estimate of $\theta$, and define the statistic $\varphi (T)$ by the relation

\begin{align} \varphi (T) = \text{E}_{\theta} (U \mid T) \tag{11}\label{eq11}. \end{align}

Then $\varphi (T)$ is unbiased, $\text{Var}_{\theta} \big( \varphi (T) \big) \leq \text{Var}_{\theta}(U)$ for all $\theta \in \Omega$, and, indeed, $\varphi (T)$ is a UMVU estimate of $\theta$. If $U$ is already a function of $T$ only, then the conditioning in $(11)$ is superfluous.
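
Before getting to my attempt, here is a quick simulation I put together as a sanity check on the variance inequality in Theorem 7. This is entirely my own sketch, not from the book; the choices $\theta = 3$ and $n = 10$ are arbitrary. It takes $U = X_1$ as a crude unbiased estimate and $\varphi(T) = T/n = \bar{X}$, which is the pairing that shows up below.

```python
import numpy as np

# My own sanity check of Theorem 7 in the Poisson case (not from the book):
# U = X_1 is an unbiased but wasteful estimate of theta, T = sum(X_i) is the
# sufficient statistic, and phi(T) = T/n = Xbar. Both should be unbiased,
# with Var(phi(T)) <= Var(U).

rng = np.random.default_rng(0)
theta, n, reps = 3.0, 10, 200_000

X = rng.poisson(theta, size=(reps, n))  # `reps` independent samples of size n
U = X[:, 0]                             # crude unbiased estimate: first observation only
phi = X.mean(axis=1)                    # Rao-Blackwellized estimate: the sample mean

print("mean of U       ", U.mean())     # both means should be close to theta = 3
print("mean of phi(T)  ", phi.mean())
print("variance of U   ", U.var())      # close to theta = 3
print("variance of phi ", phi.var())    # close to theta / n = 0.3
```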

My approach

I need an unbiased estimator $U$ and a sufficient (and complete) statistic $T$. It's fairly straightforward to figure out that $U = \bar{X}$ is unbiased and that $T = \sum_{i = 1}^n X_i$ is sufficient. (I did not verify completeness.)
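
For the record, here are the quick checks I had in mind (my own working, kept brief):

\begin{align} \text{E}_{\theta}(\bar{X}) = \frac{1}{n} \sum_{i=1}^n \text{E}_{\theta}(X_i) = \frac{1}{n} \cdot n\theta = \theta, \end{align}

and the joint p.d.f. factors as

\begin{align} f(x_1, \dots, x_n; \theta) = \prod_{i=1}^n \frac{e^{-\theta} \theta^{x_i}}{x_i!} = e^{-n\theta} \, \theta^{\sum_{i=1}^n x_i} \cdot \frac{1}{\prod_{i=1}^n x_i!}, \end{align}

which depends on the data only through $\sum_{i=1}^n x_i$, so $T = \sum_{i=1}^n X_i$ is sufficient by the factorization theorem.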

Since I can write $U = \frac{1}{n}T$, Theorem 7 tells me that I'm done: I don't need to bother with the conditional expectation. However, I wanted to do it anyway, and this is where I run into trouble.
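
Spelled out, the last clause of Theorem 7 asserts that

\begin{align} \varphi(T) = \text{E}_{\theta}(\bar{X} \mid T) = \text{E}_{\theta}\Big(\tfrac{1}{n} T \,\Big|\, T\Big) = \tfrac{1}{n} T = \bar{X}, \end{align}

and what I wanted to do is verify this directly from the conditional distribution.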

Is computing $\text{E} \big( \bar{X} \mid \sum_{i = 1}^n X_i \big) = \text{E}(U \mid T)$ tractable? The first problem I see is that $\bar{X}$ can take non-integer values, which makes me think I'm going to need to integrate from zero to infinity. The second problem is figuring out $f_{U \mid T} (u \mid t)$, i.e. the conditional distribution needed to compute the expectation.

Going back to the definition of conditional probability, I know that $f_{U \mid T} (u \mid t) = \frac{f_{U, T} (u, t)}{f_T(t)}$. The joint probability in the numerator looks "redundant" in the sense that $(U = u \cap T = t) = (U = u) = (T = t)$.

If I eliminate the $U$ part, I am left with $\frac{f_T(t)}{f_T(t)} = 1$, which doesn't make sense. If I eliminate the $T$ part, I am left with $\frac{f_U(u)}{f_T(t)}$ and then I need to figure out how $U$ and $T$ are distributed. $T$ is easy, but $U$ is probably harder.

Am I way off track here?

George Roussas' approach

Roussas lets $U = X_1$ and uses the same $T$ as I did. He then uses the fact that $X_1 \mid T = t \sim B(t, \frac{1}{n})$ so that $\text{E}(X_1 \mid T) = \frac{T}{n} = \bar{X}$. I am supposed to verify completeness of $T$ in Exercise 3.17 to conclude that $\bar{X}$ is UMVU for $\theta$.
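
For my own notes, here is the derivation behind that conditional distribution (it is not spelled out in the book at this point). Since the $X_i$ are independent, $\sum_{i=2}^n X_i \sim P\big((n-1)\theta\big)$ and $T \sim P(n\theta)$, so

\begin{align} P(X_1 = x \mid T = t) = \frac{P(X_1 = x) \, P\big(\sum_{i=2}^n X_i = t - x\big)}{P(T = t)} = \frac{\dfrac{e^{-\theta} \theta^{x}}{x!} \cdot \dfrac{e^{-(n-1)\theta} \big((n-1)\theta\big)^{t-x}}{(t-x)!}}{\dfrac{e^{-n\theta} (n\theta)^{t}}{t!}} = \binom{t}{x} \Big(\frac{1}{n}\Big)^{x} \Big(1 - \frac{1}{n}\Big)^{t-x}, \end{align}

which is exactly $B(t, \frac{1}{n})$, and hence $\text{E}(X_1 \mid T) = T \cdot \frac{1}{n} = \bar{X}$.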

Summary

Is the moral of the story here that, if you want to compute the Rao-Blackwell conditional expectation, then you should pick a simple unbiased estimator for which the conditional distribution is easily found?

Thanks.

  • Welcome to CV. To answer your last question, yes. Generally, you might be able to use a more complicated expression and condition on that sufficient statistic, but generally you want to pick the simplest representation possible. UMVU determination can be difficult enough, why make your work harder? This is a case where non-uniqueness of sufficient statistics works in your favor, leverage it if you can. Hope it helps. – Lucas Roberts May 17 '20 at 17:07
  • You are making things complicated. $E\left[\overline X\mid \sum_{i=1}^n X_i\right]=\frac1nE\left[\sum_{i=1}^n X_i\mid \sum_{i=1}^n X_i\right]=\frac1n\sum_{i=1}^n X_i$ for any $X_1,\ldots,X_n$. As for completeness of $\sum_{i=1}^n X_i$, it follows from the fact that the Poisson$(\theta)$ distribution is a member of a regular exponential family. – StubbornAtom May 17 '20 at 19:05
  • You don't even need the conditional distribution: https://stats.stackexchange.com/q/374989/119261. – StubbornAtom May 17 '20 at 19:29
