My classmates and I are stuck on a three-part homework problem about finding the UMVUE of $\theta=e^{-\lambda}$ for a Poisson distribution.
The problem goes like this:
Let $X \sim \text{Pois}(\lambda)$, and we want to estimate $\theta=e^{-\lambda}$.
a) Find the Cramér-Rao lower bound for an unbiased estimator of $\theta$ (based on a sample of size $n$). Hint: The differentiation needs to be with respect to $\theta$, not $\lambda$.
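For reference, the version of the bound we are working from (with the differentiation in $\theta$, as the hint says) is $Var(\hat{\theta}) \geq \frac{1}{nI_1(\theta)}$, where $I_1(\theta) = E\left[\left(\frac{d}{d\theta}\ln f(X;\theta)\right)^2\right]$ is the Fisher information for a single observation.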
So most of us started from the definition of the Fisher information and got:
$I_1(\theta) = \sum_x \left(\frac{d}{d\theta}\ln\left(e^{-\lambda}\frac{\lambda^x}{x!}\right)\right)^2 e^{-\lambda}\frac{\lambda^x}{x!}$
We then substituted $\theta$ in place of the $e^{-\lambda}$ inside the logarithm:
$I_1(\theta) = \sum_x \left(\frac{d}{d\theta}\ln\left(\theta\frac{\lambda^x}{x!}\right)\right)^2 e^{-\lambda}\frac{\lambda^x}{x!}$
After that, we expanded the logarithm:
$I_1(\theta) = \sum_x \left(\frac{d}{d\theta}\left[\ln(\theta) + \ln(\lambda^x) - \ln(x!)\right]\right)^2 e^{-\lambda}\frac{\lambda^x}{x!}$
Then we took the derivative with respect to $\theta$, treating the $\lambda^x$ and $x!$ terms as constants:
$I_1(\theta) = \sum_x \left(\frac{1}{\theta}\right)^2 e^{-\lambda}\frac{\lambda^x}{x!}$
This is where we have a little difference of opinion, because one of us came up with a final answer of:
$\frac{ne^{\lambda}}{\theta}$
He got there by cancelling one of the $\theta$s and then using the Taylor series $\sum_x \frac{\lambda^x}{x!} = e^{\lambda}$.
Two of us, on the other hand, got:
$\frac{n}{\theta^2}$
We don't know which is right because either one could make sense.
If someone could let us know which is right, we would be grateful.
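One thing that might help whoever answers: here is a rough simulation sketch (assuming Python with NumPy; this is just scratch work and not part of the assignment) that estimates $I_1(\theta)$ numerically as $E\left[\left(\frac{d}{d\theta}\ln f(X;\theta)\right)^2\right]$, with the score in $\theta$ computed by the chain rule through $\lambda = -\ln(\theta)$, so the two candidate answers can be compared against it at a fixed $\lambda$:

```python
import numpy as np

# Rough sanity check (scratch work, not part of the assignment):
# estimate I_1(theta) = E[(d/dtheta ln f(X; theta))^2] by Monte Carlo.
# By the chain rule through lambda = -ln(theta):
#   d/dtheta ln f(x; theta) = (d/dlambda ln f) * (dlambda/dtheta)
#                           = (x/lam - 1) * (-1/theta)

rng = np.random.default_rng(0)

lam = 2.0                 # an arbitrary lambda to test at
theta = np.exp(-lam)      # theta = e^{-lambda}
n = 10                    # an arbitrary sample size

x = rng.poisson(lam, size=1_000_000)
score = (x / lam - 1.0) * (-1.0 / theta)   # d/dtheta of ln f(x; theta)

I1_hat = np.mean(score**2)                 # Monte Carlo estimate of I_1(theta)

print("Monte Carlo estimate of n*I_1(theta):", n * I1_hat)
print("candidate n*e^lambda / theta:        ", n * np.exp(lam) / theta)
print("candidate n / theta^2:               ", n / theta**2)
```

Since both candidate answers only involve $n$, $\lambda$, and $\theta$, plugging in one specific $\lambda$ like this should at least tell us whether either of them matches the definition we started from.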
We are also stuck on part b.
b) Show that $T=\sum X_i$ is a complete sufficient statistic for $\theta=e^{-\lambda}$.
Now, from our notes, we have that a probability distribution belongs to an exponential family if its pmf can be written in the form $e^{A(x)B(\theta)+C(x)+D(\theta)}$.
So we start from the Poisson pmf and get:
$e^{-\lambda}\frac{\lambda^x}{x!} = e^{-\lambda}e^{\ln\left(\frac{\lambda^x}{x!}\right)} = e^{-\lambda}e^{x\ln(\lambda)-\ln(x!)}$
This is where we get stuck, because we are not sure 1) where the $T$ is supposed to enter this expression and 2) how we are supposed to relate it back to $\theta$.
So far, we have only come up with:
$=e^{\ln(\theta)+\sum X_i \ln(\lambda)-\ln\left(\sum X_i!\right)}$
We know we are probably completely and totally wrong on this.
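For what it's worth, writing out the joint pmf of the whole sample (which seems to be where $T=\sum X_i$ would show up) gives:

$\prod_{i=1}^{n} e^{-\lambda}\frac{\lambda^{x_i}}{x_i!} = e^{-n\lambda}\,\lambda^{\sum x_i}\,\frac{1}{\prod x_i!} = e^{\left(\sum x_i\right)\ln(\lambda) - \sum\ln(x_i!) + n\ln(\theta)}$

using $e^{-n\lambda} = \theta^n = e^{n\ln(\theta)}$, but we are still not sure whether this is what the exponential-family form from our notes is asking for, or how the $\ln(\lambda)$ part is supposed to be written in terms of $\theta$.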
When it comes to part c, we are a bit stuck as to what to do.
Part c goes like this:
c) If we use $T^* = \left(\frac{n-1}{n}\right)^{\sum X_i}$ as an estimator for $\theta=e^{-\lambda}$, as the Rao-Blackwell theorem indicates, combine your findings in parts (a) and (b) to determine $Var(T^*)$.
We know that this is probably the UMVUE, which can be shown using the Lehmann-Scheffé theorem.
The thing is, we just don't know how to apply it, or how to use what we find from parts (a) and (b) to get $Var(T^*)$.
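In the meantime, a scratch-work simulation like the following (again assuming Python/NumPy) might at least let us check the numbers: simulate $T^* = \left(\frac{n-1}{n}\right)^{\sum X_i}$ many times, see whether its mean comes out near $\theta = e^{-\lambda}$, and compare its simulated variance with whatever bound we settle on in part (a):

```python
import numpy as np

# Scratch-work check (not a proof): simulate T* = ((n-1)/n)^(sum X_i)
# for samples of size n from Poisson(lambda) and look at its mean and variance.

rng = np.random.default_rng(1)

lam = 2.0                 # arbitrary lambda for the check
theta = np.exp(-lam)      # target theta = e^{-lambda}
n = 10                    # sample size
reps = 200_000            # number of simulated samples

# The sum of n iid Poisson(lambda) variables is Poisson(n*lambda),
# so we can simulate T = sum X_i directly.
T = rng.poisson(n * lam, size=reps)
T_star = ((n - 1) / n) ** T

print("theta:          ", theta)
print("mean of T*:     ", T_star.mean())   # close to theta would suggest unbiasedness
print("variance of T*: ", T_star.var())
# The Cramer-Rao bound from part (a) could be printed here for comparison
# once we agree on what it is.
```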
If anyone can help out with any part of this problem, we will be very grateful, because we are very stuck on this.