Suppose that $X_1,...,X_n$ is an iid random sample from a Poisson distribution with mean $\theta$. I would like to prove that there exists no unbiased estimator of $\frac{1}{\theta}$.
To do so, I will let $\delta(X)$ be an estimator of $\frac{1}{\theta}$.
Then I'd like to set the expectation of $\delta(X)$ equal to $\frac{1}{\theta}$:
$$E[\delta(X)] = \sum_{x=0}^{\infty} \delta(x)\,P(X=x).$$
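To be explicit about what this means for the whole sample, I can write $x = (x_1,\dots,x_n)$ and expand the expectation as an $n$-fold sum over the joint pmf:
$$E[\delta(X_1,\dots,X_n)] = \sum_{x_1=0}^{\infty}\cdots\sum_{x_n=0}^{\infty} \delta(x_1,\dots,x_n)\prod_{i=1}^{n} e^{-\theta}\frac{\theta^{x_i}}{x_i!}.$$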
Now, my problem is that some books let $Y = \sum X_i$ and then write:
$$E[\delta(Y)] = \sum_{y=0}^{\infty} \delta(y)\,P(Y=y).$$
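(Since a sum of independent Poisson variables is again Poisson with the means added, $Y = \sum X_i \sim \mathrm{Poisson}(n\theta)$, so concretely this is
$$E[\delta(Y)] = \sum_{y=0}^{\infty} \delta(y)\, e^{-n\theta}\frac{(n\theta)^y}{y!}.)$$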
How are these two methods equivalent? How is it that the sum of the $X_i$'s can somehow stand in for the whole sample $X_1,\dots,X_n$? In other words, why is $\delta(X_1,\dots,X_n) = \delta(Y) = \delta\left(\sum X_i\right)$?
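As a sanity check on what I think the books are claiming, here is a quick numerical sketch (with $n=2$, a made-up $\theta$, and a made-up $\delta$ that depends on the sample only through its sum): computing the expectation over the joint pmf of $(X_1, X_2)$ gives the same number as computing it directly over the pmf of $Y \sim \mathrm{Poisson}(n\theta)$.

```python
import math

def pois_pmf(k, mu):
    """Poisson pmf: e^{-mu} mu^k / k!"""
    return math.exp(-mu) * mu**k / math.factorial(k)

# A made-up estimator that depends on the data only through the sum.
def delta(s):
    return 1.0 / (s + 1)

theta, n, K = 0.7, 2, 60  # truncate the infinite sums at K; tails are negligible

# E[delta(X1 + X2)] computed from the joint pmf of (X1, X2)
e_joint = sum(delta(x1 + x2) * pois_pmf(x1, theta) * pois_pmf(x2, theta)
              for x1 in range(K) for x2 in range(K))

# E[delta(Y)] computed directly from Y ~ Poisson(n * theta)
e_sum = sum(delta(y) * pois_pmf(y, n * theta) for y in range(2 * K))

print(e_joint, e_sum)  # the two values agree up to truncation error
```

So the two computations really do produce the same expectation whenever $\delta$ is a function of the sum alone; my question is why it is enough to consider only such estimators.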
Thanks!!!