Let $X_1,\dots,X_n$ be i.i.d. $\text{Poi}(\lambda)$-distributed random variables. I want to construct a minimum variance unbiased estimator (MVUE) for $\lambda$.
By the Fisher-Neyman factorization theorem, I know that $T:=\sum_{i=1}^nX_i$ is a sufficient statistic for $\lambda$ (in fact it is minimal sufficient, though only sufficiency is needed below).
Now I want to use the Lehmann-Scheffé theorem, which states that if $T$ is a sufficient and complete statistic for a parameter $\lambda$ and $\phi(T)$ is an unbiased estimator of $\lambda$, then $\phi(T)$ is an MVUE for $\lambda$.
I want to show that $T$ is complete, i.e. that for every measurable function $g$, $\mathbb{E}_\lambda[g(T)]=0$ for all $\lambda>0$ implies $g(T)=0$ almost surely.
Since $T\sim \text{Poi}(n\lambda)$ we have $\mathbb{E}_\lambda[g(T)]=\sum_{k=0}^\infty g(k) e^{-n\lambda}\frac{(n\lambda)^k}{k!}=e^{-n\lambda}\sum_{k=0}^\infty \frac{g(k)}{k!}z^k$ where $z:=n\lambda$. If this vanishes for all $\lambda>0$, then, since $e^{-n\lambda}>0$, the power series $\sum_{k=0}^\infty \frac{g(k)}{k!}z^k$ vanishes for all $z>0$. By the identity theorem for power series, a series that vanishes on an interval has all coefficients zero, so $\frac{g(k)}{k!}=0$ and hence $g(k)=0$ for all $k\geq 0$, which is what we wanted to show.
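The distributional fact used here, that a sum of $n$ i.i.d. $\text{Poi}(\lambda)$ variables is $\text{Poi}(n\lambda)$, can be sanity-checked by simulation. This is a minimal Monte Carlo sketch with hypothetical values $n=5$, $\lambda=2$ (the parameter choices and sample size are mine, not from the question):

```python
import numpy as np

# Hypothetical parameters for a quick Monte Carlo check
rng = np.random.default_rng(0)
n, lam, reps = 5, 2.0, 200_000

# Draw reps replications of T = X_1 + ... + X_n with X_i ~ Poi(lam) i.i.d.
T = rng.poisson(lam, size=(reps, n)).sum(axis=1)

# If T ~ Poi(n*lam), its mean and variance should both be close to n*lam = 10
print(T.mean())  # should be close to 10
print(T.var())   # should be close to 10
```

A Poisson distribution has mean equal to variance, so seeing both empirical moments near $n\lambda$ is consistent with (though of course no proof of) $T\sim\text{Poi}(n\lambda)$.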
If I can construct an unbiased estimator that is a function of $T$, I am done. Since $\mathbb{E}[T]=n\lambda$, I correct the bias by setting $S:=\frac{T}{n}=\bar{X}$, and it follows from the Lehmann-Scheffé theorem that $S$ is an MVUE of $\lambda$.
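As a numerical check of the final step, a simulation can confirm that $S=T/n$ is (empirically) unbiased and that its variance is close to $\lambda/n$, which is the Cramér-Rao lower bound for this model, so no unbiased estimator can do better. This sketch uses hypothetical values $n=10$, $\lambda=3$:

```python
import numpy as np

# Hypothetical check: S = T/n (the sample mean) is unbiased for lam,
# and its variance matches the Cramer-Rao lower bound lam/n.
rng = np.random.default_rng(1)
n, lam, reps = 10, 3.0, 200_000

X = rng.poisson(lam, size=(reps, n))
S = X.mean(axis=1)  # S = T/n for each replication

print(S.mean())  # should be close to lam = 3.0
print(S.var())   # should be close to lam/n = 0.3
```

The agreement of the empirical variance with $\lambda/n$ illustrates that $S$ is efficient, which is consistent with it being the MVUE obtained via Lehmann-Scheffé.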
Is this correct?