Question: Let $X_1,X_2,\dots,X_n$ be a random sample from a Poisson($\theta$) distribution. Find the MVUE of $e^{-2\theta}$.
My attempt is based on modifying the answer to this question:
The Poisson distribution is a one-parameter exponential family distribution, with natural sufficient statistic given by the sample total $T(\mathbf{x}) = \sum_{i=1}^n x_i$. The canonical form is:
$$p(\mathbf{x}|\theta) = \exp \Big( \ln (\theta) T(\mathbf{x}) - n\theta \Big) \cdot h(\mathbf{x}) \quad \quad \quad h(\mathbf{x}) = \prod_{i=1}^n \frac{1}{x_i!} $$
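As a quick numerical sanity check of this factorization (with arbitrary illustrative values of $\theta$ and $\mathbf{x}$, and taking $h(\mathbf{x}) = \prod_{i=1}^n 1/x_i!$ so that the identity holds):

```python
import math

# Illustrative values: any theta > 0 and nonnegative integer sample work
theta = 1.7
x = [2, 0, 3, 1]
n = len(x)
T = sum(x)  # natural sufficient statistic: the sample total

# Joint Poisson pmf, computed term by term
joint = math.prod(theta**xi * math.exp(-theta) / math.factorial(xi) for xi in x)

# Exponential-family factorization exp(ln(theta)*T - n*theta) * h(x)
h = math.prod(1 / math.factorial(xi) for xi in x)
factored = math.exp(math.log(theta) * T - n * theta) * h

print(joint, factored)  # the two agree up to floating-point error
```

Since the non-exponential factor $h(\mathbf{x})$ does not involve $\theta$, the factorization theorem gives sufficiency of $T$.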
From this form we establish that $T$ is a complete sufficient statistic for the parameter $\theta$. By the Lehmann–Scheffé theorem, for any $g(\theta)$ there is at most one unbiased estimator that is a function of $T$, and it is the UMVUE of $g(\theta)$.
Using Rao–Blackwellization, I want to find the UMVUE of:
$$g(\theta) \equiv \exp (-2\theta).$$
Using the initial estimator $\hat{g}_*(\mathbf{X}) \equiv \mathbb{I}(X_1=1)$ one can confirm that,
$$\mathbb{E}(\hat{g}_*(\mathbf{X})) = \mathbb{E}(\mathbb{I}(X_1=1)) = \mathbb{P}(X_1=1) = \exp(-2\theta) = g(\theta),$$
so this is indeed an unbiased estimator. Hence, the unique UMVUE obtained from the Rao-Blackwell technique is:
$$\begin{aligned} \hat{g}(\mathbf{X}) &\equiv \mathbb{E}(\mathbb{I}(X_1=1) \mid T(\mathbf{X}) = t) \\[6pt] &= \mathbb{P}(X_1=1 \mid T(\mathbf{X}) = t) \\[6pt] &= \mathbb{P} \Big( X_1=1 \Big| \sum_{i=1}^n X_i = t \Big) \\[6pt] &= \frac{\mathbb{P} \Big( X_1=1 \Big) \, \mathbb{P} \Big( \sum_{i=2}^n X_i = t-1 \Big)}{\mathbb{P} \Big( \sum_{i=1}^n X_i = t \Big)} \\[6pt] &= \frac{\text{Pois}(1 \mid \theta) \cdot \text{Pois}(t-1 \mid (n-1)\theta)}{\text{Pois}(t \mid n\theta)} \\[6pt] &= \frac{t!}{(t-1)!} \cdot \frac{ \exp(-2\theta) \cdot \exp(-(n-1)2\theta)}{\exp(-2n\theta)} \\[6pt] &= t \end{aligned}$$
This doesn't seem right. Where is the process going wrong, and what should the answer be?
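For reference, a quick Monte Carlo check (illustrative values $\theta = 0.3$, $n = 5$; the Poisson sampler is a simple inversion-style helper so no external packages are needed) shows how far the candidate estimator $\hat{g}(\mathbf{X}) = T$ is from the target $g(\theta) = e^{-2\theta}$:

```python
import math
import random

random.seed(0)

theta, n, reps = 0.3, 5, 100_000
target = math.exp(-2 * theta)  # the quantity being estimated

def poisson(lam):
    """Draw one Poisson(lam) variate (Knuth's multiplication method)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Average of T = sum of the sample over many replications
est = sum(sum(poisson(theta) for _ in range(n)) for _ in range(reps)) / reps

print(f"target = {target:.4f}, mean of estimator = {est:.4f}")
```

The sample mean of $T$ sits near $n\theta$ rather than near $e^{-2\theta}$, which is consistent with my suspicion that the derivation above breaks down somewhere.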