The zero-truncated Poisson distribution has probability mass function:
$$P(X=k) = \frac{e^{-\lambda}\lambda^k}{(1-e^{-\lambda})\,k!}, \qquad k=1,2,\dots$$
The expectation of the zero-truncated Poisson distribution is $E[X]=\frac{\lambda}{1-e^{-\lambda}}$. According to this document (pages 19-22), the Fisher Information is given by
$$I(\theta) = \frac{n}{(1-e^{-\lambda})}\left[\frac{1}{\lambda}-\frac{e^{-\lambda}}{(1-e^{-\lambda})}\right]$$
How is this derived?
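(As a quick sanity check, not part of the derivation: the truncated mean $E[X]=\frac{\lambda}{1-e^{-\lambda}}$ can be verified numerically by summing the series $\sum_k k\,P(X=k)$. The helper name `ztp_pmf` below is just my own; this is a Python sketch, not from the document.)

```python
import math

def ztp_pmf(k, lam):
    """Zero-truncated Poisson pmf: P(X=k) for k = 1, 2, ..."""
    return math.exp(-lam) * lam**k / ((1 - math.exp(-lam)) * math.factorial(k))

lam = 2.0
# Mean by direct summation of the series (truncated at k = 100,
# by which point the terms are negligible).
mean_numeric = sum(k * ztp_pmf(k, lam) for k in range(1, 101))
mean_closed = lam / (1 - math.exp(-lam))
print(mean_numeric, mean_closed)  # the two agree
```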
===========================================================================
Here is my best attempt thus far (which still gives the wrong answer at the end):
The likelihood is given by:
$$L(\lambda) = \prod_{i=1}^n\frac{\lambda^{x_i}e^{-\lambda}}{x_i!\,(1-e^{-\lambda})}$$
and thus the log-likelihood is:
$$l(\lambda)=-\sum_{i=1}^n\ln(x_i!)+\ln(\lambda)\sum_{i=1}^nx_i-n\lambda-n\ln(1-e^{-\lambda})$$
Differentiating with respect to $\lambda$ we get the first derivative:
$$l'(\lambda)=-n-\frac{ne^{-\lambda}}{1-e^{-\lambda}}+\frac{1}{\lambda}\sum_{i=1}^nx_i$$
and the second derivative with respect to $\lambda$ is:
$$l''(\lambda)=\frac{ne^{-\lambda}}{(1-e^{-\lambda})^2}-\frac{1}{\lambda^2}\sum_{i=1}^nx_i$$
The Fisher Information is given by
$$I(\lambda)=E[-l''(\lambda)\mid\lambda]=E\left[-\left(\frac{ne^{-\lambda}}{(1-e^{-\lambda})^2}-\frac{1}{\lambda^2}\sum_{i=1}^nx_i\right)\,\middle|\,\lambda\right]$$
$$=-\frac{ne^{-\lambda}}{(1-e^{-\lambda})^2}+\frac{1}{\lambda^2}\sum_{i=1}^nE[x_i\mid\lambda]$$
Substituting $E[x_i\mid\lambda]=\lambda$ then yields my incorrect Fisher Information:
$$I(\lambda)=-n\frac{e^{-\lambda}}{(1-e^{-\lambda})^2}+\frac{n\lambda}{\lambda^2}=n\left(\frac{1}{\lambda}-\frac{e^{-\lambda}}{(1-e^{-\lambda})^2}\right)$$
What have I done wrong with the expectation?
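(For reference, here is the numerical discrepancy I see. This Python sketch computes $E[-l'']$ per observation by evaluating the truncated mean as a series, and compares it against both the document's formula and my expression above. The variable names are my own.)

```python
import math

lam, n = 2.0, 1  # per-observation check (n = 1)
q = 1 - math.exp(-lam)

def ztp_pmf(k):
    """Zero-truncated Poisson pmf at k, for the lam fixed above."""
    return math.exp(-lam) * lam**k / (q * math.factorial(k))

# Truncated mean E[X], computed by summing the series.
EX = sum(k * ztp_pmf(k) for k in range(1, 101))

# E[-l''] per observation: -e^{-lam}/(1-e^{-lam})^2 + E[X]/lam^2
info_expect = -math.exp(-lam) / q**2 + EX / lam**2

# The document's formula (pages 19-22).
info_doc = (n / q) * (1 / lam - math.exp(-lam) / q)

# My formula above, which used E[x_i] = lam (the untruncated mean).
info_wrong = n * (1 / lam - math.exp(-lam) / q**2)

print(info_expect, info_doc, info_wrong)
```

Numerically, `info_expect` matches `info_doc` but not `info_wrong`, so the problem does appear to be in the expectation step.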