
Can we simplify

$$E[K^{-1}] = \frac{e^{-\lambda}}{1-e^{-\lambda}} \sum_{k=1}^\infty k^{-1} \frac{\lambda^k}{k! }$$

to compute or estimate $E[K^{-1}]$ when $K$ follows a zero-truncated Poisson or binomial distribution?

The related/original problem is a sort of compound distribution: the mean of a sample whose size $k$ follows a binomial or Poisson distribution (and whose variance scales like $k^{-1}$). A related question where this occurs is: Ratio of two weighted sum of Bernoulli random variables
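To make the motivation concrete, here is a small Monte Carlo sketch (Python; the sampler, function names, and the choice $\lambda = 8$ are mine, not from the question). The mean of $K$ iid $N(0,1)$ variables, with $K$ zero-truncated Poisson, has variance $E[1/K]$, which is why this expectation shows up:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's simple Poisson sampler."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def sample_mean_var(lam, n=100_000, seed=1):
    """Variance of the mean of K iid N(0,1) draws,
    where K follows a zero-truncated Poisson(lam)."""
    rng = random.Random(seed)
    means = []
    for _ in range(n):
        k = 0
        while k == 0:              # reject k = 0 (zero truncation)
            k = poisson(lam, rng)
        means.append(sum(rng.gauss(0, 1) for _ in range(k)) / k)
    m = sum(means) / n
    return sum((x - m) ** 2 for x in means) / n

# the simulated variance should be close to E[1/K]
# (roughly 0.147 for lam = 8, see the answer below... computed directly)
print(sample_mean_var(8))
```

Since $\operatorname{Var}(\bar X \mid K) = 1/K$ and $E[\bar X \mid K] = 0$, the law of total variance gives $\operatorname{Var}(\bar X) = E[1/K]$, matching the simulation.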

Another relation is: https://mathoverflow.net/questions/285586/fractional-moments-of-poisson-distribution

Sextus Empiricus
  • Another relation is https://stats.stackexchange.com/questions/423149/compounding-a-gaussian-distribution-with-variance-distributed-according-to-the-a in which the answer shows that compounding the variance relates to a product/ratio distribution. (So if we can find this compound distribution, and its mean, for the related question that inspired this question, then we have an expression for the answer here.) – Sextus Empiricus Mar 23 '21 at 08:47
  • 1
    https://www.wolframalpha.com/input/?i=%5Cfrac%7Be%5E%7B-%5Clambda%7D%7D%7B1-e%5E%7B-%5Clambda%7D%7D+%5Csum_%7Bk%3D1%7D%5E%5Cinfty+k%5E%7B-1%7D+%5Cfrac%7B%5Clambda%5Ek%7D%7Bk%21+%7D – Christoph Hanck Mar 23 '21 at 11:43
  • 1
    @Christoph This is lo mentioned in Jones and Zhigljavsky *"Note that a number of alternative approximations to $μ_{−1}$ can be constructed through using the fact that $\mu_{-1} = \frac{e^{-\lambda}}{1-e^{-\lambda}} (Ei(\lambda)-\log \lambda - \gamma)$"* – Sextus Empiricus Mar 23 '21 at 12:36

1 Answer


An approximation (as well as a short summary of the history of the problem) is given in Jones and Zhigljavsky (2004), "Approximating the negative moments of the Poisson distribution", *Statistics & Probability Letters*, Volume 66, Issue 2:

$$\mu_{-\alpha} \cong \mu_{-\alpha}^{(k)} = \sum_{u = \alpha}^k \frac{s(u,\alpha)}{\lambda^u}$$

where the $s(u,\alpha)$ are the (unsigned) Stirling numbers of the first kind.

For the case $\alpha = 1$ the Stirling numbers reduce to factorials, $s(u,1) = (u-1)!$, so the approximation can be written as

$$\mu_{-1} \cong \mu_{-1}^{(k)} = \sum_{u = 1}^k \frac{s(u,1)}{\lambda^u} = \sum_{u=1}^k \frac{(u-1)!}{\lambda^u}$$

Note that this is an asymptotic series: for fixed $\lambda$ the approximation does not keep improving as the order $k \to \infty$ — the terms $(u-1)!/\lambda^u$ eventually grow and the series diverges, so only the first few terms should be used.
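The asymptotic character is easy to check numerically. A minimal sketch (Python; the function names are mine) compares the truncated factorial series with a brute-force evaluation of $E[K^{-1}]$:

```python
import math

def mu_minus1_bruteforce(lam, kmax=200):
    """E[1/K] for a zero-truncated Poisson(lam), by direct summation."""
    norm = math.exp(-lam) / (1 - math.exp(-lam))
    return norm * sum(lam**k / (math.factorial(k) * k)
                      for k in range(1, kmax + 1))

def mu_minus1_series(lam, k):
    """Truncated factorial series: sum_{u=1}^k (u-1)! / lam^u."""
    return sum(math.factorial(u - 1) / lam**u for u in range(1, k + 1))

lam = 8
exact = mu_minus1_bruteforce(lam)
for k in (3, 6, 9, 20):
    print(k, mu_minus1_series(lam, k), exact)
```

For $\lambda = 8$ the partial sums get closest to the brute-force value around $k \approx 5$ or $6$ and then drift away as the higher-order terms start to grow.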


In 1964 Tiku gave the following approximation in 'A Note on the Negative Moments of a Truncated Poisson Variate', JASA, Vol. 59, No. 308:

$$M(1) \approx \frac{1}{1-e^{-\lambda}} \frac{1}{\lambda -1} \cdot \left({1+\sum_{r=3}^j \beta_r}\right)$$

where the $\beta_r$ are coefficients from a Laguerre polynomial expansion, chosen so that the moments match. We have $\beta_r = \frac{a^{(r)}}{\lambda (\lambda-1) \cdots (\lambda-r+1)}$ (the denominator as printed in Tiku's article appears to contain a typo) and $a^{(3)} = 1$, $a^{(4)} = 7$, $a^{(5)} = 43$, $a^{(6)} = 271$, $a^{(7)} = 1811$.


The result in the comment by Christoph Hanck, linking to Wolfram Alpha, is also mentioned in Jones and Zhigljavsky:

> Note that a number of alternative approximations to $\mu_{-1}$ can be constructed through using the fact that $\mu_{-1} = \frac{e^{-\lambda}}{1-e^{-\lambda}} (Ei(\lambda)-\log \lambda - \gamma)$

R code to try out:

lambda <- 8
k <- 1:1000
### zero-truncated Poisson probabilities P(K = k)
pp <- dpois(k, lambda) / (1 - exp(-lambda))

### compute E(1/K) by brute force (sum truncated at k = 1000)
sum(pp / k)

### compute E(1/K) with Tiku's approximation
ar <- c(1, 7, 43, 271, 1811)    # a^(3), ..., a^(7)
ff <- cumprod(lambda - 0:6)     # lambda, lambda*(lambda-1), ...
br <- ar / ff[-(1:2)]           # beta_3, ..., beta_7
### successive approximations for j = 3, ..., 7
(1 / (1 - exp(-lambda))) * (1 / (lambda - 1)) * (1 + cumsum(br))

### compute via the exponential integral Ei
### (0.57721 is the Euler-Mascheroni constant gamma)
exp(-lambda) / (1 - exp(-lambda)) * (expint::expint_Ei(lambda) - log(lambda) - 0.57721)
Sextus Empiricus