
I would like to sample from a discrete distribution with infinite support, for example the Poisson distribution \begin{align} \Bbb P(X=k)=e^{-\lambda}\frac{\lambda^k}{k!}, \end{align} where $\lambda$ is a fixed parameter.

In the case of a finite distribution I could use the alias method, but it doesn't work here. I've found that I could use the transformation \begin{align} X=\min\{ k: U \leq \sum\limits_{i=1}^n p_i \}, \end{align} but is that the way to go?

thesecond
    With arbitrary accuracy, which you may select, you *can* use the alias method. Just truncate the distribution at, say, its $1-10^{-16}$ quantile. Another approach is to sample from a mixture of a more severely truncated version of the distribution and the remaining tail (recursively). The recursion has probability $1$ of terminating (and will terminate rapidly). – whuber Nov 09 '21 at 15:16
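
For what it's worth, here is a rough Python sketch of the truncation idea in this comment. It assumes NumPy and SciPy are available and uses `numpy.random.Generator.choice` as a stand-in for an explicit alias table; the function name is mine.

```python
import numpy as np
from scipy.stats import poisson as poisson_dist

def sample_truncated_poisson(lam, size=1, rng=None):
    """Sample Poisson(lam) truncated at (roughly) its 1 - 1e-16 quantile,
    then draw from the resulting finite table."""
    rng = np.random.default_rng() if rng is None else rng
    kmax = int(poisson_dist.ppf(1 - 1e-16, lam))   # truncation point
    support = np.arange(kmax + 1)
    probs = poisson_dist.pmf(support, lam)
    probs /= probs.sum()                           # renormalize the truncated pmf
    # choice() stands in for a proper alias table here
    return rng.choice(support, size=size, p=probs)
```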

2 Answers


There is a quite surprising way to sample from the Poisson distribution with any $\lambda$ parameter. The basis for this is the following algorithm to sample a Poisson variate with $\lambda = 1$, which involves only integer arithmetic and no floating-point operations (Duchon and Duvignau, "Preserving the number of cycles of length k in a growing uniform permutation", Electronic Journal of Combinatorics 23(4), 2016):

  1. Set ret to 1, a to 1, and b to 0.
  2. Generate $j$, a uniform random integer in [0, a].
  3. If $j<a$ and $j<b$, return ret.
  4. If $j=a$, add 1 to ret. Otherwise, subtract 1 from ret and set b to $1+a$.
  5. Add 1 to a and go to step 2.

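A minimal Python sketch of these steps (the function name `poisson1` and the use of Python's `random` module are illustration choices, not part of the reference):

```python
import random

def poisson1(rng=random):
    """Sample a Poisson(1) variate using only integer arithmetic,
    following the Duchon-Duvignau steps listed above."""
    ret, a, b = 1, 1, 0            # step 1
    while True:
        j = rng.randint(0, a)      # step 2: uniform integer in [0, a]
        if j < a and j < b:        # step 3
            return ret
        if j == a:                 # step 4
            ret += 1
        else:
            ret -= 1
            b = a + 1
        a += 1                     # step 5: repeat from step 2
```
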
Poisson variates with other $\lambda$ parameters follow from the additivity of the Poisson distribution: split $\lambda$ into pieces, each of size no greater than 1, generate a Poisson variate with $\lambda$ equal to each piece, and add all these variates together.

Finally, for $\mu < 1$, to generate a Poisson variate with parameter $\mu$ it is enough to do the following (a sketch combining this with the splitting step appears after the list):

  1. Generate $N$, a Poisson variate with parameter 1.
  2. Flip $N$ coins that each show heads with probability $\mu$; the number of heads is the desired Poisson($\mu$) variate.
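
Putting this together, a rough Python sketch of the full scheme, reusing the `poisson1` sketch above (the function name `poisson_variate` is mine):

```python
import math
import random

def poisson_variate(lam, rng=random):
    """Sample Poisson(lam) for lam >= 0 by splitting lam into unit-sized
    pieces plus a fractional remainder, using poisson1() from above."""
    if lam < 0:
        raise ValueError("lam must be nonnegative")
    whole = int(math.floor(lam))
    frac = lam - whole
    total = sum(poisson1(rng) for _ in range(whole))         # pieces of size exactly 1
    if frac > 0:
        n = poisson1(rng)                                    # N ~ Poisson(1)
        total += sum(rng.random() < frac for _ in range(n))  # thin with probability frac
    return total
```
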
Peter O.

Yes, exactly. This is known as inverse transform sampling, and this earlier thread may be helpful.

Actually, you have a typo in your formula: the upper limit of the sum should be $k$, not $n$ (note that there is only a single $k$ and a single $n$ in your formula). What you would do is generate a value $p$ that is uniformly distributed in $[0,1]$; the resulting Poisson variate $k$ is then the smallest $k$ such that $\sum_{i=0}^kp_i\geq p$. You would of course pre-tabulate the $p_i$ up to some sufficiently large index, so you don't need to recalculate them multiple times.
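
For illustration, a minimal Python sketch of this inverse transform approach; it computes the $p_i$ on the fly via the recurrence $p_k = p_{k-1}\,\lambda/k$ rather than pre-tabulating them, and the function name is mine:

```python
import math
import random

def poisson_inverse_transform(lam, rng=random):
    """Inverse transform sampling for Poisson(lam): draw U ~ Uniform(0, 1)
    and return the smallest k whose cumulative probability reaches U."""
    u = rng.random()
    k = 0
    p = math.exp(-lam)      # P(X = 0)
    cdf = p
    while cdf < u:
        k += 1
        p *= lam / k        # P(X = k) = P(X = k-1) * lam / k
        cdf += p
    return k
```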

Stephan Kolassa