
Suppose that $X_t$ is the set of times of the events of a Poisson process with unit rate after $t$ seconds. (In other words, $X_t$ is a set of $N$ uniformly distributed points over $[0,t]$ where $N$ is Poisson distributed with mean $t$.) Let $Y_t = \sum_{s \in X_t}e^{-(t-s)}$. What is the distribution of $Y_t$ as $t\to\infty$?


Because people often ask what I've tried (although I couldn't make it work): I let $A$ be the desired limiting distribution and then considered that after a tiny interval of time $\delta$, $A$ decays to $e^{-\delta}A$ and jumps to $A+1$ with probability $\delta$. I reasoned that the resulting mixture should equal $A$ in distribution, since $A$ is stationary.

From there, without justification, I figured that the density $f$ of $A$ satisfies something like:

$$\lim_{\delta\to0}\frac{f(x) - (1-\delta)f(e^\delta x) - \delta f(e^\delta x-1)}{\delta}=0$$


We could divide the time before $t$ into small intervals of length $\frac1k$, each of which contributes a small amount to $A$. In the limit as $k\to\infty$, this gives the expected value of $A$ (the fourth step applies l'Hôpital's rule in $k$): \begin{align} E(A) &= \lim_{k\to\infty}\sum_{i\ge0} \frac{e^{-\frac{i}k}}k \\ &= \lim_{k\to\infty}\frac{1}{k}\sum_{i\ge0} \left(e^{-\frac{1}k}\right)^i \\ &= \lim_{k\to\infty}\frac{\frac1k}{\left(1-e^{-\frac{1}k}\right)} \\ &= \lim_{k\to\infty}\frac{\frac1{k^2}}{\frac1{k^2}\left(e^{-\frac{1}k}\right)} \\ &= \lim_{k\to\infty}e^{\frac{1}k} \\ &= 1 \end{align} By the same argument, the variance should also be 1 since the variance of each small interval is also proportional to the interval length. Therefore, $A$ seems to be some continuous analog of the Poisson distribution, which has distribution function of the form $$F(x;\lambda) = \begin{cases}0&x\le 0,\\ \frac{\Gamma(x, \lambda)}{\Gamma(x)}&x>0\end{cases}$$ according to Ilienko, A., & Klesov, O. I. (2013). Continuous counterparts of Poisson and binomial distributions and their properties, 39, 137–147.
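The claim $E(A)=1$ is easy to check numerically. Here is a quick NumPy sketch (an illustrative addition, not part of the original post) that simulates $Y_t$ directly for a large $t$ by drawing $N\sim\text{Poisson}(t)$ uniform event times on $[0,t]$:

```python
import numpy as np

rng = np.random.default_rng(0)
t, n = 50.0, 20_000

# Y_t = sum over event times s of exp(-(t - s)); conditionally on N ~ Poisson(t),
# the event times are N iid Uniform(0, t) points.
N = rng.poisson(t, size=n)
s = rng.uniform(0.0, t, size=N.sum())
groups = np.repeat(np.arange(n), N)      # which replicate each event time belongs to
y = np.bincount(groups, weights=np.exp(-(t - s)), minlength=n)

print(y.mean())   # should be close to 1 (exactly 1 - e^{-t})
```

The sample mean lands near $1$, in agreement with the geometric-series limit above.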

Neil G
  • Are you really fixing $N$? I ask because this makes the answer almost trivial. Since $X_t$ is just a set of random times in $[0,1]$ that has been scaled by $t$, $Y_t$ obviously decays exponentially in $t$. For interesting asymptotics, then, don't you want $N=N(t)$ to vary with $t$ so that $N(t)/t$ has a nonzero limit? – whuber Dec 10 '15 at 21:41
    @whuber: $N$ is Poisson-distributed with mean $t$, so it does vary with $t$. How should I clarify? $\lim_{t\to\infty} \frac{N(t)}{t} = 1$ as you want. – Neil G Dec 10 '15 at 22:13

1 Answer


The answer may be surprising. Here is a brief sketch of a solution. As with a somewhat related problem, the idea is to obtain a recurrence relation for a quantity related to the asymptotic distribution and then solve that relation.


Since the uniform distribution on $[0,t]$ is symmetric about its midpoint, the sum can equally well be expressed by replacing each $s$ with $t-s$, producing an expression with the same distribution as $Y_t$,

$$Y^\prime_t = \sum_{s\in X_t} e^{-s}.$$

Reorder this sum (which, because it is almost surely finite, will not change its value) so that the times $s_1 \le s_2 \le \cdots \le s_N$ are ascending. Letting $u_1 = s_1,$ $u_2 = s_2 - s_1, \ldots,$ $u_{i+1} = s_{i+1}-s_i$ for $i=1, 2, \ldots, N-1$ enables us to rewrite $s_i = u_1 + u_2 + \cdots + u_i$, thereby putting this sum into the form

$$\eqalign{ Y^\prime_t &= \sum_{i=1}^N e^{-(u_1 + u_2 + \cdots + u_i)} \\ &=\sum_{i=1}^N e^{-u_1}e^{-u_2}\cdots e^{-u_i}\\ &=e^{-u_1}\left(1 + \sum_{i=2}^N e^{-u_2}\cdots e^{-u_i}\right)\\ &=\cdots\\ &= e^{-u_1}\left(1 + e^{-u_2}\left(1 + \cdots + \left(1 + e^{-u_N}\right)\cdots \right)\right). }$$
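This telescoping is pure algebra, so it can be verified numerically for any fixed set of gaps. A short Python sketch (an illustrative addition, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(2)
u = rng.exponential(size=8)           # any positive gaps u_1, ..., u_N will do

# Direct form: sum_i exp(-(u_1 + ... + u_i))
direct = np.exp(-np.cumsum(u)).sum()

# Nested form e^{-u_1}(1 + e^{-u_2}(1 + ... + e^{-u_N})...), built inside out
nested = 0.0
for ui in u[::-1]:
    nested = np.exp(-ui) * (1.0 + nested)

print(abs(direct - nested))           # agrees to floating-point precision
```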

If we were to fix $N$ (rather than let it have a Poisson distribution), the exponentials of the gaps, $U_i=e^{-u_i}$, would be independently uniformly distributed on $[0,1]$. In the limit there is no difference between fixing $N$ and letting it have a Poisson distribution (the discrepancy is of order $N^{-1/2}$), so the asymptotic distribution of $Y_t$ must be that of the infinite product

$$U_1(1 + U_2(1 + U_3(1 + \cdots )\cdots))$$

where the $U_i$ are independently uniformly distributed on $[0,1]$. Therefore, if the random variable $X$ has the limiting distribution of $Y_t$, then $U(1+X)$ must also have this distribution for an independent uniform variate $U$. This is the desired recurrence relation. It remains to exploit it.
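One immediate consequence: taking expectations in $X \stackrel{d}{=} U(1+X)$ gives $E[X] = \frac12(1+E[X])$, so $E[X]=1$, matching the question's calculation. The recurrence itself can be checked by iterating $x \mapsto u(1+x)$ on a large sample; here is a Python sketch (illustrative, not from the original answer):

```python
import numpy as np

rng = np.random.default_rng(1)

# Start many chains at 0 and apply X <- U(1 + X) repeatedly; the fixed point of
# this distributional recurrence is the limiting law of Y_t.  The effect of the
# starting value decays geometrically with the number of iterations.
x = np.zeros(200_000)
for _ in range(100):
    u = rng.uniform(size=x.size)
    x = u * (1.0 + x)

print(x.mean())   # should be close to E[X] = 1
```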

Let $X$ be a random variable with the limiting distribution function $F$ (of density $f$) and let $y \ge 0$. By definition,

$$F(y) = \Pr(U(1+X) \le y) = \Pr(U \le \frac{y}{1+X}).$$

This breaks into two parts depending on whether $y/(1+X)$ exceeds $1$, for when it is less than $1$ this probability equals $y/(1+X)$ and otherwise it equals $1$. We need to integrate this probability over all possible values of $X$, which can range from $0$ on up. This yields

$$F(y) = y \int_{\max(y-1,0)}^\infty \frac{f(x)}{1+x}dx + F(y-1).$$

For $0 \le y \le 1$, this shows $F$ is linear in $y$, whence $f$ is a constant in this interval. Let's rescale $F$ to make this constant unity--we'll renormalize afterwards. Thus, using the rescaled $F$, we obtain the recurrence relation

$$F(y) - F(y-1) = y \int_{\max(y-1,0)}^\infty \frac{f(x)}{1+x}dx.$$

By differentiating both sides with respect to $y$ we obtain a comparable recurrence for the density. It can be simplified to

$$f(y) = 1 - \int_1^y \frac{f(x-1)dx}{x}.$$

The solution breaks into functions defined piecewise on the intervals $[0,1]$, $[1,2]$, and so on. It can be obtained by starting with a constant value of $f$ on $[0,1]$, integrating the right hand side of the recurrence to obtain the values of $f$ on $[1,2]$, and repeating ad infinitum. The resulting function on $[0,\infty)$ is then integrated to find the normalizing constant. Only the first few expressions can be integrated exactly in any nice form--repeated logarithmic integrals show up immediately.
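The original computation was done in Mathematica, but the marching scheme just described is easy to sketch in Python (an illustrative addition). With the rescaled $f \equiv 1$ on $[0,1]$, the recurrence integrates in closed form on $[1,2]$ to $f(y) = 1-\log y$, which the numerics should reproduce:

```python
import numpy as np

h = 1e-4
n_per_unit = round(1.0 / h)
y = np.arange(0.0, 3.0 + h / 2, h)
f = np.ones_like(y)                    # rescaled so that f = 1 on [0, 1]

# March forward: for y > 1, f(y) = 1 - \int_1^y f(x-1)/x dx.  The value f(x-1)
# lies one unit to the left on the grid and is already known, so a simple
# trapezoidal accumulation of the integral works.
for i in range(n_per_unit + 1, y.size):
    g_prev = f[i - 1 - n_per_unit] / y[i - 1]   # integrand at x = y[i-1]
    g_curr = f[i - n_per_unit] / y[i]           # integrand at x = y[i]
    f[i] = f[i - 1] - 0.5 * h * (g_prev + g_curr)

i2 = 2 * n_per_unit
print(f[i2], 1 - np.log(2.0))   # numerical f(2) vs. the closed form 1 - log 2
```

Continuing past $y=2$ produces the repeated logarithmic integrals mentioned above, which no longer have elementary closed forms.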

Here is a histogram of a simulation of the 100th partial product, iterated 10,000 times. Over it is plotted an approximation of $f$ obtained as described above using numerical integration. It closely agrees with the simulation.

[Figure: histogram of the simulated partial products with the numerically computed density $f$ overlaid]


These calculations were carried out in Mathematica (due to the need for repeated numerical integration). A quick simulation of the problem--as originally formulated--can be performed in R as a check. This example performs 100,000 iterations, drawing a Poisson $N$, conditionally drawing the $X_t$, and computing $Y_t$ each time:

s <- 100 # Plays the role of "t" in the question
# Each iteration draws N ~ Poisson(s), places N uniform event times on [0, s],
# and sums exp(-(s - time)) over them.
sim <- replicate(1e5, sum(exp(-s + runif(rpois(1, s), 0, s))))
hist(sim, freq=FALSE, breaks=50)
whuber
  • Awesome. I'm at a conference now and can't look at it in detail, but this looks great. – Neil G Dec 12 '15 at 21:23
  • Is there a minor error in your definition of $Y'_t$? Should the exponent start at index $i$? – Neil G Dec 26 '15 at 06:15
  • Also what is $X$? – Neil G Dec 26 '15 at 06:16
  • @NeilG Thank you for reading this post carefully! I have made edits to (a) explicitly define $X$ as a variable having $F$ for its distribution and (b) to end the sum of exponents at $i$ instead of $N$ in the formula for $Y_t^\prime$. – whuber Dec 26 '15 at 14:03