Events arrive according to a Poisson process, with time measured in minutes. Each event has an accomplishment time that follows a gamma distribution. If $N$ events start within $t$ minutes, what is the distribution of the total time of all $N$ events? Letting $X_i$ be the accomplishment time of event $i$, the total time is a sum of $N$ gamma-distributed variables, but I am not able to proceed from there. Note: all the gamma distributions have the same parameters, and $t$ is the time window in which the $N$ events started. It is not necessary that these events/tasks end within time $t$ too.

- What does "total time" mean - the total of the accomplishment times, or the time at which the last accomplishment is complete, or...? – jbowman Sep 16 '18 at 14:33
- @jbowman The total time in which all the events are completed. – Javeria Habib Sep 16 '18 at 14:51
- @JaveriaHabib jbowman asks probably because if they happen simultaneously, the "total" time will just be the maximum time length. If they come one after another, then "total" actually means total. If it's the first one, I have to change my answer. – Taylor Sep 16 '18 at 15:07
- I understand the title, but the body text is very hard to read. – Sextus Empiricus Sep 16 '18 at 15:17
- @Taylor It does not happen simultaneously. The total time can be more than the time in which the $N$ events start. – Javeria Habib Sep 16 '18 at 15:18
- @MartijnWeterings I am sorry, I am not very expert at writing statements that way. Do tell me if you find any ambiguity. – Javeria Habib Sep 16 '18 at 15:19
- @JaveriaHabib I could try to correct it with my vague interpretation, but I would prefer you did this yourself since I might misunderstand your intentions. It is mostly the grammar that confuses me. To name an example, *"I N number of events start in t minutes, the what will be the distribution of total time of all the N events"* is not a logical sentence at all, if it is even a correct sentence. – Sextus Empiricus Sep 16 '18 at 15:58
- As a start: (1) What does "time intervals of one minute" mean? (2) How does an event *have* a Poisson distribution (what does that mean)? (3) You seem to have a 'single event' (the first sentence is singular, 'I have an event...') but also 'multiple events' ('every event has...'); could you explain this? (4) What does "I N number of events start in t minutes" mean? (5) What does "the total time of all the N events" mean (is it the sum of all individual times or is it something else? how is the starting time $t$ involved)? – Sextus Empiricus Sep 16 '18 at 16:11
- To clarify my question: let us assume that $t = 2$, the first Poisson arrival occurs at time 0.5, and the second at time 1. If $X_1 = 2$ and $X_2 = 2$, the sum of the accomplishment times is 4, but the time at which everything is "done" is 3. Which time is the one you are referring to? – jbowman Sep 16 '18 at 17:53
2 Answers
So $N$, the total number of events, follows a Poisson distribution, which means $$ p(n) = P(N=n) = \frac{e^{-\lambda}\lambda^n}{n!}. $$ Also, they tell you the conditional distribution of the sum $T = \sum_{i=1}^N X_i$: for $n \ge 1$, $$ f_{T\mid N=n}(t\mid n) = \frac{1}{\Gamma(n \alpha )\beta^{n\alpha} }\exp\left[ - \frac{t}{\beta}\right]t^{n\alpha-1}, $$ the $\text{Gamma}(n\alpha, \beta)$ density (when $n = 0$ the sum is identically $0$). I assume they're all iid here.
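As a side check, the fixed-$n$ building block here — that a sum of $n$ iid $\text{Gamma}(\alpha, \beta)$ variables is $\text{Gamma}(n\alpha, \beta)$ — can be verified by simulation (the parameter values below are illustrative, not from the question; $\beta$ is the scale):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions): shape alpha, scale beta, n summands
alpha, beta, n = 2.0, 1.5, 5

# Sum of n iid Gamma(alpha, beta) draws, repeated many times
sums = rng.gamma(alpha, beta, size=(100_000, n)).sum(axis=1)

# A direct Gamma(n*alpha, beta) sample for comparison
direct = rng.gamma(n * alpha, beta, size=100_000)

# Both should match the theoretical moments:
# mean n*alpha*beta = 15.0, variance n*alpha*beta^2 = 22.5
print(sums.mean(), direct.mean())  # both near 15.0
print(sums.var(), direct.var())    # both near 22.5
```

A moment match is of course not a full distributional check, but the shapes agree as well (e.g. compare histograms or quantiles).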
If you want the joint distribution of both random variables, just multiply together. If you want the marginal of $T$, you have to sum out the unwanted $n$ from the joint:
$$ f_T(t) = \sum_{n=1}^{\infty}f_{T\mid N=n}(t\mid n)p(n) = \sum_{n=1}^{\infty} \underbrace{\frac{e^{-\lambda}\lambda^n}{n!}}_{P(N=n)}\underbrace{\frac{1}{\Gamma(n \alpha )\beta^{n\alpha} }\exp\left[ - \frac{t}{\beta}\right]t^{n\alpha-1}}_{f_{T\vert N=n}(t\vert n)} \quad (t > 0), $$ with the event $N=0$ contributing a point mass of $e^{-\lambda}$ at $t = 0$. Does that help?
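Numerically, this mixture can be evaluated by truncating the sum over $n$. A small Monte Carlo sketch, with assumed illustrative parameter values ($\beta$ taken as the scale, matching the density above), checks the mixture CDF against simulation:

```python
import numpy as np
from scipy.stats import gamma, poisson

# Illustrative parameters (assumptions, not from the question); beta is the scale
lam, alpha, beta = 3.0, 2.0, 1.5

def F_T(t, n_max=200):
    """CDF of T = sum_{i=1}^N X_i: mix Gamma(n*alpha, beta) CDFs over n >= 1,
    plus the point mass P(N = 0) sitting at t = 0."""
    n = np.arange(1, n_max + 1)
    return poisson.pmf(0, lam) + np.sum(
        poisson.pmf(n, lam) * gamma.cdf(t, a=n * alpha, scale=beta)
    )

# Monte Carlo check of the same probability
rng = np.random.default_rng(0)
N = rng.poisson(lam, size=50_000)
T = np.array([rng.gamma(alpha, beta, size=k).sum() for k in N])

print(F_T(10.0), np.mean(T <= 10.0))  # the two values should agree closely
```

The truncation point `n_max` only needs to cover the bulk of the Poisson mass; for moderate $\lambda$ a few dozen terms already suffice.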

- Why are we using $n\alpha$ when we are multiplying the distribution with that of the Poisson? – Javeria Habib Sep 16 '18 at 14:58
- I am using the fact that if $X_1, X_2, \ldots, X_n \overset{iid}{\sim} \text{Gamma}(\alpha, \beta)$, then $\sum_i X_i \sim \text{Gamma}(n \alpha, \beta)$, but this is for a fixed number of summands. – Taylor Sep 16 '18 at 15:05
- That's the main issue: the number of summands $N$ is a random number from a Poisson distribution. – Javeria Habib Sep 16 '18 at 15:21
- @JaveriaHabib You get: $$f_{\text{total}}(t) = \sum_{n \text{ in all possible } N} f_{\text{sum of } n \text{ gamma}}(t) P_{\text{Poisson}}(n)$$ In words: the density to get time $t$ is the sum of the density for all different situations $n$ (multiplied by the probability for each $n$). – Sextus Empiricus Sep 16 '18 at 16:24
In addition to the other excellent answer, I will try another approach. Such problems are often attacked with moment generating functions.
Let $X_1, \dotsc, X_n$ be iid and $N \sim \text{Po}(\lambda)$, independent of the $X_i$. We are interested in the sum of a random number of terms, $X_1+\dotsm+X_N$. Let the moment generating function (mgf) of $X_1$ be $M_X(t)$, so that for a fixed number of terms $M_n(t)=M_{X_1+\dotsm+X_n}(t)= M_X(t)^n$. Then we can calculate the mgf of the sum of a random number of terms by conditioning on $N$: $$\DeclareMathOperator{\E}{\mathbb{E}} M(t)=M_{X_1+\dotsm+X_N}(t)=\E\left\{ \E\left[ e^{t(X_1+\dotsm+X_N)} \mid N\right]\right\} \\ = \sum_{n=0}^\infty M_X(t)^n e^{-\lambda} \frac{\lambda^n}{n!}=\sum_{n=0}^\infty e^{-\lambda}\frac{(\lambda M_X(t))^n}{n!} = e^{-\lambda} e^{\lambda M_X(t)}. $$ The mgf of the gamma distribution (with shape $\alpha$ and rate $\beta$) is $$ M_X(t)= (1- t/\beta)^{-\alpha} $$ for $t< \beta$. Inserting this, we finally find that the mgf of the random sum is $$ M(t)= e^{-\lambda} e^{\lambda (1-t/\beta)^{-\alpha}} = \exp\left[\lambda\left((1-t/\beta)^{-\alpha}-1\right)\right]. $$ I cannot recognize that as the mgf of a known distribution, but one can get good approximations starting with the mgf. One possibility is the saddlepoint approximation, see How does saddlepoint approximation work? (and search this site for examples).
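To make this concrete, the cumulant generating function $K(t)=\log M(t)=\lambda\left((1-t/\beta)^{-\alpha}-1\right)$ yields the moments directly: $K'(0)=\lambda\alpha/\beta$ and $K''(0)=\lambda\alpha(\alpha+1)/\beta^2$. A quick simulation sketch (parameter values are illustrative assumptions; $\beta$ here is the rate, matching the mgf above) checks these against the empirical mean and variance:

```python
import numpy as np

# Illustrative parameters (assumed); beta is the *rate*, as in (1 - t/beta)^(-alpha)
lam, alpha, beta = 3.0, 2.0, 1.5

# Moments read off the cgf K(t) = lam * ((1 - t/beta)**(-alpha) - 1)
mean_theory = lam * alpha / beta                    # K'(0)  = 4.0
var_theory = lam * alpha * (alpha + 1) / beta**2    # K''(0) = 8.0

rng = np.random.default_rng(0)
N = rng.poisson(lam, size=100_000)
# NumPy's gamma sampler takes the scale, so rate beta -> scale 1/beta
T = np.array([rng.gamma(alpha, 1.0 / beta, size=k).sum() for k in N])

print(mean_theory, T.mean())  # both near 4.0
print(var_theory, T.var())    # both near 8.0
```

The same cgf is the starting point for the saddlepoint approximation mentioned above, since that method works entirely from $K$, $K'$, and $K''$.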
