
Let $X_1$ and $X_2$ be independent and identically distributed exponential random variables with rate $\lambda$. Let $S_2 = X_1 + X_2$.

Q: Show that $S_2$ has PDF $f_{S_2}(x) = \lambda^2 x \text{e}^{-\lambda x},\, x\ge 0$.

Note that if events occurred according to a Poisson Process (PP) with rate $\lambda$, $S_2$ would represent the time of the 2nd event.

Alternate approaches are appreciated. The approaches provided are commonly used when learning queueing theory & stochastic processes.


Recall that the Exponential distribution is a special case of the Gamma distribution (with shape parameter $1$). I've since learned there is a [more general question on sums of Gamma random variables](https://stats.stackexchange.com/questions/72479) whose answers can be applied here.

SecretAgentMan
  • This question is a very special case (and one of the simplest possible examples) of a sum of Gamma distributions. (The Exponential is a Gamma distribution with a shape parameter of $1.$) Thus, you could apply any of the answers at https://stats.stackexchange.com/questions/72479. – whuber Aug 12 '19 at 12:28
  • Thank you. I was unaware of [that more general question](https://stats.stackexchange.com/q/72479/177387), though I did know the Exponential is a Gamma distribution with a shape parameter of 1. I hope you'll agree this Q/A is ok as-is and shouldn't be deleted. This is a very frequent question in some engineering disciplines and is certainly more accessible than jumping straight into adding Gamma distributions. – SecretAgentMan Aug 12 '19 at 21:38
  • @whuber I've updated the question to specifically mention the more general question. Thank you. – SecretAgentMan Aug 12 '19 at 21:40
  • For the reasons you gave, and because you have offered a clear account of solutions that work specifically in this case, I have not voted to close this as a duplicate. – whuber Aug 12 '19 at 21:46
  • @whuber Thank you. I'll go with whatever you & the community think best. It was an honest mistake -- I didn't see the other post. FWIW, I've updated the question a bit more to highlight some context for this approach (queueing theory & stochastic processes). – SecretAgentMan Aug 12 '19 at 21:48
  • I think the voting on your question and your answer has clearly indicated what the community thinks of this thread. :-) – whuber Aug 12 '19 at 21:52

1 Answer


Conditioning Approach
Condition on the value of $X_1$. Start with the cumulative distribution function (CDF) for $S_2$.

$\begin{align} F_{S_2}(x) &= P(S_2\le x) \\ &= P(X_1 + X_2 \le x) \\ &= \int_0^\infty P(X_1+X_2\le x|X_1=x_1)f_{X_1}(x_1)dx_1 \\ &= \int_0^x P(X_1+X_2\le x|X_1=x_1)\lambda \text{e}^{-\lambda x_1}dx_1 \\ &= \int_0^x P(X_2 \le x - x_1)\lambda \text{e}^{-\lambda x_1}dx_1 \\ &= \int_0^x\left(1-\text{e}^{-\lambda(x-x_1)}\right)\lambda \text{e}^{-\lambda x_1}dx_1\\ &=(1-e^{-\lambda x}) - \lambda x e^{-\lambda x}\end{align} $
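As a quick numerical sanity check (mine, not part of the original derivation), the conditioning integral above can be evaluated with `scipy.integrate.quad` and compared against the closed-form CDF; the rate $\lambda$ and evaluation point $x$ below are arbitrary choices:

```python
import numpy as np
from scipy.integrate import quad

lam = 1.5  # arbitrary rate lambda
x = 2.0    # arbitrary point at which to evaluate the CDF

# Integrand from the conditioning step: P(X_2 <= x - x_1) * f_{X_1}(x_1)
def integrand(x1):
    return (1 - np.exp(-lam * (x - x1))) * lam * np.exp(-lam * x1)

numeric, _ = quad(integrand, 0, x)
closed_form = (1 - np.exp(-lam * x)) - lam * x * np.exp(-lam * x)

# The two should agree to quadrature precision
assert abs(numeric - closed_form) < 1e-8
```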

This is the CDF of the distribution. To get the PDF, differentiate with respect to $x$ (see here).

$$f_{S_2}(x) = \lambda^2 x \text{e}^{-\lambda x} \quad\square$$

This is an Erlang$(2,\lambda)$ distribution (see here).
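One hedged way to confirm the Erlang$(2,\lambda)$ identification is to compare the derived PDF with SciPy's Gamma PDF, since Erlang$(2,\lambda)$ is a Gamma distribution with shape $2$ and scale $1/\lambda$ (the rate below is an arbitrary choice):

```python
import numpy as np
from scipy import stats

lam = 2.0  # arbitrary rate lambda
x = np.linspace(0.01, 5.0, 50)

# PDF derived above: lambda^2 * x * exp(-lambda * x)
pdf_derived = lam**2 * x * np.exp(-lam * x)

# Erlang(2, lambda) == Gamma(shape=2, scale=1/lambda)
pdf_scipy = stats.gamma.pdf(x, a=2, scale=1 / lam)

assert np.allclose(pdf_derived, pdf_scipy)
```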


General Approach
Direct integration relying on the independence of $X_1$ & $X_2$. Again, start with the cumulative distribution function (CDF) for $S_2$.

$\begin{align} F_{S_2}(x) &= P(S_2\le x) \\ &= P(X_1 + X_2 \le x) \\ &= P\left( (X_1,X_2)\in A \right) \quad \quad \text{(See figure below)}\\ &= \int\int_{(x_1,x_2)\in A} f_{X_1,X_2}(x_1,x_2)dx_1 dx_2 \\ &(\text{Joint distribution is the product of marginals by independence}) \\ &= \int_0^{x} \int_0^{x-x_{2}} f_{X_1}(x_1)f_{X_2}(x_2)dx_1 dx_2\\ &= \int_0^{x} \int_0^{x-x_{2}} \lambda \text{e}^{-\lambda x_1}\lambda \text{e}^{-\lambda x_2}dx_1 dx_2\\ \end{align}$

Evaluating the inner integral and then the outer integral recovers the same CDF as before, $F_{S_2}(x) = (1-\text{e}^{-\lambda x}) - \lambda x \text{e}^{-\lambda x}$. Since this is the CDF, differentiation gives the PDF, $f_{S_2}(x) = \lambda^2 x \text{e}^{-\lambda x} \quad\square$

(Figure: the integration region $A = \{(x_1,x_2) : x_1, x_2 \ge 0,\ x_1 + x_2 \le x\}$, the triangle below the line $x_1 + x_2 = x$ in the first quadrant.)
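The double integral over the region $A$ can likewise be checked numerically with `scipy.integrate.dblquad` (a sketch with arbitrary $\lambda$ and $x$, not part of the original answer):

```python
import numpy as np
from scipy.integrate import dblquad

lam = 1.0  # arbitrary rate lambda
x = 1.5    # arbitrary point at which to evaluate the CDF

# Joint density f(x1, x2) = lam*exp(-lam*x1) * lam*exp(-lam*x2),
# the product of the marginals by independence.
def joint(x1, x2):
    return lam * np.exp(-lam * x1) * lam * np.exp(-lam * x2)

# Outer variable x2 runs from 0 to x; inner variable x1 from 0 to x - x2,
# which traces out the triangular region A.
cdf_numeric, _ = dblquad(joint, 0, x, 0, lambda x2: x - x2)
cdf_closed = (1 - np.exp(-lam * x)) - lam * x * np.exp(-lam * x)

assert abs(cdf_numeric - cdf_closed) < 1e-8
```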


MGF Approach
This approach uses the moment generating function (MGF).

$\begin{align} M_{S_2}(t) &= \text{E}\left[\text{e}^{t S_2}\right] \\ &= \text{E}\left[\text{e}^{t(X_1 + X_2)}\right] \\ &= \text{E}\left[\text{e}^{t X_1 + t X_2}\right] \\ &= \text{E}\left[\text{e}^{t X_1} \text{e}^{t X_2}\right] \\ &= \text{E}\left[\text{e}^{t X_1}\right]\text{E}\left[\text{e}^{t X_2}\right] \quad \text{(by independence)} \\ &= M_{X_1}(t)M_{X_2}(t) \\ &= \left(\frac{\lambda}{\lambda-t}\right)\left(\frac{\lambda}{\lambda-t}\right) \quad \quad t<\lambda\\ &= \frac{\lambda^2}{(\lambda-t)^2} \quad \quad t<\lambda \end{align}$

While this approach does not by itself yield the PDF, once the MGF is recognized as that of a known distribution (here, the Erlang$(2,\lambda)$), the PDF is also known.
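The MGF of the claimed PDF can also be verified symbolically. Writing $s = \lambda - t > 0$ (i.e., assuming $t < \lambda$), the integral $\int_0^\infty \text{e}^{tx}\,\lambda^2 x \text{e}^{-\lambda x}\,dx = \lambda^2\int_0^\infty x\text{e}^{-sx}\,dx$ should equal $\lambda^2/s^2 = \lambda^2/(\lambda-t)^2$. A sketch with SymPy (variable names are mine):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
lam = sp.symbols('lam', positive=True)
s = sp.symbols('s', positive=True)  # s = lam - t, positive because t < lam

# MGF of the claimed PDF lambda^2 * x * exp(-lambda*x),
# after substituting s = lam - t into the exponent
mgf = sp.integrate(lam**2 * x * sp.exp(-s * x), (x, 0, sp.oo))

# Should match lambda^2 / (lambda - t)^2 written in terms of s
assert sp.simplify(mgf - lam**2 / s**2) == 0
```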

SecretAgentMan
  • You wrote both the question and the answer. What is your point, if I may ask? – Xi'an Oct 14 '18 at 11:58
  • @Xi'an, I thought SE encouraged asking the question and answering it...I can screenshot where SE seems to encourage that for you if you want. I've seen a lot of basic questions repeatedly asked and I've been thinking about posting some specific approaches to refer people to. I wasn't able to find something like this and I can refer people to this for a variety of things. If the CV community really hates this post that much, I will voluntarily delete it. – SecretAgentMan Oct 15 '18 at 14:31
  • @Xi'an, Respectfully, I believe you both asked and answered a question [here](https://stats.stackexchange.com/questions/122917/when-if-ever-is-a-median-statistic-a-sufficient-statistic). – SecretAgentMan Oct 15 '18 at 17:24
  • @Xi'an You may wish to read https://stats.stackexchange.com/help/self-answer – Sycorax Oct 15 '18 at 17:37
  • I don't have enough karma to comment but I think this is a worthwhile comment so I post here. For the General Approach, can you show the steps (or provide a proof) going from the double integral to the final expression? I derived it myself and get the same answer as Wolfram. [Link to Wolfram Answer](https://www.wolframalpha.com/input/?i=double+integral&assumption=%7B%22F%22,+%22DoubleIntegral%22,+%22integrand%22%7D+-%3E%22%5Clambda+e%5E(-%5Clambda+g)+*+%5Clambda+*+e%5E(-%5Clambda+h)%22&assumption=%7B%22F%22,+%22DoubleIntegral%22,+%22intvariable1%22%7D+-%3E%22g%22&assumption=%7B%22F%22,+%22DoubleI – Avedis Nov 27 '18 at 21:04
  • @SecretAgentMan: about MGF approach: can you actually derive pdf from MGF rather than moments? – Alex May 15 '20 at 15:38
  • @SecretAgentMan: in (I) I'm a bit confused about the condition $X_1=x_1$, because $X_1$ is a continuous rv. Shouldn't it be $X_1 \leq x_1$? – Alex May 23 '20 at 17:01
  • @Alex good question. I wasn't thinking about getting the PDF analytically from the MGF. Instead, if you identify the MGF, then you've solved the problem (see my edit). – SecretAgentMan May 24 '20 at 12:59
  • @Alex you are wise to pay attention to such details. However, the integral is permitting the condition $X_1 = x_1$ without violating rules on continuous RVs since it is iterating across a step size of $dx$ (from a Riemann sum perspective). Since the integration is pushing $dx \rightarrow 0$ in a limit sense, I believe the equality is correct. If I altered the condition to be $X_1\le x_1$, I'd have to adjust the expression. – SecretAgentMan May 24 '20 at 13:02
  • @SecretAgentMan thanks, for $X_1$ I think it's my confusion. The expressions$P(X – Alex May 24 '20 at 19:12
  • @SecretAgentMan: can you also confirm that the conditionality $P(X_1+X_2 – Alex May 24 '20 at 19:17