
[screenshot of the problem statement, not shown]

I am confused about this problem. My professor gave this as the solution:

$S_{N_{T}}$ is the time of the last arrival in $[0, t]$. For $0 < x \leq t, P(S_{N_{T}} \leq x) \sum_{k=0}^{\infty} P(S_{N_{T}} \leq x | N_{T}=k)P(N_{T}=k) $

$= \sum_{k=0}^{\infty} P(S_{N_{T}} \leq x | N_{T}=k) * \frac{e^{- \lambda t}*(\lambda t)^k}{k!}$.

Let $M=\max(S_1, S_2, \ldots, S_k)$ where the $S_i$ are i.i.d. with $S_i \sim \text{Uniform}[0,t]$ for $i = 1,2,\ldots, k$.

So, $P(S_{N_{T}} \leq x) = \sum_{k=0}^{\infty} P(M \leq x)\frac{e^{- \lambda t}*(\lambda t)^k}{k!} = \sum_{k=0}^{\infty} (\frac{x}{t})^k \frac{e^{- \lambda t}*(\lambda t)^k}{k!} = e^{- \lambda t} \sum_{k=0}^{\infty} \frac{(\lambda t)^k}{k!} = e^{- \lambda t}e^{- \lambda x} = e^{\lambda(x-t)}$

If $N_t = 0$, then $S_{N_{T}} = S_0 =0$. This occurs with probability $P(N_t = 0) = e^{- \lambda t}$.

Therefore, the cdf of $S_{N_{T}}$ is: $P(S_{N_{T}} \leq x) = \begin{cases} 0 & x < 0 \\ e^{\lambda (x-t)} & 0\leq x < t \\ 1 & x \geq t \end{cases}$

I don't really understand the step of introducing the variable $M$, the maximum of $k$ i.i.d. random variables, in order to solve the problem. Any help would be greatly appreciated, thank you!

Yves
Emily

1 Answer


In the future, please be more careful when asking your question. The screenshot of the problem is missing, and with it much of the context: how do you expect others to know how $N_t$ and $S_{N_t}$ are defined? Furthermore, there are careless typos throughout your professor's solution (writing $T$ in places where $t$ is meant, missing equals signs, misplaced parentheses, and mathematical slips such as $(\lambda t)^k$ instead of $(\lambda x)^k$, and $e^{-\lambda t} e^{-\lambda x} = e^{\lambda (x-t)}$, etc.). If these are transcriptions of your professor's handwritten notes, such things can happen, but you should try your best to catch them when studying.


I will assume that you have a Poisson process with rate $\lambda$, and $N_t$ is defined as the number of arrivals in time interval $[0, t]$, and $S_i$ is the time of the $i$th arrival.

$$P(S_{N_t} \le x) = \sum_{k=0}^\infty P(S_{N_t} \le x \mid N_t = k) P(N_t = k).$$ We know $P(N_t = k) = e^{-\lambda t} (\lambda t)^k/k!$.

The other term is the probability that the last arrival in $[0, t]$ happens before time $x$, given that there are $k$ arrivals in $[0, t]$. Your professor uses a nontrivial result about Poisson processes to compute this term.

Conditioned on the event $N_t = k$ (i.e. there are $k$ arrivals in $[0, t]$), the arrival times $(S_1, S_2, \ldots, S_k)$ are distributed as the order statistics of $k$ i.i.d. $\text{Uniform}[0,t]$ random variables.
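You can see this order-statistics property empirically. Here is a minimal simulation sketch (not from the original post; the values of `lam`, `t`, `k`, and the sample sizes are arbitrary illustrative choices): it generates Poisson processes, keeps only runs with exactly $k$ arrivals, and compares the last arrival to the maximum of $k$ i.i.d. uniforms.

```python
import random

def poisson_arrivals(lam, t, rng):
    """Arrival times in [0, t], built from i.i.d. Exponential(lam) gaps."""
    arrivals, s = [], 0.0
    while True:
        s += rng.expovariate(lam)
        if s > t:
            return arrivals
        arrivals.append(s)

rng = random.Random(0)
lam, t, k = 2.0, 5.0, 10  # illustrative parameters

# Last arrival S_{N_t}, conditioned on exactly k arrivals in [0, t] ...
cond_max = []
while len(cond_max) < 2000:
    a = poisson_arrivals(lam, t, rng)
    if len(a) == k:
        cond_max.append(a[-1])

# ... versus the maximum of k i.i.d. Uniform[0, t] draws.
unif_max = [max(rng.uniform(0, t) for _ in range(k)) for _ in range(2000)]

mean_cond = sum(cond_max) / len(cond_max)
mean_unif = sum(unif_max) / len(unif_max)
# The max of k uniforms on [0, t] has mean k*t/(k+1).
print(mean_cond, mean_unif, k * t / (k + 1))
```

Both empirical means should land close to the theoretical value $kt/(k+1)$, consistent with the two random variables having the same distribution.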

See here for a reference, although if your professor is using this result, you may have already encountered it in class somewhere.

In particular, conditioned on $N_t = k$, the last arrival $S_{N_t}$ has the same distribution as the maximum of $k$ i.i.d. $\text{Uniform}[0,t]$ random variables $U_1, \ldots, U_k$. (I use $U_i \sim \text{Uniform}[0,t]$ to not conflate with the $S_i$ which are already defined as the $i$th arrival in the Poisson process.)

So, the term $P(S_{N_t} \le x \mid N_t = k) = P(M \le x)$ where $M = \max\{U_1, \ldots, U_k\}$.
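Spelling out that step: since the $U_i$ are independent with $P(U_i \le x) = x/t$ for $0 \le x \le t$,

$$P(M \le x) = P(U_1 \le x, \ldots, U_k \le x) = \prod_{i=1}^{k} P(U_i \le x) = \left(\frac{x}{t}\right)^{k}.$$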

Thus, $$P(S_{N_t} \le x) = \sum_{k=0}^\infty (x/t)^k e^{-\lambda t} (\lambda t)^k/k! = e^{-\lambda t} \sum_{k=0}^\infty (\lambda x)^k/k! = e^{-\lambda t} e^{\lambda x} = e^{\lambda(x-t)}$$ for $0 \le x \le t$.
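As a sanity check, this CDF (including the point mass $e^{-\lambda t}$ at $x = 0$ from the event $N_t = 0$) can be verified by Monte Carlo. A minimal sketch, with `lam`, `t`, and the grid of $x$ values chosen arbitrarily for illustration:

```python
import math
import random

def last_arrival(lam, t, rng):
    """S_{N_t}: time of the last Poisson arrival in [0, t] (0 if no arrivals)."""
    last, s = 0.0, 0.0
    while True:
        s += rng.expovariate(lam)  # i.i.d. Exponential(lam) interarrival gaps
        if s > t:
            return last
        last = s

rng = random.Random(1)
lam, t, n = 1.5, 2.0, 100_000  # illustrative parameters
samples = [last_arrival(lam, t, rng) for _ in range(n)]

results = {}
for x in (0.0, 0.5, 1.0, 1.5, 2.0):
    empirical = sum(s <= x for s in samples) / n
    results[x] = empirical
    theoretical = math.exp(lam * (x - t))
    print(f"x={x:.1f}  empirical={empirical:.4f}  theory={theoretical:.4f}")
```

At $x = 0$ the empirical value should be near $e^{-\lambda t}$ (the probability of no arrivals), and at $x = t$ it should be exactly $1$, matching the piecewise CDF.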

angryavian