
Suppose $X_{i}$ for $i=1,2,\dots,n$ are independent random variables, where $X_{i}$ is exponentially distributed with parameter $\lambda_{i}$. Let $Z=\min(X_{1},X_{2},\dots,X_{n})$ and let $J=j$, where $j$ is the index of the random variable that attains the minimum $Z$. How can we prove that $Z$ and $J$ are independent?

My attempt at a solution

To prove that $J$ and $Z$ are independent, it suffices to show $P(J=j, Z\geq t)=P(J=j)\,P(Z\geq t)$ for all $j$ and $t$.

I start by trying to find the joint probability of $J=k$ and $Z\geq t$: $$P(J=k, Z\geq t)=P(J=k\mid Z\geq t)\,P(Z\geq t)$$ Since $Z$ is the minimum of all the random variables, the event $Z\geq t$ means every $X_{i}$ is greater than or equal to $t$, which gives us $$P(Z\geq t)=P(X_{1}\geq t, X_{2}\geq t,\dots,X_{n}\geq t)$$

And since the $X_{i}$ are independent random variables, we have

$$P(Z\geq t)=P(X_{1}\geq t)P(X_{2}\geq t)...P(X_{n}\geq t)$$

I can't seem to figure out how I should proceed next. I will particularly have to work with the conditional term and see if I can reduce it to just $P(J=k)$, or maybe I should plug the probabilities of the exponential random variables into the last equation. Can anybody help or give me any hints?
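For reference, carrying out the plug-in suggested above with the exponential survival function $P(X_{i}\geq t)=e^{-\lambda_{i}t}$ gives

$$P(Z\geq t)=\prod_{i=1}^{n} e^{-\lambda_{i}t}=e^{-\left(\sum_{i}\lambda_{i}\right)t}$$

so $Z$ is itself exponentially distributed with rate $\sum_{i}\lambda_{i}$.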

redenzione11
  • "But Z is the minimum of all the random variables so every random variable Xi is greater than or equal to t..." I believe this is an error. $t$ is totally arbitrary, it could very well be larger than the minimum. – Matthew Drury Oct 05 '17 at 01:58
  • @MatthewDrury Yeah t is arbitrary but how does that effect the subsequent equations ? Those will be the same –  redenzione11 Oct 05 '17 at 02:03
  • Oh shoot, you're right. My bad, I confused myself. – Matthew Drury Oct 05 '17 at 02:04
  • We can then plug in values for each $P(X_{i}\geq t)$ and multiply them and that gives us another exponential distribution for $P(Z\geq t)$ . Do you have any ideas about how I should proceed. –  redenzione11 Oct 05 '17 at 02:15
  • 1
    The $X_i$ are waiting times for Poisson$(\lambda_i)$ processes. Their minimum is the waiting time to the first of *any* of those processes. Another way to describe this is to begin with Poisson$(\lambda)$ process and independently assign each point randomly to one of the $n$ indexes $1\ldots n$ with probability $\lambda_i/\lambda$. This creates a set of $n$ independent Poisson processes. Their intensities are $\lambda_i/\lambda\times \lambda=\lambda_i$. Because the assignments are independent of the points, the assignment of the first point $Z$ is independent of the minimum waiting time $X$. – whuber Oct 05 '17 at 15:46
  • A detailed explanation of all the foregoing concepts and results appears at https://stats.stackexchange.com/a/215253/919. – whuber Oct 05 '17 at 15:48

1 Answer


The joint probability of $Z=z$ and $J=j$ is

$$\begin{align} P(Z=z , J=j) &= f_j(z) \prod_{i \neq j} \left(1-F_i(z)\right)\\& = \lambda_je^{-\lambda_j z} \prod_{i \neq j} \left(1 - (1-e^{-\lambda_i z})\right) \\&= \lambda_j \prod_{i} e^{-\lambda_i z}\\&=\lambda_j e^{- \left( \sum_{i}\lambda_i \right) z} \end{align}$$

The joint probability of $X_j=x$ and $J=j$ works out the same way:

$$P(X_j=x , J=j) = \lambda_j e^{- \left( \sum_{i}\lambda_i \right) x}$$

We can find the probability of $J=j$ by integrating the previous result over $x$:

$$\begin{align} P(J=j) &=\int_0^\infty P(X_j=x,J=j)\, dx \\ &= \frac{\lambda_j}{-\sum_{i}\lambda_i } e^{- \left( \sum_{i}\lambda_i \right) x} \Big|_0^\infty \\ &= \frac{\lambda_j}{\sum_{i}\lambda_i } \end{align} $$
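As a quick sanity check, these probabilities sum to one over $j$:

$$\sum_j P(J=j) = \sum_j \frac{\lambda_j}{\sum_{i}\lambda_i} = 1$$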

Thus $P(Z=z \vert J=j) = \frac{P(Z=z , J=j)}{P(J=j)}$ does not depend on $j$. Namely, we get:

$$P(Z=z \vert J=j) = \left( \sum_{i}\lambda_i \right) e^{- \left( \sum_{i}\lambda_i \right) z} $$

and the conditional distribution is independent of the index $j$, which establishes the independence of $Z$ and $J$.
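Equivalently, summing the joint density over $j$ gives the marginal of $Z$, which makes the factorization explicit:

$$P(Z=z) = \sum_j \lambda_j e^{- \left( \sum_{i}\lambda_i \right) z} = \left( \sum_{i}\lambda_i \right) e^{- \left( \sum_{i}\lambda_i \right) z}$$

so $P(Z=z, J=j) = P(J=j)\,P(Z=z)$.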


checking by computation:

# lambda_i
lambda <- c(1, 2, 3)

# sampling: one column of exponential draws per rate
n <- 100000
samples <- sapply(lambda, function(l) rexp(n, l))

# minimum and index of the minimum in each row
mins <- apply(samples, 1, min)
id   <- apply(samples, 1, which.min)

# histogram of the minimum conditional on J = j
j <- 1
hist(mins[id == j], breaks = seq(0, max(mins) + 0.05, 0.05), col = 2)

# overlay the model density, scaled to the histogram bin counts
x <- seq(0, 4, 0.05)
lines(x, 0.05 * sum(id == j) * sum(lambda) * exp(-sum(lambda) * x))

which seems to work well
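An analogous check of the marginal result $P(J=j)=\lambda_j/\sum_i \lambda_i$ can be sketched in Python (NumPy assumed available; the variable names are illustrative, not from the R snippet above):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = np.array([1.0, 2.0, 3.0])
n = 100_000

# one column of exponential draws per rate (NumPy parameterizes by scale = 1/rate)
samples = rng.exponential(scale=1.0 / lam, size=(n, len(lam)))

# index of the minimum in each row, i.e. a draw of J
j = samples.argmin(axis=1)

# empirical vs. theoretical P(J = j)
empirical = np.bincount(j, minlength=len(lam)) / n
theoretical = lam / lam.sum()
print(empirical)    # close to 1/6, 2/6, 3/6
print(theoretical)
```

The empirical frequencies should match $\lambda_j/\sum_i\lambda_i$ up to Monte Carlo error.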

Sextus Empiricus
  • How did $P(J=j|Z=z)$ become $f_{j}(Z)$ ? –  redenzione11 Oct 05 '17 at 14:42
  • If j is the lowest number and has the value Z... then the probability is the probability for the j-th value being Z (thus $f_j$) multiplied with the probability that the other variables are larger (the $1-F_i$ terms with $i \neq j$). – Sextus Empiricus Oct 05 '17 at 18:14