
I have this doubt:

Consider $X \sim N(\mu_X,\sigma^2)$, $Y \sim N(\mu_Y,\sigma^2)$, and $Z = X - Y$.

I know that $E(Z)=E(X)-E(Y)=\mu_X-\mu_Y$ because the expected value is a linear operator.

And I know that $V(Z)=V(X)-V(Y)=0$

But which distribution does $Z$ have? Can I say that $Z$ has a normal distribution because $X$ and $Y$ have a normal distribution?

  • I believe the question you are really asking concerns how to compute variances of differences of (independent) random variables. That has good answers at http://stats.stackexchange.com/questions/26886. – whuber Jun 23 '15 at 20:16
  • That's true too. And I just assumed that $V(X-Y)=V(X)-V(Y)$ which isn't right. Thank you for your advice. – Élio Pereira Jun 23 '15 at 20:21

2 Answers


The variance of $Z$ is equal to $V(X) + V(Y)$ if $X$ and $Y$ are independent, because $V(-Y) = V(Y)$. (If $X$ and $Y$ are not independent, you have to subtract the covariance term: $V(Z) = V(X) + V(Y) - 2\operatorname{Cov}(X,Y)$.) Thus $Z$ has a normal distribution with $\mu_Z = \mu_X - \mu_Y$ and $\sigma_Z^2 = 2\sigma^2$.
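The independent case is easy to check by simulation. A sketch using NumPy (the parameter values $\mu_X = 3$, $\mu_Y = 1$, $\sigma = 2$ are arbitrary choices for illustration, not from the question):

```python
import numpy as np

# Illustrative parameters (arbitrary): mu_X = 3, mu_Y = 1, sigma = 2
mu_x, mu_y, sigma = 3.0, 1.0, 2.0
rng = np.random.default_rng(0)
n = 1_000_000

x = rng.normal(mu_x, sigma, n)
y = rng.normal(mu_y, sigma, n)  # drawn independently of x
z = x - y

mean_z = z.mean()  # should be close to mu_x - mu_y = 2
var_z = z.var()    # should be close to 2 * sigma**2 = 8
```

With a million draws, both sample moments land within a few hundredths of the theoretical values $\mu_Z = 2$ and $\sigma_Z^2 = 8$.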

bassir

You should unaccept bassir's answer because it only addresses the case in which X and Y are independent, and his statement that Z is Normal may not be true if X and Y are not independent.

If X and Y are independent, then EZ and Var(Z) are as stated by bassir.

If X and Y are jointly Normal (i.e., bivariate Normal), but not necessarily independent, then Z is a Normal random variable, EZ is the same as in the independent case, and $\operatorname{Var}(Z) = 2\sigma^2 - 2\operatorname{Cov}(X,Y) = 2\sigma^2(1 - \rho)$, where $\rho$ is the correlation coefficient between X and Y.
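A quick simulation of the jointly-Normal correlated case (a sketch; the values $\rho = 0.5$ and $\sigma = 1$ are arbitrary choices for illustration):

```python
import numpy as np

rho, sigma = 0.5, 1.0  # arbitrary illustration values
cov = sigma**2 * np.array([[1.0, rho],
                           [rho, 1.0]])
rng = np.random.default_rng(1)
n = 1_000_000

# Draw correlated (X, Y) pairs from a bivariate Normal
xy = rng.multivariate_normal([0.0, 0.0], cov, n)
z = xy[:, 0] - xy[:, 1]

var_z = z.var()  # should be close to 2 * sigma**2 * (1 - rho) = 1.0
```

Note that positive correlation shrinks the variance of the difference below the $2\sigma^2$ of the independent case, as the formula predicts.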

If X and Y are not independent, Z does not necessarily even have a Normal distribution. EZ and Var(Z) are per the immediately above case, which assumes X and Y are jointly Normal, but in fact Z may not be Normal. There are several such examples in http://www.amazon.com/Introduction-Probability-Theory-Applications-Vol/dp/0471257095 .
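One standard construction makes this concrete (this particular example is my illustration, not taken from the cited book): take $X \sim N(0,1)$ and set $Y = X$ when $|X| \le c$, $Y = -X$ otherwise. By the symmetry of the Normal density, $Y$ is also $N(0,1)$, but $Z = X - Y$ equals $0$ with positive probability, so $Z$ has a point mass and cannot be Normal:

```python
import numpy as np

c = 1.0
rng = np.random.default_rng(2)
n = 1_000_000

x = rng.normal(0.0, 1.0, n)
y = np.where(np.abs(x) <= c, x, -x)  # sign-flip the tails; y is still N(0,1)
z = x - y                            # z = 0 whenever |x| <= c, else z = 2x

# Fraction of exact zeros: approx. P(|X| <= 1) = 0.683, a point mass,
# so z is certainly not Normal
frac_zero = (z == 0).mean()
```

Each marginal is Normal here, yet the pair is not bivariate Normal, which is exactly the gap bassir's answer glosses over.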

Mark L. Stone
  • +1 There are examples in answers here on CV also (up to a flip of sign on the second variate to convert a sum to a difference). E.g. there are some [here](http://stats.stackexchange.com/questions/120861/example-of-two-correlated-normal-variables-whose-sum-is-not-normal/120900#120900); and another [here](http://stats.stackexchange.com/questions/125648/transformation-chi-squared-to-normal-distribution/125653#125653). Nice images [here](http://stats.stackexchange.com/questions/30159/is-it-possible-to-have-a-pair-of-gaussian-random-variables-for-which-the-joint-d) of bivariates with the property. – Glen_b Jun 24 '15 at 01:06