
[Image: problem statement giving $W \sim N(65, 1^2)$ and $M \sim N(68, 3^2)$ and asking for the distribution of the difference $W - M$.]

I have one question about this. I know that if we have $\mathrm{X}_1,\mathrm{X}_2,\ldots,\mathrm{X}_n$ independent and normally distributed random variables, then the sum $\mathrm{X}_1+\mathrm{X}_2+\ldots+\mathrm{X}_n$ has the normal distribution with mean $M_1+M_2+\ldots+M_n$ and variance $\sigma^2_1 + \ldots + \sigma^2_n$.

Why is it that in this problem the mean of the difference $W-M$ is obtained by subtraction, while the variance is obtained by addition? Thank you.
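For concreteness, here is a minimal simulation sketch (Python/NumPy, using the parameters of the problem above) that reproduces the behavior I am asking about:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

W = rng.normal(65, 1, size=n)  # W ~ N(65, 1^2)
M = rng.normal(68, 3, size=n)  # M ~ N(68, 3^2)

D = W - M
print(D.mean())  # ~ -3.0: means subtract (65 - 68)
print(D.var())   # ~ 10.0: variances add (1^2 + 3^2)
```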

Andrew
  • If $M\sim N(68,3^2)$ then $-M\sim N(-68,3^2)$. Therefore $W-M=W+(-M)\sim N(65+(-68),1^2+3^2)$. That is, the values $M_1,M_2,\ldots,M_n$ are not necessarily positive. I hope this helps. – Procrastinator Apr 21 '12 at 14:54
  • @Procrastinator I thought of what you wrote a bit and it makes sense. Thank you – Andrew Apr 21 '12 at 15:09
  • If variances were subtracted, then (for uncorrelated random variables) the variance of $X-Y$ would be negative whenever $\sigma_Y^2$ exceeded $\sigma_X^2$. The only time I have seen variances subtract is in the identity $$\operatorname{cov}(X+Y,X-Y) = \operatorname{var}(X) - \operatorname{var}(Y)$$ which applies to all random variables with finite variances, whether correlated or uncorrelated, dependent or independent, normal or abnormal etc. (see the simulation sketch after these comments). – Dilip Sarwate Apr 23 '12 at 15:46
  • @Andrew, this recent question answers this question in greater generality: http://stats.stackexchange.com/questions/31177/does-the-variance-of-a-sum-equal-the-sum-of-the-variances/31181#31181 – Macro Jun 29 '12 at 00:34
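A minimal simulation sketch of the identity from Dilip Sarwate's comment (Python/NumPy; the particular correlated construction of $X$ and $Y$ is an illustrative assumption, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Deliberately correlated X and Y: Y = 0.5*X + independent noise
X = rng.normal(0, 2, size=n)             # var(X) = 4
Y = 0.5 * X + rng.normal(0, 1, size=n)   # var(Y) = 0.25*4 + 1 = 2

print(np.cov(X + Y, X - Y)[0, 1])  # ~ 2 = var(X) - var(Y), despite the correlation
print(X.var() - Y.var())           # the same quantity, estimated directly
```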

2 Answers


If $X$ and $Y$ are independent random variables, then so are $X$ and $Z$, where $Z = -Y$. Now, $$\text{var}(Z) = \text{var}(-Y) = (-1)^2\text{var}(Y) = \text{var}(Y)$$ and so

$$\text{var}(X-Y) = \text{var}(X + (-Y)) = \text{var}(X+Z) = \text{var}(X) + \text{var}(Z) = \text{var}(X) + \text{var}(Y)$$ with nary an explicit mention of the word covariance.
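A quick numerical check of this argument (a minimal Python/NumPy sketch; the specific normal distributions are illustrative assumptions, not part of the answer):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

X = rng.normal(0, 1, size=n)  # var(X) = 1
Y = rng.normal(0, 2, size=n)  # var(Y) = 4, independent of X

Z = -Y
print(Z.var())        # ~ 4.0: var(-Y) = (-1)^2 var(Y) = var(Y)
print((X + Z).var())  # ~ 5.0: var(X - Y) = var(X) + var(Y)
```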

Dilip Sarwate
  • +1 since this only requires the fact that the OP already mentioned in the question: ${\rm var}(X+Y)={\rm var}(X)+{\rm var}(Y)$ when $X,Y$ are independent. – Macro Jun 29 '12 at 03:00

Let $X,Y$ be random variables with variances $\sigma^{2}_{x}$ and $\sigma^{2}_{y}$, respectively. It is a fact that ${\rm var}(Z) = {\rm cov}(Z,Z)$ for any random variable $Z$. This can be checked using the definitions of covariance and variance. So, the variance of $X-Y$ is

$$ {\rm cov}(X-Y,X-Y) = {\rm cov}(X,X)+{\rm cov}(Y,Y)-2\cdot{\rm cov}(X,Y) $$

which follows from bilinearity of covariance. Therefore,

$$ {\rm var}(X-Y) = \sigma^{2}_{x} + \sigma^{2}_{y} - 2\cdot{\rm cov}(X,Y) $$

When $X$ and $Y$ are independent, the covariance is $0$, so this simplifies to $\sigma^{2}_{x} + \sigma^{2}_{y}$. So, the variance of the difference of two independent random variables is the sum of the variances.
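A simulation sketch of the general identity, including the covariance term (Python/NumPy; the correlated construction of $X$ and $Y$ is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

X = rng.normal(0, 1, size=n)
Y = 0.8 * X + rng.normal(0, 1, size=n)  # correlated with X

lhs = (X - Y).var()
rhs = X.var() + Y.var() - 2 * np.cov(X, Y)[0, 1]
print(lhs, rhs)  # agree: var(X-Y) = var(X) + var(Y) - 2 cov(X,Y)
```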

Macro