
I know that the variance of the difference of two independent random variables is the sum of their variances, and I can prove it. I want to know where the covariance goes in the dependent case.

Ricky

2 Answers


When $X$ and $Y$ are dependent variables with covariance $\mathrm{Cov}[X,Y] = \mathrm{E}[(X-\mathrm{E}[X])(Y-\mathrm{E}[Y])]$, the variance of their difference is given by $$ \mathrm{Var}[X-Y] = \mathrm{Var}[X] + \mathrm{Var}[Y] - 2\,\mathrm{Cov}[X,Y]. $$ This is mentioned among the basic properties of variance on http://en.wikipedia.org/wiki/Variance. If $X$ and $Y$ happen to be uncorrelated (which is a fortiori the case when they are independent), then their covariance is zero and we have $$ \mathrm{Var}[X-Y] = \mathrm{Var}[X] + \mathrm{Var}[Y]. $$
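As a quick numerical sanity check (not part of the original answer), here is a minimal NumPy sketch that verifies the identity on simulated data; the dependent construction $Y = 0.5X + \varepsilon$ is just an illustrative choice to make the covariance nonzero.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 100_000

# Build two *dependent* samples: y is partly driven by x, so Cov[X, Y] != 0.
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)

# Sample covariance matrix: c[0, 0] = Var[X], c[1, 1] = Var[Y], c[0, 1] = Cov[X, Y].
c = np.cov(x, y)

lhs = np.var(x - y, ddof=1)            # Var[X - Y], same ddof as np.cov
rhs = c[0, 0] + c[1, 1] - 2 * c[0, 1]  # Var[X] + Var[Y] - 2 Cov[X, Y]

print(lhs, rhs)  # the two values agree up to floating-point rounding
```

Because the analogous identity holds exactly for sample variances and covariances (with matching `ddof`), the two printed values match to machine precision, not just approximately.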

StijnDeVuyst

Let $X$ and $Y$ be two random variables. We want to show that $\mathrm{Var}[X-Y]=\mathrm{Var}[X]+\mathrm{Var}[Y]-2\,\mathrm{Cov}[X,Y]$.

Let's define $Z := -Y$. Then, by the formula for the variance of a sum, $\mathrm{Var}[X-Y]=\mathrm{Var}[X+Z]=\mathrm{Var}[X]+\mathrm{Var}[Z]+2\,\mathrm{Cov}[X,Z]$.
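For completeness, that sum formula follows from the definition of variance by expanding the square and using linearity of expectation: $$ \mathrm{Var}[X+Z] = \mathrm{E}\big[(X+Z-\mathrm{E}[X+Z])^2\big] = \mathrm{E}\big[(X-\mathrm{E}[X])^2\big] + 2\,\mathrm{E}\big[(X-\mathrm{E}[X])(Z-\mathrm{E}[Z])\big] + \mathrm{E}\big[(Z-\mathrm{E}[Z])^2\big] = \mathrm{Var}[X] + 2\,\mathrm{Cov}[X,Z] + \mathrm{Var}[Z]. $$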

$\mathrm{Var}[Z] = \mathrm{Var}[-Y] = \mathrm{Var}[Y]$, since $\mathrm{Var}[\alpha Y] = \alpha^2\, \mathrm{Var}[Y]$ for all $\alpha \in \mathbb{R}$.

We also have $\mathrm{Cov}[X,Z] = \mathrm{Cov}[X,-Y] = -\mathrm{Cov}[X,Y]$, because $\mathrm{Cov}[X,\beta Y] = \beta\, \mathrm{Cov}[X,Y]$ for all $\beta \in \mathbb{R}$.

Putting all the pieces together, we have $\mathrm{Var}[X-Y]=\mathrm{Var}[X]+\mathrm{Var}[Y]-2\,\mathrm{Cov}[X,Y]$.