
I am trying to prove that, given random sampling, $Cov(u_{i}, u_{j}) = 0$.

Here is my proof: assume $y_{i}, y_{j}$ come from a random sample, where $y_{i} = \alpha + \beta x_{i} + u_{i}$. Also assume $E[u_{i}]=0$ and $E[u_{i}|x_{i}]=0$.

The idea is that, because of random sampling, $y_{i}$ and $y_{j}$ are independent,

$E[y_{i}y_{j}] = E[y_{i}]E[y_{j}]$ so that

$Cov(y_{i},y_{j})= E[y_{i}y_{j}] - E[y_{i}]E[y_{j}] = E[y_{i}]E[y_{j}] - E[y_{i}]E[y_{j}] = 0$

Substituting $y_{i} = \alpha + \beta x_{i} + u_{i}$:

$Cov(\alpha + \beta x_{i} + u_{i},\ \alpha + \beta x_{j} + u_{j}) = 0$ is what has to be proved in order to demonstrate that, under random sampling, $Cov(u_{i}, u_{j}) = 0$.

Using the bilinearity of covariance I can prove that:

$$\begin{aligned}
Cov(\alpha + \beta x_{i} + u_{i},\ \alpha + \beta x_{j} + u_{j}) &= E\big[(\alpha +\beta x_{i} + u_{i} - E[\alpha +\beta x_{i} + u_{i}])(\alpha +\beta x_{j} + u_{j} - E[\alpha +\beta x_{j} + u_{j}])\big] \\
&= E\big[(\alpha +\beta x_{i} + u_{i} - \alpha -\beta E[x_{i}] - E[u_{i}])(\alpha +\beta x_{j} + u_{j} - \alpha -\beta E[x_{j}] - E[u_{j}])\big] \\
&= E\big[(\beta(x_{i} - E[x_{i}])+(u_{i}-E[u_{i}]))(\beta(x_{j} - E[x_{j}])+(u_{j}-E[u_{j}]))\big] \\
&= E\big[\beta^2(x_{i} - E[x_{i}])(x_{j} - E[x_{j}]) + \beta(x_{i} - E[x_{i}])(u_{j} - E[u_{j}]) + \beta(u_{i} - E[u_{i}])(x_{j} - E[x_{j}]) + (u_{i} - E[u_{i}])(u_{j} - E[u_{j}])\big] \\
&= \beta^2 Cov(x_{i}, x_{j}) + \beta E[(x_{i} - E[x_{i}])(u_{j} - E[u_{j}])] + \beta E[(u_{i} - E[u_{i}])(x_{j} - E[x_{j}])] + Cov(u_{i}, u_{j})
\end{aligned}$$

My question is how do we eventually get:

$Cov(\alpha + \beta x_{i} + u_{i},\alpha + \beta x_{j} + u_{j}) = \beta^2 Cov(x_{i}, x_{j}) + Cov(u_{i}, u_{j})$

This final formula is stated in an econometrics course on YouTube, and I was just trying to prove it.

If we assume that $E[u_{i}|x_{i}] = 0$, are we also saying $E[u_{j}|x_{i}] = 0$ or $E[u_{i}|x_{j}] = 0$? I guess not, but then I have no idea why $\beta E[(x_{i} - E[x_{i}])(u_{j} - E[u_{j}])] = 0$ and $\beta E[(u_{i} - E[u_{i}])(x_{j} - E[x_{j}])] = 0$, which is what is needed for $Cov(\alpha + \beta x_{i} + u_{i},\ \alpha + \beta x_{j} + u_{j}) = \beta^2 Cov(x_{i}, x_{j}) + Cov(u_{i}, u_{j})$.

Can anyone help me with it? Much obliged!

ACuriousCat
  • Do you mean: Given $Y_i, Y_j, X_i, X_j$ are independent from each other, and $\alpha$ and $\beta$ are constant, prove that $Cov(u_i, u_j) = 0$? – user158565 Dec 10 '18 at 21:44
  • Yes, that's correct. – ACuriousCat Dec 10 '18 at 21:45
  • $u_i=Y_i - \alpha - \beta X_i$ and $u_j=Y_j - \alpha - \beta X_j$. There is no overlap, so they are independent. – user158565 Dec 10 '18 at 21:48
  • Is there any more formal proof of this? Thanks! – ACuriousCat Dec 10 '18 at 21:51
  • What is your definition of "random sample"? In all those of which I am aware, it *stipulates* that $(x_i,y_i)$ and $(x_j,y_j)$ are independent, and as @user158565 has pointed out, the $u_i$ are functions of the $(x_i,y_i)$ and therefore are also independent: see https://stats.stackexchange.com/questions/94872. – whuber Dec 10 '18 at 22:27

1 Answer


From $y_{i} = \alpha + \beta x_{i} + u_{i}$ we have $u_i = y_i - \alpha -\beta x_i$. Given that $y_i, y_j, x_i, x_j$ are independent of each other, and $\alpha$ and $\beta$ are constants, we have:

$Cov(u_i,u_j) = Cov(y_i -\alpha - \beta x_i,y_j -\alpha - \beta x_j) = Cov(y_i - \beta x_i,y_j - \beta x_j)$,

because adding or subtracting a constant does not change the covariance. Because $\left(\begin{matrix}y_i -\beta x_i\\y_j-\beta x_j\end{matrix}\right) = \left(\begin{matrix} 1&-\beta & 0 &0\\0& 0 &1 &-\beta\end{matrix}\right) \left(\begin{matrix} y_i\\x_i\\y_j\\x_j\end{matrix}\right)$,

$Cov(y_i - \beta x_i,\ y_j - \beta x_j)$ can be obtained from the variance matrix of this linear transform, using $Var(Av) = A\,Var(v)\,A^{\top}$:

$$Var\left(\begin{matrix}y_i -\beta x_i\\y_j-\beta x_j\end{matrix}\right) = \left(\begin{matrix} 1&-\beta & 0 &0\\0& 0 &1 &-\beta\end{matrix}\right)\left(\begin{matrix} V(y_i)&0&0&0\\0 & V(x_i)& 0&0\\ 0& 0& V(y_j)&0\\0 &0&0 & V(x_j)\end{matrix}\right)\left(\begin{matrix} 1 & 0\\-\beta & 0\\ 0 &1\\0 &-\beta\end{matrix}\right)$$ $$=\left(\begin{matrix} V(y_i)+\beta^2V(x_i)& 0\\ 0 & V(y_j)+\beta^2V(x_j)\end{matrix}\right)$$

So their covariance is $0$; and since $u_i$ is a function of $(y_i, x_i)$ only and $u_j$ is a function of $(y_j, x_j)$ only, they are in fact independent.
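To connect this to the cross terms in the question: random sampling makes the draws $(y_i, x_i)$ and $(y_j, x_j)$ independent, so $x_i$ is independent of $u_j = y_j - \alpha - \beta x_j$ (a function of the other draw only), and similarly $u_i$ is independent of $x_j$. A short sketch of the step asked about:

$$\beta E[(x_{i} - E[x_{i}])(u_{j} - E[u_{j}])] = \beta\, Cov(x_{i}, u_{j}) = 0, \qquad \beta E[(u_{i} - E[u_{i}])(x_{j} - E[x_{j}])] = \beta\, Cov(u_{i}, x_{j}) = 0,$$

which leaves $Cov(\alpha + \beta x_{i} + u_{i},\ \alpha + \beta x_{j} + u_{j}) = \beta^2 Cov(x_{i}, x_{j}) + Cov(u_{i}, u_{j})$, and under random sampling $Cov(x_{i}, x_{j}) = 0$ as well.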

user158565
  • Thanks for your reply, but I am wondering why the diagonal matrix indicates independence? Sorry my linear algebra is quite weak... – ACuriousCat Dec 10 '18 at 22:23
  • Oh....!!! They are linearly independent.....got it! Thanks a lot! – ACuriousCat Dec 10 '18 at 22:26
  • Off-diagonal elements are the covariances. So when covariance = 0, they are independent. BTW, diagonal elements are the variances. – user158565 Dec 10 '18 at 22:26
  • Thanks a lot. I definitely need to have a thorough review in linear algebra. Thank you very much again for your time and kindness! – ACuriousCat Dec 10 '18 at 22:27
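
As a quick numerical sanity check, here is a minimal simulation sketch, assuming standard normal draws for $x$ and $u$ and arbitrary values for $\alpha$ and $\beta$ (none of which come from the original thread): it simulates many independent pairs of draws, recovers $u = y - \alpha - \beta x$ at two positions $i$ and $j$, and confirms their sample covariance is near zero.

```python
# Sanity-check sketch: under random sampling, Cov(u_i, u_j) should be ~0.
# alpha, beta, and the standard normal distributions are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 1.0, 2.0
n_reps = 100_000  # number of independent replications of the pair (i, j)

# Draw (x_i, u_i) and (x_j, u_j) independently in every replication.
x_i, x_j = rng.normal(size=n_reps), rng.normal(size=n_reps)
u_i, u_j = rng.normal(size=n_reps), rng.normal(size=n_reps)
y_i = alpha + beta * x_i + u_i
y_j = alpha + beta * x_j + u_j

# Recover the errors from the model equation and estimate Cov(u_i, u_j).
uhat_i = y_i - alpha - beta * x_i
uhat_j = y_j - alpha - beta * x_j
print(np.cov(uhat_i, uhat_j)[0, 1])  # ~0 up to Monte Carlo noise
```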