
I am trying to prove that $\hat\beta_1$ is equal in the following two equations, given that the vector $x_1$ is the same in both:

$$y = \beta_1 x_1 + \varepsilon \qquad \text{and} \qquad y = \beta_1 x_1 + \beta_2 x_2 + \varepsilon$$

The only two assumptions given for both equations are homoskedasticity ($\operatorname{Var}(\varepsilon) = \sigma^2 I$) and the orthogonality of the vectors $x_1$ and $x_2$, i.e. $x_1^\top x_2 = 0$.
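
For concreteness, here is a small numerical sketch in Python of what I am trying to show (simulated data, no intercept, regressors made orthogonal by construction):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Build two orthogonal n-by-1 regressors: remove from x2_raw its projection
# onto x1, so that x1 @ x2 = 0 (up to floating-point rounding).
x1 = rng.normal(size=n)
x2_raw = rng.normal(size=n)
x2 = x2_raw - (x1 @ x2_raw) / (x1 @ x1) * x1

# Simulate y from the two-regressor model with homoskedastic errors.
beta1, beta2 = 2.0, -1.5
y = beta1 * x1 + beta2 * x2 + rng.normal(scale=1.0, size=n)

# OLS coefficient on x1 in the simple regression of y on x1 alone ...
b1_short = (x1 @ y) / (x1 @ x1)

# ... and in the two-regressor regression of y on (x1, x2).
X = np.column_stack([x1, x2])
b_long = np.linalg.solve(X.T @ X, X.T @ y)

print(b1_short, b_long[0])  # the two estimates agree (up to rounding)
```

In the simulation the two printed estimates coincide; I am trying to show this equality algebraically.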

Any help would be greatly appreciated, thank you

  • If you can lay out what you've done, we can provide directions. Also, what does orthogonality mean here? Is it $E[x_1x_2]=0$, considering $x_1$ and $x_2$ are RVs? – gunes Sep 04 '20 at 12:33
  • Sorry, $x_1$ and $x_2$ are $n \times 1$ vectors. By orthogonality I mean that $x_1^\top x_2 = 0$. – beppi Sep 04 '20 at 12:41
  • This result becomes intuitively obvious and easy to prove once you understand how multiple regression can be carried out as a sequence of univariate regressions. See, among others, https://stats.stackexchange.com/questions/21022, https://stats.stackexchange.com/a/113207/919, and https://stats.stackexchange.com/questions/46185. – whuber Sep 04 '20 at 12:58
  • @whuber thanks for those links. I think I understand it conceptually but was struggling to put it into a mathematical proof. – beppi Sep 04 '20 at 13:03
  • The key search term, then, is "Gram-Schmidt." Or, you could use elementary Euclidean geometry, as illustrated in some of those threads I linked :-). You can also directly analyze the Normal Equations $(X^\prime X)^{-} X^\prime y = \hat\beta$ for the two models; the assumption of orthogonality enables you to simplify $X^\prime X$ in the second (two-variable) model (a sketch of this route follows the comments). – whuber Sep 04 '20 at 13:39
  • I tried simplifying $X^\prime X$, but I don't think it was getting me anywhere. After some of its entries become zero because of orthogonality, what do I do next? – beppi Sep 04 '20 at 14:14
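
A sketch of the normal-equations route suggested in the comments, writing $X = [\,x_1 \ \ x_2\,]$ for the design matrix of the two-variable model: with $x_1^\top x_2 = 0$,

$$X^\top X = \begin{pmatrix} x_1^\top x_1 & x_1^\top x_2 \\ x_2^\top x_1 & x_2^\top x_2 \end{pmatrix} = \begin{pmatrix} x_1^\top x_1 & 0 \\ 0 & x_2^\top x_2 \end{pmatrix},$$

so the normal equations $X^\top X \hat\beta = X^\top y$ decouple and give

$$\hat\beta_1 = \frac{x_1^\top y}{x_1^\top x_1}, \qquad \hat\beta_2 = \frac{x_2^\top y}{x_2^\top x_2}.$$

The first expression is exactly the OLS estimator from the simple regression of $y$ on $x_1$ alone, which is the claimed equality. Homoskedasticity plays no role in this identity of the point estimates; it enters only the usual standard-error formulas.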

0 Answers