
I am wondering why the residuals are not independent of one another in a linear regression.

kjetil b halvorsen
lll
  • I don't have a proper answer, but I hope you know that your username links directly to your facebook profile, and anyone can just copy paste that number to get to your timeline. I would change that if I were you, unless you don't care. – Chris C Nov 12 '15 at 03:37
    The error term in the model does specify that the errors must be iid. However, the actual residuals from a data set with a fitted equation are not fully independent as their mean must always be zero. – J Taylor Nov 12 '15 at 04:06
  • I don't understand your question. Could elaborate with some more context? An example maybe and where you heard this from? – Zachary Blumenfeld Nov 12 '15 at 05:08
  • This is a true-or-false question we were given; we only have the answer, not an explanation, and I am curious why. – lll Nov 12 '15 at 05:26

1 Answer


First consider simple linear regression: $$ y_i = \mu + \beta x_i + \epsilon_i $$ for $i=1, \dotsc, n$, with the usual assumptions, in particular that the error terms $\epsilon_i$ are iid. When the parameters are estimated by ordinary least squares, it is well known that the residuals $e_i = y_i - \hat{y}_i$ satisfy $\sum_{i=1}^n e_i = 0$; see Why do residuals in linear regression always sum to zero when an intercept is included?. This linear restriction on the residuals shows that they cannot be independent: knowing any $n-1$ of them determines the last one.
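As a quick numerical check, here is a minimal sketch in NumPy (the data and coefficient values are made up purely for illustration): fitting by least squares with an intercept column makes the residuals sum to zero up to floating-point error.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)  # illustrative mu=1, beta=2

# Design matrix with an intercept column
X = np.column_stack([np.ones(n), x])

# Ordinary least squares fit
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta_hat

# The residuals sum to (numerically) zero, so they satisfy
# one exact linear constraint and cannot be independent.
print(np.sum(residuals))
```

Because of this constraint, the last residual is a deterministic function of the other $n-1$, which is incompatible with independence.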

The case of multiple linear regression is analogous; see, for instance, Hat Matrix off-diagonals, residual covariance in Least Squares Regression.
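The general mechanism can be seen from the hat matrix $H = X(X^\top X)^{-1}X^\top$: the residual vector is $e = (I - H)y$, so under homoskedastic iid errors $\operatorname{Cov}(e) = \sigma^2(I - H)$, whose off-diagonal entries are generally nonzero. A small sketch (the random design matrix is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 20, 3
# Random design matrix with an intercept column (illustrative)
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])

H = X @ np.linalg.inv(X.T @ X) @ X.T  # hat matrix
M = np.eye(n) - H                     # Cov(residuals) = sigma^2 * M

# M is symmetric and idempotent, and its off-diagonal entries
# are generally nonzero -> the residuals are correlated.
off_diag_max = np.max(np.abs(M - np.diag(np.diag(M))))
print(off_diag_max)
```

The nonzero off-diagonal entries of $I - H$ are exactly the residual correlations the linked question discusses.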

kjetil b halvorsen