
When estimating a linear model $$ Y_i = X_i\beta + \varepsilon_i, \qquad 1\leq i\leq n,$$ we have $\hat{\beta}$, the least squares estimate of the slope, and the variance estimate $S^2 = \frac{1}{n-2}\sum_{i=1}^n r_i^2$, where the $r_i$ are the residuals from the least squares fit.
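For concreteness, here is a minimal sketch of the two statistics in Python/numpy (the data-generating values, and the inclusion of an intercept so that the divisor $n-2$ corresponds to two estimated parameters, are my own assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: n points from a line with i.i.d. normal errors
n = 50
x = rng.uniform(0.0, 10.0, size=n)
y = 1.0 + 2.5 * x + rng.normal(0.0, 1.0, size=n)

# Least squares fit with an intercept (two parameters, hence n - 2)
X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Residuals and the variance estimate S^2 from the question
r = y - X @ beta_hat
S2 = r @ r / (n - 2)
print(beta_hat, S2)
```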

When the errors are independent and $\varepsilon_i \sim \mathcal{N}\left(0,\sigma^2\right)$, it is possible to prove that $\hat{\beta}$ and $S^2$ are independent statistics.
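(For reference, a sketch of the standard argument in matrix notation, where $P = X(X^\top X)^{-1}X^\top$ denotes the hat matrix: $\hat{\beta} = (X^\top X)^{-1}X^\top Y$ is a function of $PY$, the residual vector is $r = (I-P)Y$, and $$\operatorname{Cov}\left(PY,\,(I-P)Y\right) = \sigma^2 P(I-P) = 0,$$ so under normality $PY$ and $(I-P)Y$ are jointly Gaussian and uncorrelated, hence independent.)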

If we drop normality and only assume independent and identically distributed errors with zero mean and variance $\sigma^2$, does the independence of these statistics still hold?

If it doesn't, is there a characterization of the error distributions for which the two statistics are independent?

Manuel

1 Answer


Take $X_i = 1$ for all $i$; then $\hat \beta$ is just the sample mean $\bar Y$, and your question boils down to whether the sample mean and the sample variance of an i.i.d. sample are independent. The answer is that this independence occurs only when the error is normal; indeed, it characterizes the normal distribution (Lukacs' theorem).
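A quick Monte Carlo sketch (Python/numpy; the sample size, replication count, and the centered exponential as an example of a skewed error law are my own choices) illustrates this: the correlation between $\bar Y$ and $S^2$ is essentially zero for normal errors but clearly positive for skewed ones, in line with the identity $\operatorname{Cov}(\bar Y, S^2) = \mu_3/n$.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_var_corr(sampler, n=10, reps=200_000):
    """Correlation between sample mean and sample variance across
    many replications of an i.i.d. sample of size n."""
    y = sampler((reps, n))
    ybar = y.mean(axis=1)
    s2 = y.var(axis=1, ddof=1)
    return np.corrcoef(ybar, s2)[0, 1]

# Normal errors: mean and variance are independent, so corr ~ 0
print(mean_var_corr(lambda size: rng.normal(0.0, 1.0, size)))

# Centered exponential errors (skewed, mu_3 = 2): corr clearly > 0
print(mean_var_corr(lambda size: rng.exponential(1.0, size) - 1.0))
```

Zero correlation is of course only necessary for independence, not sufficient, so the simulation merely illustrates the direction of the result rather than proving it.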

Elvis