
Consider a linear model

$y_i= \alpha + \beta x_i + \epsilon_i$

and estimates of the intercept and slope, $\hat{\alpha}$ and $\hat{\beta}$, obtained by ordinary least squares. A mathematical statistics reference states that $\hat{\alpha}$ and $\hat{\beta}$ are independent (in the proof of one of its theorems).

I'm not sure I understand why. Since

$\hat{\alpha}=\bar{y}-\hat{\beta} \bar{x}$

Doesn't this mean $\hat{\alpha}$ and $\hat{\beta}$ are correlated? I'm probably missing something really obvious here.

WetlabStudent

1 Answer


Go to the same site on the following sub-page:

https://onlinecourses.science.psu.edu/stat414/node/278

There you will see more clearly that they specify the simple linear regression model with the regressor centered on its sample mean. This explains why they subsequently say that $\hat \alpha$ and $\hat \beta$ are independent.

For the case when the coefficients are estimated with a regressor that is not centered, their covariance is

$$\text{Cov}(\hat \alpha,\hat \beta) = -\sigma^2\,\frac{\bar x}{S_{xx}}, \qquad S_{xx} = \sum_i (x_i-\bar x)^2 = \sum_i x_i^2 - n\bar x^2.$$
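For completeness (this step is not spelled out in the linked notes), the covariance follows from the standard closed forms, treating the $x_i$ as fixed and the errors as i.i.d. with variance $\sigma^2$:

$$\hat\beta = \frac{\sum_i (x_i-\bar x)(y_i-\bar y)}{S_{xx}} = \frac{\sum_i (x_i-\bar x)\,y_i}{S_{xx}}, \qquad \hat\alpha = \bar y - \hat\beta\,\bar x,$$

so that

$$\text{Cov}(\hat\alpha,\hat\beta) = \text{Cov}(\bar y,\hat\beta) - \bar x\,\text{Var}(\hat\beta) = \frac{\sigma^2\sum_i (x_i-\bar x)}{n\,S_{xx}} - \bar x\,\frac{\sigma^2}{S_{xx}} = 0 - \frac{\sigma^2\bar x}{S_{xx}}.$$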

So if we use a regressor centered on $\bar x$, call it $\tilde x = x - \bar x$, the above covariance expression involves the sample mean of the centered regressor, $\bar{\tilde x}$, which is zero; the covariance is therefore zero as well, and (together with the joint normality of the estimators when the errors are normal) the coefficient estimators are independent.
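Not part of the original answer, but here is a quick Monte Carlo sketch in Python/NumPy that illustrates both cases; the sample size, parameter values, and function names below are arbitrary choices of mine:

```python
# Simulation sketch: empirical Cov(alpha_hat, beta_hat) for a raw vs. centered regressor.
import numpy as np

rng = np.random.default_rng(0)
n, sigma, alpha, beta = 50, 2.0, 1.0, 3.0
x_raw = rng.uniform(0.0, 10.0, size=n)   # fixed, non-centered regressor
x_cen = x_raw - x_raw.mean()             # same regressor, centered on its mean

def simulate_ols(x, n_rep=10_000):
    """Simulate n_rep datasets y = alpha + beta*x + eps and return the OLS estimates."""
    a_hat = np.empty(n_rep)
    b_hat = np.empty(n_rep)
    sxx = np.sum((x - x.mean()) ** 2)
    for r in range(n_rep):
        y = alpha + beta * x + rng.normal(0.0, sigma, size=n)
        b = np.sum((x - x.mean()) * (y - y.mean())) / sxx
        a = y.mean() - b * x.mean()
        a_hat[r], b_hat[r] = a, b
    return a_hat, b_hat

for label, x in [("raw x", x_raw), ("centered x", x_cen)]:
    a_hat, b_hat = simulate_ols(x)
    sxx = np.sum((x - x.mean()) ** 2)
    print(f"{label:10s}  theory: {-sigma**2 * x.mean() / sxx: .5f}"
          f"  simulated: {np.cov(a_hat, b_hat)[0, 1]: .5f}")
```

With the raw regressor the simulated covariance should sit close to $-\sigma^2\bar x/S_{xx}$; with the centered regressor it should be essentially zero.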

This post contains more on the OLS algebra of simple linear regression.

Alecos Papadopoulos
  • I would consider using $Cov(\hat \alpha, \hat \beta | X)$ instead of $Cov(\hat \alpha, \hat \beta)$. Otherwise it feels that $\bar x$ and $S_{xx}$ need to be replaced by population counterparts. Or am I wrong? – Richard Hardy Apr 12 '15 at 07:35
  • Why does zero covariance imply independence? Are $\hat\alpha$ and $\hat\beta$ bivariately normally distributed? See http://probability.ca/jeff/teaching/uncornor.html. – Adrian Keister Sep 09 '21 at 18:33