
Our model is $Y=\beta_0+\beta_1X+U$.

We know that $$\hat{\beta_0} = \beta_0 + \sum\limits_{n=1}^N c_nu_n \quad \text{ and }\quad \hat{\beta_1} = \beta_1 + \sum\limits_{n=1}^N k_nu_n \,,$$ where $$k_n = \frac{(x_n-\bar{X})}{D}\,, \quad D = \sum_{n=1}^N (x_n-\bar{X})^2\,,\quad\text{ and }\quad c_n = \Big[\frac{1}{N}-\bar{X}k_n\Big].$$

We are trying to use the assumptions of no autocorrelation and homoskedasticity to derive the covariance between the OLS slope and intercept estimators, $\mathrm{Cov}[\hat{\beta_0},\hat{\beta_1}]$.

I'm not sure how to approach this, or where these assumptions come into play.

I know how to begin: apply the definition of covariance, then substitute the expressions above to eliminate $\beta_0$ and $\beta_1$. But where do I go from there?
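
To make that concrete, here is the step just described, written out (this assumes $\mathrm{E}[u_n]=0$, which is what makes both estimators unbiased): $$\mathrm{Cov}[\hat{\beta_0},\hat{\beta_1}] = \mathrm{E}\big[(\hat{\beta_0}-\beta_0)(\hat{\beta_1}-\beta_1)\big] = \mathrm{E}\Big[\Big(\sum_{n=1}^N c_nu_n\Big)\Big(\sum_{m=1}^N k_mu_m\Big)\Big] = \sum_{n=1}^N\sum_{m=1}^N c_nk_m\,\mathrm{E}[u_nu_m]\,.$$ It is this double sum that I don't see how to simplify using the two assumptions.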

  • Could you edit your post to emphasize the question? It reads like a statement as it is currently written – Marquis de Carabas May 05 '16 at 04:56
  • Maybe the thread ["Correlation between OLS estimators for intercept and slope"](http://stats.stackexchange.com/questions/171125/correlation-between-ols-estimators-for-intercept-and-slope) will be helpful. – Richard Hardy May 05 '16 at 07:06

0 Answers