
I know there's a similar post about this, but I believe my question is a bit different.

In my textbook the author rewrites

$-2(\hat{\beta}_1-\beta_1)\sum u_i (x_i-\bar{x})$

into

$-2(\hat{\beta}_1-\beta_1)^2\sum (x_i-\bar{x})^2$

He doesn't use any expectation or variance operator.

If you don't want to go through my whole calculation, you can skip to my final result:

$-2(\hat{\beta}_1-\beta_1)((\hat{\beta}_1-\beta_1)\sum(x_i-\bar{x})^2 +\sum\hat{u}_i(x_i-\bar{x}))$

How can I get rid of the last term?

I started as follows:

use

$u_i=y_i-\beta_0-\beta_1x_i$

and substitute it into the first equation:

$-2(\hat{\beta}_1-\beta_1)\sum (y_i-\beta_0-\beta_1x_i) (x_i-\bar{x})$

use

$y_i=\hat{y}_i+\hat{u}_i$, that is, $y_i=\hat{\beta}_0+ \hat{\beta}_1 x_i+\hat{u}_i$,

and substitute it into the equation above, giving

$-2(\hat{\beta}_1-\beta_1)\sum ((\hat{\beta}_0+ \hat{\beta}_1 x_i+\hat{u}_i)-\beta_0-\beta_1x_i) (x_i-\bar{x})$

Dropping the inner parentheses gives

$-2(\hat{\beta}_1-\beta_1)\sum (\hat{\beta}_0+ \hat{\beta}_1 x_i+\hat{u}_i-\beta_0-\beta_1x_i) (x_i-\bar{x})$

Distributing $(x_i-\bar{x})$ across the terms gives

$-2(\hat{\beta}_1-\beta_1)\sum (\hat{\beta}_0(x_i-\bar{x})+ \hat{\beta}_1 x_i(x_i-\bar{x})+\hat{u}_i(x_i-\bar{x})-\beta_0(x_i-\bar{x})-\beta_1x_i(x_i-\bar{x}))$

Moving the summation inside and applying it to each term gives

$-2(\hat{\beta}_1-\beta_1)(\hat{\beta}_0\sum(x_i-\bar{x})+ \hat{\beta}_1 \sum x_i(x_i-\bar{x})+\sum\hat{u}_i(x_i-\bar{x})-\beta_0\sum(x_i-\bar{x})-\beta_1\sum x_i(x_i-\bar{x}))$

Since $\sum(x_i-\bar{x})=\sum x_i - n\bar{x}=0$, the $\hat{\beta}_0$ and $\beta_0$ terms drop out:

$-2(\hat{\beta}_1-\beta_1)(\hat{\beta}_1 \sum x_i(x_i-\bar{x})+\sum\hat{u}_i(x_i-\bar{x})-\beta_1\sum x_i(x_i-\bar{x}))$

use $\sum x_i(x_i-\bar{x}) = \sum(x_i-\bar{x})^2$, which holds because $\bar{x}\sum(x_i-\bar{x})=0$:

$-2(\hat{\beta}_1-\beta_1)(\hat{\beta}_1 \sum(x_i-\bar{x})^2 +\sum\hat{u}_i(x_i-\bar{x})-\beta_1\sum(x_i-\bar{x})^2)$

Factoring out $\sum(x_i-\bar{x})^2$ gives

$-2(\hat{\beta}_1-\beta_1)((\hat{\beta}_1-\beta_1)\sum(x_i-\bar{x})^2 +\sum\hat{u}_i(x_i-\bar{x}))$

How do I now get rid of the last term, $\sum\hat{u}_i(x_i-\bar{x})$?
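A quick numerical check (a minimal numpy sketch with simulated data; the setup is just for illustration) suggests this term is exactly zero for the fitted coefficients, so it should be possible to drop it algebraically:

```python
# Fit simple OLS by the textbook formulas on simulated data and check
# that the leftover term sum(u_hat_i * (x_i - x_bar)) vanishes.
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)   # true beta_0 = 1, beta_1 = 2

x_bar, y_bar = x.mean(), y.mean()
beta1_hat = np.sum((y - y_bar) * (x - x_bar)) / np.sum((x - x_bar) ** 2)
beta0_hat = y_bar - beta1_hat * x_bar
u_hat = y - beta0_hat - beta1_hat * x    # fitted residuals

print(np.sum(u_hat * (x - x_bar)))       # ~1e-13: zero up to rounding
```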

– S. Ming
  • Possible duplicate of [Minimum variance linear unbiased estimator of $\beta_1$](http://stats.stackexchange.com/questions/231356/minimum-variance-linear-unbiased-estimator-of-beta-1) – Xi'an Sep 21 '16 at 19:10

2 Answers


We want to show that $\sum_{i=1}^{n} (y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i) (x_i - \bar{x}) = 0$, which is the same as $\sum_{i=1}^{n} (\hat{\beta}_0 + \hat{\beta}_1 x_i ) (x_i - \bar{x}) = \sum_{i=1}^{n} y_i (x_i - \bar{x})$. This roughly says that the weighted average of fitted $y$ values equals the weighted average of actual $y$ values, using weights $x_i - \bar{x}$. For this we just need to do some algebra and remember the definitions $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$ and $\hat{\beta}_1 = \sum_{i=1}^{n} (y_i - \bar{y}) (x_i - \bar{x}) / \sum_{i=1}^{n} (x_i - \bar{x})^2$. Let's start with the left-hand side:

\begin{align} \sum_{i=1}^{n} (\hat{\beta}_0 + \hat{\beta}_1 x_i ) (x_i - \bar{x}) &= \sum_{i=1}^{n} (\bar{y} - \hat{\beta}_1 \bar{x} + \hat{\beta}_1 x_i ) (x_i - \bar{x}) \\ &= \bar{y} \sum_{i=1}^{n} (x_i - \bar{x}) + \hat{\beta}_1 \sum_{i=1}^{n} (x_i - \bar{x})^2 . \end{align}

We know the first term is zero, and the sum of squares $\sum_{i=1}^{n} (x_i - \bar{x})^2$ cancels with the denominator of $\hat{\beta}_1$, leaving us with its numerator $\sum_{i=1}^{n} (y_i - \bar{y}) (x_i - \bar{x}) = \sum_{i=1}^{n} y_i (x_i - \bar{x})$ (the equality again uses $\sum_{i=1}^{n} (x_i - \bar{x}) = 0$), which is what we wanted to show.
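Substituting $\sum_{i=1}^{n} \hat{u}_i (x_i - \bar{x}) = 0$ back into the final expression in the question then recovers the textbook line:

$-2(\hat{\beta}_1-\beta_1)\left((\hat{\beta}_1-\beta_1)\sum_{i=1}^{n}(x_i-\bar{x})^2 + 0\right) = -2(\hat{\beta}_1-\beta_1)^2\sum_{i=1}^{n}(x_i-\bar{x})^2 .$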

– dsaxton

I believe I found another way to show that the last term equals zero.

Distribute $\hat{u}_i$ over $(x_i-\bar{x})$ in the last term:

$\sum (\hat{u}_ix_i-\hat{u}_i\bar{x})$

The first term, $\sum \hat{u}_i x_i$, equals zero by the first-order condition for $\hat{\beta}_1$ when deriving the OLS estimates; equivalently, the sample covariance of $\hat{u}_i$ and $x_i$ is zero.

Pulling the constant $\bar{x}$ out of the sum, the second term becomes

$-\bar{x}\sum \hat{u}_i$

By the first-order condition for $\hat{\beta}_0$, the sum of the residuals equals zero, so this term vanishes as well.
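For reference, both first-order conditions come from minimizing the sum of squared residuals $\sum_{i=1}^{n}(y_i-b_0-b_1x_i)^2$; setting its partial derivatives with respect to $b_0$ and $b_1$ to zero at $(\hat{\beta}_0,\hat{\beta}_1)$ gives (a standard derivation, sketched here for completeness):

\begin{align} \sum_{i=1}^{n}\left(y_i-\hat{\beta}_0-\hat{\beta}_1x_i\right) &= \sum_{i=1}^{n}\hat{u}_i = 0 , \\ \sum_{i=1}^{n}\left(y_i-\hat{\beta}_0-\hat{\beta}_1x_i\right)x_i &= \sum_{i=1}^{n}\hat{u}_i x_i = 0 . \end{align}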

– S. Ming
  • This seems fairly circular to me. If we're allowing ourselves to simply assume $x_i$ and $u_i$ are uncorrelated then there's nothing to prove. The whole point is to show that they're uncorrelated. – dsaxton Sep 24 '16 at 21:41