I'm tasked with proving that $\sum_{i=1}^{n}e_i X_i = 0$ in the context of simple linear regression.
We have that $e_i X_i = (Y_i - \hat{Y}_i)X_i = (Y_i - \hat{\beta}_0 - \hat{\beta}_1X_i)X_i.$ It follows that
$$ \sum e_i X_i = \sum X_iY_i - \hat{\beta}_0 \sum X_i - \hat{\beta}_1 \sum X_i^2. $$ Now use the fact that $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}$. It follows that \begin{align} \sum e_i X_i & = \sum X_iY_i - (\bar{Y} - \hat{\beta}_1 \bar{X}) \sum X_i - \hat{\beta}_1 \sum X_i^2 \\ &= \sum X_iY_i - \bar{Y}\sum X_i + \hat{\beta}_1 \bar{X} \sum X_i - \hat{\beta}_1 \sum X_i^2 \\ &= \sum X_iY_i - \frac{\sum X_i \sum Y_i}{n} + \hat{\beta}_1 \frac{\Bigl(\sum X_i\Bigr)^2}{n} - \hat{\beta}_1 \sum X_i^2. \tag{*} \end{align}
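As a numerical sanity check (my own addition, not part of the proof, using a made-up toy dataset), fitting a small sample by the usual least-squares formulas does give $\sum e_i X_i \approx 0$ up to floating-point rounding:

```python
# Toy data (hypothetical, for illustration only)
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(X)
xbar = sum(X) / n
ybar = sum(Y) / n

# Slope from the normal equations: beta1_hat = Sxy / Sxx
Sxy = sum(x * y for x, y in zip(X, Y)) - sum(X) * sum(Y) / n
Sxx = sum(x * x for x in X) - sum(X) ** 2 / n
beta1 = Sxy / Sxx
# Intercept: beta0_hat = ybar - beta1_hat * xbar
beta0 = ybar - beta1 * xbar

# Residuals e_i = Y_i - beta0_hat - beta1_hat * X_i
residuals = [y - beta0 - beta1 * x for x, y in zip(X, Y)]
print(sum(e * x for e, x in zip(residuals, X)))  # ≈ 0 up to rounding
```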
Now I notice that equation $(*)$ closely resembles the expression for $\hat{\beta}_1$ (a solution to the normal equations), which makes me think I'm on the right track, since the proof that $\sum e_i = 0$ uses the other normal-equation solution. Still, I can't seem to manipulate the expression into the answer.
Thanks for any help.