I got as far as expanding the square and distributing the summation across the left-hand side, so that I have:
$$ \sum_i y_i^2 - \sum_i 2 y_i \bar{y} + \sum_i \bar{y}^2 $$
Not sure where to go from there.
A very common trick, when the expression on the right involves a term that doesn't appear on the left, is to add and subtract that term (or, depending on context, multiply and divide by it). Here you can try
\begin{align} \sum_{i=1}^{n} (y_i - \bar{y})^2 &= \sum_{i=1}^{n} (y_i - \hat{y}_i + \hat{y}_i - \bar{y})^2 \\ &= \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 + \sum_{i=1}^{n} (\hat{y}_i - \bar{y})^2 + 2 \sum_{i=1}^{n} (y_i - \hat{y}_i) (\hat{y}_i - \bar{y}) . \end{align}
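Not part of the proof, but as a numerical sanity check of the decomposition you can fit a simple linear regression to simulated data and compare the three sums of squares (a sketch using NumPy's `polyfit`; the data-generating values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 3.0 * x + rng.normal(size=50)

# Least-squares fit; polyfit returns coefficients highest degree first.
b1, b0 = np.polyfit(x, y, deg=1)
y_hat = b0 + b1 * x
y_bar = y.mean()

sst = np.sum((y - y_bar) ** 2)      # total sum of squares
sse = np.sum((y - y_hat) ** 2)      # residual sum of squares
ssr = np.sum((y_hat - y_bar) ** 2)  # regression sum of squares

print(np.isclose(sst, sse + ssr))   # cross term vanishes, prints True
```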
Now we just need to show that the rightmost term equals zero. Expanding it as $2 \sum_{i=1}^{n} (y_i - \hat{y}_i) \hat{y}_i - 2 \bar{y} \sum_{i=1}^{n} (y_i - \hat{y}_i)$, you should be able to convince yourself that $\sum_{i=1}^{n} (y_i - \hat{y}_i) = 0$ by plugging in the formula for $\hat{y}_i$, so we only need to prove that $\sum_{i=1}^{n} (y_i - \hat{y}_i) \hat{y}_i = 0$:
\begin{align} \sum_{i=1}^{n} (y_i - \hat{y}_i) \hat{y}_i &= \sum_{i=1}^{n} (y_i - \hat{y}_i) (\bar{y} - \hat{\beta}_1 \bar{x} + \hat{\beta}_1 x_i) \\ &= \bar{y} \sum_{i=1}^{n} (y_i - \hat{y}_i) + \hat{\beta}_1 \sum_{i=1}^{n} (y_i - \hat{y}_i) (x_i - \bar{x}) \\ &= \hat{\beta}_1 \sum_{i=1}^{n} (y_i - \hat{y}_i) (x_i - \bar{x}) , \end{align}
where we substituted $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$ and again used $\sum_{i=1}^{n} (y_i - \hat{y}_i) = 0$,
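Both orthogonality facts used here (residuals sum to zero, and are orthogonal to the centered predictor) can likewise be checked numerically — a self-contained sketch on simulated data, not a substitute for the algebra:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=40)
y = 1.0 - 2.0 * x + rng.normal(size=40)

# Least-squares fit and residuals.
b1, b0 = np.polyfit(x, y, deg=1)
resid = y - (b0 + b1 * x)

# Residuals sum to zero...
print(np.isclose(resid.sum(), 0.0))
# ...and are orthogonal to (x_i - x_bar), so the cross term is zero.
print(np.isclose(np.sum(resid * (x - x.mean())), 0.0))
```

Both checks hold up to floating-point error because fitting with an intercept forces these two normal equations to be satisfied exactly.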
and the remaining steps can be found in my answer to this question: Proof that $\hat{\sigma}^2$ is an unbiased estimator of $\sigma^2$ in simple linear regression.