In the case of simple linear regression, I understand the derivation of the variances of the parameter estimates:
$$ \operatorname{Var}(\widehat{\beta}_0) = s^2 \bigg(\frac{1}{n} + \frac{\bar{x}^2}{S_{xx}} \bigg) \hspace{12pt} \text{and} \hspace{12pt} \operatorname{Var}(\widehat{\beta}_1) = \frac{s^2}{S_{xx}}, $$
where $S_{xx} = \sum_{i=1}^n (x_i - \bar{x})^2$ and $s^2$ is the usual estimate of the error variance.
However, I don't follow the steps that lead us to conclude that
$$ \operatorname{Var}(\widehat{\beta}_0 + \widehat{\beta}_1x_*) = s^2 \bigg(\frac{1}{n} + \frac{(x_* - \bar{x})^2}{S_{xx}} \bigg)$$
I can see that this resembles a weighted sum of the variances of the parameter estimates, but I don't see where the $(x_* - \bar{x})^2$ term comes from: expanding $\operatorname{Var}(\widehat{\beta}_0) + x_*^2 \operatorname{Var}(\widehat{\beta}_1)$ gives $s^2 \big( \frac{1}{n} + \frac{\bar{x}^2 + x_*^2}{S_{xx}} \big)$, which is missing the $-2 x_* \bar{x}$ cross term.
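For what it's worth, the identity does check out numerically, so I assume the gap is in my algebra. Here is a minimal sketch of the check (assuming standard `numpy`; the data and variable names are my own invention):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.uniform(0, 10, size=n)
y = 3.0 + 2.0 * x + rng.normal(0, 1.5, size=n)

X = np.column_stack([np.ones(n), x])           # design matrix [1, x]
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat
s2 = resid @ resid / (n - 2)                   # residual variance estimate

cov_beta = s2 * np.linalg.inv(X.T @ X)         # covariance matrix of (b0_hat, b1_hat)

x_star = 7.3
a = np.array([1.0, x_star])
var_fit = a @ cov_beta @ a                     # Var(b0_hat + b1_hat * x_star)

Sxx = np.sum((x - x.mean()) ** 2)
closed_form = s2 * (1 / n + (x_star - x.mean()) ** 2 / Sxx)

print(var_fit, closed_form)                    # these agree to machine precision
```

The diagonal of `cov_beta` reproduces the two variance formulas above, and the agreement only holds when the off-diagonal entry is included in the quadratic form, so I suspect $\operatorname{Cov}(\widehat{\beta}_0, \widehat{\beta}_1)$ supplies the missing cross term. I just can't see the algebra that takes me from there to $(x_* - \bar{x})^2$.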