Assume that we want to derive the standard error of $\beta_2$ in: $$y=\beta_1 + \beta_2x_2 +u$$
where $u$ is i.i.d. with mean $0$ and variance $\sigma^2$. The way I know how to do this is to write the model in matrix form, $$y=X\beta +u,$$
then derive the variance of the OLS estimator of $\beta$ and take the lower-right entry of that matrix.
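For concreteness, this is the matrix route I have in mind (standard OLS results, writing $x_{2i}$ for the $i$-th observation of $x_2$ and letting $X$ have a column of ones and a column of the $x_{2i}$): $$\hat\beta=(X'X)^{-1}X'y,\qquad \operatorname{Var}(\hat\beta\mid X)=\sigma^2(X'X)^{-1},$$ and inverting the $2\times 2$ matrix $X'X$ and taking the lower-right entry gives $$\operatorname{Var}(\hat\beta_2)=\frac{\sigma^2}{\sum_{i=1}^n (x_{2i}-\bar x_2)^2}.$$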
But surely there is a way to obtain this directly, without converting to matrix form. What is the way to do this with as few algebraic steps as possible?