In the linear model $\mathbf{Y} = \mathbf{X}\beta + \epsilon$, where $\epsilon \sim N(0, \sigma^2 \mathbb{I})$, it is known that the covariance matrix of the estimator $\hat{\beta}$ is given by
$$Var(\hat{\beta})=\sigma^2(\mathbf{X}^T\mathbf{X})^{-1}. $$
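To convince myself of this formula, I ran a small Monte Carlo check in NumPy (the design matrix, coefficients, and noise level below are all made up); the empirical covariance of $\hat{\beta}$ across repeated noise draws does come out close to $\sigma^2(\mathbf{X}^T\mathbf{X})^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical design: intercept column plus one covariate
n = 50
x = rng.uniform(0, 10, size=n)
X = np.column_stack([np.ones(n), x])
beta = np.array([2.0, 0.5])   # made-up true coefficients
sigma = 1.5                   # made-up noise standard deviation

# theoretical covariance of beta-hat
XtX_inv = np.linalg.inv(X.T @ X)
cov_theory = sigma**2 * XtX_inv

# Monte Carlo: refit OLS on many fresh noise draws, same X each time
betas = []
for _ in range(20000):
    y = X @ beta + rng.normal(0, sigma, size=n)
    betas.append(np.linalg.lstsq(X, y, rcond=None)[0])
cov_mc = np.cov(np.array(betas), rowvar=False)

print(cov_theory)
print(cov_mc)  # close to cov_theory, up to Monte Carlo error
```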
In another post (How are the standard errors of coefficients calculated in a regression?), the design matrix and coefficient vector were written as
$$ \mathbf{X}= \begin{pmatrix}1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \end{pmatrix} \qquad \text{and} \qquad \beta = \begin{pmatrix} a \\ b \\ \vdots \end{pmatrix}$$
Question: Is this form of $\mathbf{X}$ just a simplifying assumption? If I multiply out $\mathbf{X}\beta$ and add the error term, I get $y_i = a + bx_i + \epsilon_i$, which is exactly the simple linear regression model.
Question: Given the result for $Var(\hat{\beta})$ above, how can I obtain the variances of the intercept and slope estimates, $Var(\hat{\beta}_0)$ and $Var(\hat{\beta}_1)$? (Assuming $Y=\beta_0+\beta_1 X+\epsilon$.)
Edit: In the post I linked, I was able to follow the derivation. I just didn't understand this step:
$$\sqrt{\widehat{\textrm{Var}}(\hat{b})} = \sqrt{[\hat{\sigma}^2 (\mathbf{X}^{\prime} \mathbf{X})^{-1}]_{22}} = \sqrt{\frac{n \hat{\sigma}^2}{n\sum x_i^2 - (\sum x_i)^2}}.$$
What is $[\cdots]_{22}$ inside the square root?
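For what it's worth, I did verify numerically (again with made-up data) that the two expressions under the square root agree, so they must be the same quantity; I just don't follow the notation:

```python
import numpy as np

rng = np.random.default_rng(1)

# made-up data for a simple linear regression
n = 30
x = rng.normal(5, 2, size=n)
X = np.column_stack([np.ones(n), x])
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=n)

# OLS fit and residual variance estimate (n - 2 degrees of freedom)
b = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ b
sigma2_hat = resid @ resid / (n - 2)

# matrix form: sigma^2_hat * (X'X)^{-1}, row 2 / column 2 (0-indexed [1, 1])
cov_hat = sigma2_hat * np.linalg.inv(X.T @ X)
se_matrix = np.sqrt(cov_hat[1, 1])

# scalar form from the quoted equation
se_scalar = np.sqrt(n * sigma2_hat / (n * np.sum(x**2) - np.sum(x)**2))

print(se_matrix, se_scalar)  # the two values agree
```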
Your insights would be great.