The covariance result you are looking at arises under the standard linear regression model with ordinary least squares (OLS) estimation. Written as a random variable (i.e., as a function of the random response vector $\boldsymbol{Y}$, with the design matrix $\boldsymbol{x}$ treated as fixed), the OLS estimator is:
$$\begin{aligned}
\hat{\boldsymbol{\beta}}
&= (\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1} (\boldsymbol{x}^{\text{T}} \boldsymbol{Y}) \\[6pt]
&= (\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1} \boldsymbol{x}^{\text{T}} (\boldsymbol{x} \boldsymbol{\beta} + \boldsymbol{\varepsilon}) \\[6pt]
&= \boldsymbol{\beta} + (\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1} \boldsymbol{x}^{\text{T}} \boldsymbol{\varepsilon}.
\end{aligned}$$
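To see this decomposition concretely, here is a minimal numerical sketch (the sample size, design, coefficients, and error scale below are all arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) setup: n observations, an intercept column
# plus two standard-normal regressor columns.
n = 100
x = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta = np.array([1.0, 2.0, -0.5])      # "true" coefficients, chosen arbitrarily
eps = rng.normal(scale=1.5, size=n)    # errors with sigma = 1.5
Y = x @ beta + eps

# OLS via the normal equations: beta_hat = (x'x)^{-1} x'Y
xtx_inv = np.linalg.inv(x.T @ x)
beta_hat = xtx_inv @ (x.T @ Y)

# Check the decomposition beta_hat = beta + (x'x)^{-1} x' eps
print(np.allclose(beta_hat, beta + xtx_inv @ (x.T @ eps)))   # True
```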
In the standard linear regression model we have $\mathbb{E}(\boldsymbol{\varepsilon}) = \boldsymbol{0}$ and $\mathbb{V}(\boldsymbol{\varepsilon}) = \sigma^2 \boldsymbol{I}$, so the estimator is unbiased. Since adding the constant vector $\boldsymbol{\beta}$ does not change the variance, the covariance matrix of the estimator is:
$$\begin{aligned}
\mathbb{V}(\hat{\boldsymbol{\beta}})
&= \mathbb{V}((\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1} \boldsymbol{x}^{\text{T}} \boldsymbol{\varepsilon}) \\[6pt]
&= ((\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1} \boldsymbol{x}^{\text{T}} ) \mathbb{V}(\boldsymbol{\varepsilon}) ((\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1} \boldsymbol{x}^{\text{T}} )^{\text{T}} \\[6pt]
&= \sigma^2 ((\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1} \boldsymbol{x}^{\text{T}} ) \boldsymbol{I} ((\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1} \boldsymbol{x}^{\text{T}} )^{\text{T}} \\[6pt]
&= \sigma^2 ((\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1} \boldsymbol{x}^{\text{T}} ) ((\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1} \boldsymbol{x}^{\text{T}} )^{\text{T}} \\[6pt]
&= \sigma^2 (\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1} (\boldsymbol{x}^{\text{T}} \boldsymbol{x}) (\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1} \\[6pt]
&= \sigma^2 (\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1}.
\end{aligned}$$
Note that this is the conditional covariance matrix of the estimator given the design matrix $\boldsymbol{x}$, which is why $\boldsymbol{x}$ is written in lower case throughout while the response vector $\boldsymbol{Y}$ appears as a random variable.
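The covariance formula can also be checked by simulation: holding $\boldsymbol{x}$ fixed across replications and redrawing only the errors targets exactly this conditional covariance. A minimal sketch, again with arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed (hypothetical) design, held constant across replications, so the
# simulated covariance below is the *conditional* covariance given x.
n, sigma = 50, 2.0
x = np.column_stack([np.ones(n), rng.normal(size=n)])
beta = np.array([0.5, 1.0])
xtx_inv = np.linalg.inv(x.T @ x)

# Redraw only the errors. Using beta_hat = beta + (x'x)^{-1} x' eps and
# the symmetry of (x'x)^{-1}, each row of `estimates` is one replication.
reps = 100_000
eps = rng.normal(scale=sigma, size=(reps, n))
estimates = beta + eps @ x @ xtx_inv

print(np.cov(estimates, rowvar=False))  # simulated covariance of beta_hat
print(sigma**2 * xtx_inv)               # theoretical sigma^2 (x'x)^{-1}
```

With enough replications the simulated matrix matches $\sigma^2 (\boldsymbol{x}^{\text{T}} \boldsymbol{x})^{-1}$ to within Monte Carlo error.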