You can find the general formula for the variance of the regression coefficients in many questions on this site (e.g., here).
To facilitate our analysis, let $\sigma^2 = \mathbb{V}(\epsilon_i)$ denote the error variance. In a model with an intercept term (i.e., allowing free $\beta_1$) you have the variance:
$$\mathbb{V}(\hat{\beta}_2^U) = \frac{\sigma^2}{\sum x_i^2 - n \bar{x}^2}.$$
In a model without an intercept term (i.e., setting $\beta_1=0$) you have the variance:
$$\mathbb{V}(\hat{\beta}_2^R) = \frac{\sigma^2}{\sum x_i^2}.$$
Both of these results can be derived from the general form $\mathbb{V}(\boldsymbol{\hat{\beta}}) = \sigma^2 (\mathbf{x}^\text{T} \mathbf{x})^{-1}$ using the relevant design matrix $\mathbf{x}$ for the models with/without the intercept term (i.e., with/without a column of ones).
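If you want to see this concretely, here is a small sketch (with arbitrary illustrative data and error variance) that computes both variances from the general form and checks them against the closed-form expressions above:

```python
import numpy as np

# Illustrative data and error variance (arbitrary values, not from the question).
rng = np.random.default_rng(0)
x = rng.normal(size=20)
n, sigma2 = len(x), 2.5

# Model with intercept: design matrix has a column of ones, slope is the 2nd coefficient.
X_u = np.column_stack([np.ones(n), x])
var_unrestricted = sigma2 * np.linalg.inv(X_u.T @ X_u)[1, 1]

# Model without intercept: design matrix is just the x column.
X_r = x.reshape(-1, 1)
var_restricted = sigma2 * np.linalg.inv(X_r.T @ X_r)[0, 0]

# Agreement with the closed-form expressions.
assert np.isclose(var_unrestricted, sigma2 / (np.sum(x**2) - n * x.mean()**2))
assert np.isclose(var_restricted, sigma2 / np.sum(x**2))
```

Note that the restricted variance is never larger than the unrestricted one, since $\sum x_i^2 \geqslant \sum x_i^2 - n \bar{x}^2$.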
As to your latter question of whether $\hat{\beta}_2^R$ is biased, have a look at the theory of omitted variable bias. If the true intercept of the model is zero then, intuitively, assuming it is zero should not bias the estimator, and should improve our estimation. Conversely, if the true intercept is not zero then we would expect that assuming it to be zero might cause some problems. The formula for omitted variable bias should allow you to write the bias of your estimator as a function of the (unknown) true intercept term.
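A quick Monte Carlo sketch illustrates this. Since $\hat{\beta}_2^R = \sum x_i y_i / \sum x_i^2$, taking expectations under the true model gives bias $\beta_1 \sum x_i / \sum x_i^2$, which is zero exactly when the true intercept $\beta_1 = 0$. The parameter values below are arbitrary and chosen only for illustration:

```python
import numpy as np

# Monte Carlo check: the no-intercept slope estimator is biased when the
# true intercept beta_1 is nonzero (illustrative parameter values).
rng = np.random.default_rng(1)
beta1_true, beta2_true, sigma = 3.0, 1.5, 1.0
x = rng.uniform(1, 5, size=50)  # fixed design across replications

estimates = []
for _ in range(5000):
    y = beta1_true + beta2_true * x + rng.normal(scale=sigma, size=len(x))
    estimates.append(np.sum(x * y) / np.sum(x**2))  # restricted (no-intercept) OLS slope

empirical_bias = np.mean(estimates) - beta2_true
theoretical_bias = beta1_true * np.sum(x) / np.sum(x**2)
```

With these values the empirical bias closely matches the theoretical one, and setting `beta1_true = 0` makes it vanish.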
Some final notes on your working: It is worth pointing out that you are using non-standard notation for the intercept and slope terms in the model --- usually we would denote these as $\beta_0$ and $\beta_1$ respectively. Another thing to note is that the variance equations you have written cannot possibly be correct: firstly, because they include the random variable of interest, and secondly, because they make no reference to the variability of the error term in the model.