Saying that $X_n$ is a 'consistent' estimator doesn't really clarify the reasoning here, since there are different types of consistency corresponding to the different types of convergence. You know that as $n \rightarrow \infty$ you have $\mathbb{E}(X_n) \rightarrow \frac{1}{2}$ and $\mathbb{V}(X_n) \rightarrow 0$, which establishes mean-square convergence. Mean-square convergence implies convergence in probability, so you are okay.
Alternatively, if you want to prove this from scratch, you can use Chebyshev's inequality:
$$\mathbb{P}\left( \left| X_n - \mathbb{E}(X_n) \right| > \epsilon \right) \leqslant \frac{\mathbb{V}(X_n)}{\epsilon^2}.$$
For the symmetric beta distribution here the moments are $\mathbb{E}(X_n) = \frac{1}{2}$ and $\mathbb{V}(X_n) = \frac{1}{4(2n+1)}$, so substituting these gives:
$$\mathbb{P}\left( \left| X_n - \frac{1}{2} \right| > \epsilon \right) \leqslant \frac{1}{\epsilon^2} \cdot \frac{1}{4(2n+1)}.$$
Hence, for any $\epsilon > 0$ we have:
$$\lim\limits_{n \rightarrow \infty} \mathbb{P}\left( \left| X_n - \frac{1}{2} \right| > \epsilon \right) = 0.$$
This is the requisite condition for convergence in probability, so $X_n \rightarrow \frac{1}{2}$ in probability.
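As a quick numerical sanity check, you can simulate this. The sketch below assumes $X_n \sim \text{Beta}(n, n)$, which is the symmetric parametrisation consistent with the variance $\frac{1}{4(2n+1)}$ used above; it compares the empirical exceedance probability against the Chebyshev bound as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1  # fixed tolerance for the event |X_n - 1/2| > eps

for n in [10, 100, 1000]:
    # Draw from the symmetric beta distribution Beta(n, n)
    draws = rng.beta(n, n, size=100_000)
    # Empirical estimate of P(|X_n - 1/2| > eps)
    empirical = np.mean(np.abs(draws - 0.5) > eps)
    # Chebyshev bound: V(X_n) / eps^2 = 1 / (eps^2 * 4 * (2n + 1))
    bound = 1 / (eps**2 * 4 * (2 * n + 1))
    print(f"n = {n:5d}   empirical = {empirical:.5f}   Chebyshev bound = {bound:.5f}")
```

Both the empirical probability and the bound shrink toward zero as $n$ increases, in line with the limit above (the Chebyshev bound is loose, so the empirical probability sits well below it).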