I am trying to solve the following problem.

Prove that for $X_n \sim \operatorname{Beta}(n,n)$, $X_n$ converges in probability to $\frac{1}{2}$.

This is what I tried:

Since, as $n \rightarrow \infty$, $\mathbb{E}(X_n) \rightarrow \frac{1}{2}$ and $\operatorname{Var}(X_n) \rightarrow 0$,

it follows that $X_n$ is a consistent estimator of $\frac{1}{2}$ and hence converges in probability to $\frac{1}{2}$.

Is this correct?

1 Answer

Saying that $X_n$ is a 'consistent' estimator doesn't really clarify the reasoning here, since there are different types of consistency corresponding to the different types of convergence. What you know is that $\mathbb{E}(X_n) \rightarrow \frac{1}{2}$ and $\mathbb{V}(X_n) \rightarrow 0$ as $n \rightarrow \infty$, which shows that $X_n$ converges to $\frac{1}{2}$ in mean square. This form of convergence implies convergence in probability, so you are okay.
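
To spell out the mean-square step, note that

$$\mathbb{E}\left[ \left( X_n - \frac{1}{2} \right)^2 \right] = \mathbb{V}(X_n) + \left( \mathbb{E}(X_n) - \frac{1}{2} \right)^2 \rightarrow 0.$$

In fact, the second term is zero for every $n$, since the symmetric beta distribution has $\mathbb{E}(X_n) = \frac{1}{2}$ exactly.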

Alternatively, if you want to prove this from scratch, you can use Chebyshev's inequality:

$$\mathbb{P}\left( \left| X_n - \mathbb{E}(X_n) \right| > \epsilon \right) \leqslant \frac{\mathbb{V}(X_n)}{\epsilon^2}.$$

Substituting the moments of the symmetric beta distribution gives:

$$\mathbb{P}\left( \left| X_n - \frac{1}{2} \right| > \epsilon \right) \leqslant \frac{1}{\epsilon^2} \cdot \frac{1}{4(2n+1)}.$$
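
For reference, these moments follow from the standard beta formulas $\mathbb{E}(X) = \frac{\alpha}{\alpha+\beta}$ and $\mathbb{V}(X) = \frac{\alpha \beta}{(\alpha+\beta)^2 (\alpha+\beta+1)}$, with $\alpha = \beta = n$:

$$\mathbb{E}(X_n) = \frac{n}{2n} = \frac{1}{2}, \qquad \mathbb{V}(X_n) = \frac{n^2}{(2n)^2 (2n+1)} = \frac{1}{4(2n+1)}.$$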

Hence, for any $\epsilon > 0$ we have:

$$\lim\limits_{n \rightarrow \infty} \mathbb{P}\left( \left| X_n - \frac{1}{2} \right| > \epsilon \right) = 0.$$

This is the requisite condition to show that $X_n \rightarrow \frac{1}{2}$ in probability.
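
Not part of the proof, but if a numerical sanity check is useful, here is a minimal simulation sketch; the tolerance `eps` and the Monte Carlo sample size are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
eps = 0.05              # arbitrary tolerance
num_samples = 100_000   # arbitrary Monte Carlo sample size

for n in (1, 10, 100, 1000):
    x = rng.beta(n, n, size=num_samples)            # draws from Beta(n, n)
    empirical = np.mean(np.abs(x - 0.5) > eps)      # empirical estimate of P(|X_n - 1/2| > eps)
    chebyshev = 1 / (4 * (2 * n + 1) * eps ** 2)    # Chebyshev bound derived above
    print(f"n = {n:4d}   empirical {empirical:.4f}   Chebyshev bound {min(chebyshev, 1.0):.4f}")
```

The empirical probability should drop toward zero as $n$ grows, and it stays below the Chebyshev bound (capped at 1).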
