
I have found the maximum likelihood estimator $\hat{\sigma}$ based on iid random variables $X_1, \ldots, X_n$, each normally distributed with known mean $\mu$ and unknown variance $\sigma^2$.

So $\hat{\sigma}$ turns out to be $\sqrt{\frac{1}{n} \displaystyle\sum_{i=1}^n (X_i - \mu)^2}$. Now if it weren't for the square root sign I'd have no problem working out $\mathbb{E}[\hat{\sigma}]$. Could someone please help?
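(In case it helps, I got this the standard way, by setting the derivative of the log-likelihood in $\sigma$ to zero:)
$$ \ell(\sigma) = -n\log\sigma - \frac{n}{2}\log(2\pi) - \frac{1}{2\sigma^2}\sum_{i=1}^n (X_i-\mu)^2, \qquad \ell'(\sigma) = -\frac{n}{\sigma} + \frac{1}{\sigma^3}\sum_{i=1}^n (X_i-\mu)^2 = 0 \;\Longrightarrow\; \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^n (X_i-\mu)^2. $$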

Arcane1729

1 Answer


The random variable $Q:=\sum_{i=1}^n {\left(\frac{X_i - \mu}{\sigma}\right)}^2$ has a Chi-squared distribution with $n$ degrees of freedom. Denote by $f$ its pdf, which you can find on Wikipedia or many other places. It is given by $$ f(x) = C x^{\frac{n}{2}-1}\exp\bigl(-\frac{x}{2}\bigr) $$ where $C=\frac{1}{2^{\frac{n}{2}}\Gamma\bigl(\frac{n}{2}\bigr)}$ is a constant (not depending on $x$).
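As a quick sanity check of this claim (not part of the original argument), one can simulate $Q$ from normal samples and compare its histogram with the chi-squared density; the values of n, mu and sigma below are arbitrary illustration values:

set.seed(1)                               # arbitrary seed, for reproducibility
n <- 5; mu <- 3; sigma <- 2               # arbitrary illustration values
Q <- replicate(10000, sum(((rnorm(n, mu, sigma) - mu) / sigma)^2))
hist(Q, breaks = 50, freq = FALSE)        # empirical density of Q
curve(dchisq(x, df = n), add = TRUE)      # chi-squared(n) density on top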

You are looking for the expectation of the random variable $$ R= \sqrt{\frac{1}{n} \displaystyle\sum_{i=1}^n (X_i - \mu)^2}. $$ Once you get the pdf of $R$, say $g$, you can get the expectation of $R$ by calculating $E(R) = \int y g(y)\,\mathrm{d}y$.
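Before deriving $g$, note that $E(R)$ can also be estimated directly by Monte Carlo from normal samples, which gives a value to compare the final formula against (this snippet is only an illustration; n, mu and sigma are arbitrary):

set.seed(1)
n <- 5; mu <- 3; sigma <- 2
R <- replicate(10000, sqrt(mean((rnorm(n, mu, sigma) - mu)^2)))
mean(R)    # Monte Carlo estimate of E(R)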

The random variable $R$ is a function of $Q$, namely $R=h(Q)$ where $h(x)=\sigma \sqrt{\frac{1}{n}x}$. This function is an increasing one-to-one map from $[0, \infty)$ to $[0, \infty)$, therefore you can use the change-of-variables formula to get the pdf of $R$. Denoting this pdf by $g$, it is given by $$ g(y) = {(h^{-1})}'(y)\times f\bigl(h^{-1}(y)\bigr). $$ The inverse of $h$ is $h^{-1}(y) = \frac{n}{\sigma^2}y^2$. Setting $\lambda=\frac{n}{\sigma^2}$ for notational simplicity, one gets (for $y > 0$) $$ \begin{align*} g(y) & = 2 \lambda y \times f\bigl(\lambda y^2\bigr) \\ & = 2 C\lambda^{\frac{n}{2}} y^{n-1} \exp \bigl(-\frac{\lambda y^2}{2}\bigr), \end{align*} $$ and one finally has to calculate $$ E(R) = 2 C\lambda^{\frac{n}{2}} \int_0^\infty y^{n} \exp \bigl(-\frac{\lambda y^2}{2}\bigr)\mathrm{d}y. $$
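As a check of this expression for $g$ (not in the original answer), one can compare it with the density obtained by applying the same change of variables directly to R's dchisq, which plays the role of $f$ here; n and sigma are arbitrary values:

n <- 5; sigma <- 2
lambda <- n / sigma^2
C <- 1 / (2^(n/2) * gamma(n/2))
g <- function(y) 2 * C * lambda^(n/2) * y^(n-1) * exp(-lambda * y^2 / 2)
y <- c(0.5, 1, 2, 3)
cbind(derived = g(y), via_dchisq = 2 * lambda * y * dchisq(lambda * y^2, df = n))  # columns should match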

By the change of variables $x=\alpha y^2$, $$ \begin{align*} \int_0^\infty y^{n} \exp \bigl(-\alpha y^2\bigr)\mathrm{d}y & = \frac{1}{2} \alpha^{-\frac{n+1}{2}} \int_0^\infty x^{\frac{n-1}{2}} \exp \bigl(-x\bigr)\mathrm{d}x \\ & = \frac{1}{2} \alpha^{-\frac{n+1}{2}} \Gamma\Bigl(\frac{n+1}{2} \Bigr). \end{align*} $$ Thus one finally gets $$ \begin{align*} E(R) & = \frac{1}{2^{\frac{n}{2}}\Gamma\bigl(\frac{n}{2}\bigr)} {\left(\frac{n}{\sigma^2}\right)}^{\frac{n}{2}} {\left(\frac{1}{2}\frac{n}{\sigma^2}\right)}^{-\frac{n+1}{2}}\Gamma\Bigl(\frac{n+1}{2} \Bigr) \\ & = \frac{\sqrt{2}\sigma}{\sqrt{n}}\frac{\Gamma\Bigl(\frac{n+1}{2} \Bigr)}{\Gamma\bigl(\frac{n}{2}\bigr)}. \end{align*} $$
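The integral identity used here can itself be verified numerically before running the simulation check below; the values of n and alpha are arbitrary:

n <- 5; alpha <- 0.7
integrate(function(y) y^n * exp(-alpha * y^2), 0, Inf)$value   # numerical value
0.5 * alpha^(-(n + 1)/2) * gamma((n + 1)/2)                    # closed form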

Checking:

> n <- 5
> sigma <- 2
> Q <- rchisq(10000, n)
> h <- function(x) sigma*sqrt(x/n)
> R <- h(Q)
> mean(R)
[1] 1.903887
> sqrt(2)*sigma/sqrt(n)*gamma((n+1)/2)/gamma(n/2)
[1] 1.903066
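A further check (also not in the original answer) is to compute $E(R)=\int y\,g(y)\,\mathrm{d}y$ by numerical integration, with n and sigma as in the simulation above, and compare it with the closed form:

lambda <- n / sigma^2
C <- 1 / (2^(n/2) * gamma(n/2))
g <- function(y) 2 * C * lambda^(n/2) * y^(n-1) * exp(-lambda * y^2 / 2)
integrate(function(y) y * g(y), 0, Inf)$value              # numerical E(R)
sqrt(2) * sigma / sqrt(n) * gamma((n + 1)/2) / gamma(n/2)  # closed form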
Stéphane Laurent