
Most statistics textbooks say that a point estimator $\hat\theta_n(\mathbf{X})$ is asymptotically normal if $$\sqrt{n}(\hat\theta_n - \theta) \, \overset{d}{\longrightarrow} \, \mathcal{N}(0, \sigma^2(\theta)), \qquad \forall \theta \in \Theta,$$ where $\sigma^2(\theta)$ is called the asymptotic variance of $\hat\theta_n$.

Moreover, if $\sigma(\theta)$ is a continuous function of $\theta$ on $\Theta$, we can also write

$$\frac{\sqrt{n}(\hat\theta_n - \theta)}{\sigma(\theta)} \, \overset{d}{\longrightarrow} \, \mathcal{N}(0, 1), \qquad \forall \theta \in \Theta$$
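To make this concrete, here is a minimal simulation sketch (my own illustration, not from any textbook), assuming an $\mathrm{Exponential}(\theta)$ sample with MLE $\hat\theta_n = 1/\overline{X}$, for which the asymptotic variance is $\sigma^2(\theta) = \theta^2$; the standardized quantity behaves approximately like a standard normal for large $n$:

```python
import numpy as np

# Assumed example (not from the post): MLE theta_hat = 1 / mean(X) for an
# Exponential(rate = theta) sample; its asymptotic variance is sigma^2(theta) = theta^2.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 500, 20_000

x = rng.exponential(scale=1 / theta, size=(reps, n))
theta_hat = 1 / x.mean(axis=1)

# Standardized quantity sqrt(n) * (theta_hat - theta) / sigma(theta)
z = np.sqrt(n) * (theta_hat - theta) / theta
print(z.mean(), z.std())             # roughly 0 and 1
print(np.mean(np.abs(z) < 1.96))     # roughly 0.95, as for N(0, 1)
```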

But Larry Wasserman, in his book "All of Statistics", gives the following definition of an asymptotically normal estimator:

An estimator is asymptotically Normal if $$\frac{\hat\theta_n - \theta}{\mathrm{SE}} \, \overset{d}{\longrightarrow} \, \mathcal{N}(0, 1)$$

So we have that $\mathrm{SE} = \frac{\sigma(\theta)}{\sqrt{n}}$.

Also, Wasserman earlier defines the standard error as $\mathrm{SE} = \sqrt{\mathrm{Var}_\theta(\hat\theta)}$.

Therefore, for any asymptotically normal estimator $\hat\theta_n$ I concluded that $$\mathrm{SE} = \sqrt{\mathrm{Var}_\theta(\hat\theta)} = \frac{\sigma(\theta)}{\sqrt{n}} \qquad (*)$$

I know that $(*)$ is true when $\hat\theta_n$ is the sample mean (in that case $\sigma(\theta) \equiv \sigma = \sqrt{\mathrm{Var}(X)}$).

But is $(*)$ true for any asymptotically normal estimator?

P.S. In some sources I have seen the quantity $\frac{\sigma(\theta)}{\sqrt{n}}$ called the "asymptotic standard error", and it is actively used in building confidence intervals. Also, please keep in mind that I am not talking about the estimated standard error $\widehat{\mathrm{SE}}(\hat\theta_n) = \sqrt{\widehat{\mathrm{Var}}(\hat\theta_n)}$, which is an observed value.


EDIT: mpiktas's answer seems to be correct.
Indeed, the equality $(*)$ holds only for the sample mean $\overline{X}$. If we take another asymptotically normal estimator, for example the sample variance $S^2 =\frac{1}{n-1} \sum_{i=1}^n (X_i - \overline{X})^2$, then the equality $(*)$ does not hold. It is easy to show.
On the one hand, $$\mathrm{SE}(S^2) = \sqrt{\mathrm{Var}(S^2)} = \sqrt{\frac{1}{n} \left(\mu_4 - \frac{n-3}{n-1}\sigma^4\right)}, \qquad (1)$$ where $\mu_4 = \mathrm{E}[(X_1 - \mu)^4]$ (see Wikipedia, for example).

On the other hand, $$\frac{\sigma(S^2)}{\sqrt{n}} = \frac{\sqrt{\mu_4 - \sigma^4}}{\sqrt{n}}, \qquad (2)$$ where $\sigma(S^2)$ is the square root of the asymptotic variance of $S^2$ (see the proof here).

Expressions (1) and (2) are not equal, i.e. $\mathrm{SE}(S^2) = \sqrt{\mathrm{Var}(S^2)} \neq \sigma(S^2)/\sqrt{n}$.

But we have the following convergence: $$\sqrt{n} \, \mathrm{SE}(S^2) \, \overset{n \to \infty}{\longrightarrow} \, \sigma(S^2) = \sqrt{\mu_4 - \sigma^4}.$$
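For a quick numerical sanity check (my own sketch, not part of the original derivation), assume $X \sim \mathrm{Exponential}(1)$, so that $\sigma^2 = 1$ and $\mu_4 = 9$; then (1) and (2) can be compared directly, and $\sqrt{n}\,\mathrm{SE}(S^2)$ indeed approaches $\sqrt{\mu_4 - \sigma^4} = \sqrt{8}$:

```python
import numpy as np

# Assumed example: X ~ Exponential(1), so sigma^2 = 1 and mu_4 = E[(X - mu)^4] = 9,
# hence sigma(S^2) = sqrt(mu_4 - sigma^4) = sqrt(8).
sigma2, mu4 = 1.0, 9.0

for n in (10, 100, 1000, 10_000):
    exact_se = np.sqrt((mu4 - (n - 3) / (n - 1) * sigma2**2) / n)  # formula (1)
    asym_se = np.sqrt(mu4 - sigma2**2) / np.sqrt(n)                # formula (2)
    print(n, exact_se, asym_se, np.sqrt(n) * exact_se)

# exact_se != asym_se for any finite n, but sqrt(n) * exact_se
# tends to sqrt(mu_4 - sigma^4) = sqrt(8) ≈ 2.828 as n grows.
```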

Rodvi

1 Answer


The answer is no.

Your definition is wrong. SE is calculated from data, while $\sigma(\theta)$ is not observed and is a theoretical concept. So there cannot be an equality.

The correct conclusion you should have drawn is the following:

$$\sqrt{n}\,\mathrm{SE} \;\longrightarrow\; \sigma(\theta)$$

for the asymptotically normal estimator $\hat\theta$. This is correct for all asymptotically normal estimators. (Note that there might be some weird cases where this does not hold, related to intricacies of the CLT, but since I cannot recall them at the moment I am putting this disclaimer just in case.)
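As a rough Monte Carlo sketch of this convergence (my own illustration, using the Exponential-rate MLE $\hat\theta_n = 1/\overline{X}$ as an assumed example, for which $\sigma(\theta) = \theta$):

```python
import numpy as np

# Assumed example: Monte Carlo check that sqrt(n) * SE(theta_hat) -> sigma(theta)
# for theta_hat = 1 / mean(X) with X ~ Exponential(rate = theta), where sigma(theta) = theta.
rng = np.random.default_rng(1)
theta, reps = 2.0, 10_000

for n in (20, 200, 2000):
    x = rng.exponential(scale=1 / theta, size=(reps, n))
    theta_hat = 1 / x.mean(axis=1)
    se = theta_hat.std()           # Monte Carlo estimate of SE(theta_hat) = sqrt(Var(theta_hat))
    print(n, np.sqrt(n) * se)      # approaches sigma(theta) = 2 as n grows
```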

mpiktas
  • Thanks for the answer. I want to clarify: the correct definition of SE is $\mathrm{SE}(\hat\theta_n) = \sqrt{\mathrm{Var}(\hat\theta_n)}$, and generally $\sqrt{\mathrm{Var}(\hat\theta_n)} \neq \frac{\sigma(\theta)}{\sqrt{n}}$, right? (We can only say that $\sqrt{n}\, \mathrm{SE}(\hat\theta_n) = \sqrt{n \cdot \mathrm{Var}(\hat\theta_n)} \,\longrightarrow \, \sigma(\theta)$ as $n \longrightarrow \infty$.) – Rodvi Aug 31 '18 at 14:11
  • It seems that you are correct! I investigated the sample variance $S^2$ to check your idea (see my edit to the post above). – Rodvi Aug 31 '18 at 20:51
  • Note that SE should be an estimate too for the definition to be useful. The whole point of asymptotic normality is to have a quantity that can be calculated and is approximately normal. – mpiktas Sep 07 '18 at 05:38