
I know that the Central Limit Theorem states that for a random sample $X_1, \ldots, X_n$ of iid random variables, the variable $$ Z = \frac{\hat x - E(X)}{\operatorname{sd}(\hat x)} $$ is approximately $\mathcal N(0,1)$ when $n$ is large enough (where $\hat x$ denotes the sample average).

My question is: What about $\hat x$, does it also converge to a normal distribution?

gung - Reinstate Monica
Driss
    For intuition about the CLT, please see http://stats.stackexchange.com/questions/3734. The short answer (given by [laws of large numbers](http://stats.stackexchange.com/questions/22804/what-theories-should-every-statistician-know)) is no: $\hat x$ converges to the common expectation of the $X_i$. – whuber Aug 31 '14 at 16:28
  • Do you mean what does the *value* $\hat x$ converge to, or what does the *distribution* of $\hat x$ converge to? – gung - Reinstate Monica Aug 31 '14 at 16:46

1 Answer


The central limit theorem says the variable $Z$ converges in distribution to the standard normal.

$$Z=\frac{\hat{x}-E(X)}{\text{sd}(\hat{x})} \overset{D}{\rightarrow} N(0,1)$$

Rearranging this at a fixed sample size $n$ gives, approximately:

$$\hat{x} \sim N(E(X),\text{Var}(\hat{x})) $$

where $\text{Var}(\hat{x})=\frac{\text{Var}(X)}{n}$ is the variance of the sample mean. You can interpret this as saying that the distribution of $\hat{x}$ tends toward a normal distribution with mean $E(X)$ and variance $\frac{\text{Var}(X)}{n}$.
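
You can check this approximation with a quick simulation (a minimal sketch, not from the original answer; it assumes an Exponential(1) population, so $E(X)=1$ and $\text{Var}(X)=1$):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50            # sample size (assumed for illustration)
reps = 100_000    # number of simulated samples

# Exponential(1) population: E(X) = 1, Var(X) = 1
samples = rng.exponential(scale=1.0, size=(reps, n))
xbar = samples.mean(axis=1)   # one sample mean per replication

print("mean of sample means:", xbar.mean())      # ~ E(X) = 1
print("variance of sample means:", xbar.var())   # ~ Var(X)/n = 1/50 = 0.02

# Standardize: Z should look approximately N(0, 1)
z = (xbar - 1.0) / np.sqrt(1.0 / n)
print("P(Z <= 1.96):", (z <= 1.96).mean())       # close to 0.975 under normality
```

A histogram of `xbar` is visibly bell-shaped around $1$ even though the underlying exponential distribution is heavily skewed.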

For large values of $n$, this variance tends to $0$, which means that the approximating normal distribution of $\hat{x}$ collapses to a point mass at $E(X)$. Convergence in distribution to a constant implies convergence in probability to that constant. So

$$\hat{x} \overset{P}{\rightarrow} E(X)$$

Of course, this is just the weak law of large numbers. (The strong law states that the convergence is almost sure.)
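
To see the convergence in probability numerically (again just a sketch, using the same hypothetical Exponential(1) population), watch the sample mean settle toward $E(X)=1$ as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(1)

for n in (10, 100, 1_000, 10_000, 100_000):
    x = rng.exponential(scale=1.0, size=n)
    xbar = x.mean()
    # |xbar - E(X)| shrinks as n grows, matching Var(xbar) = Var(X)/n -> 0
    print(f"n = {n:>7}: sample mean = {xbar:.4f}, |error| = {abs(xbar - 1.0):.4f}")
```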

Comp_Warrior