
I can use the central limit theorem and say that for large $N$ the distribution of the sample mean resembles a Gaussian, which is why the relative error falls off as $1/\sqrt{N}$, but is there some intuitive way of understanding it?

DDDAD
  • what kind of uniform distribution? – Taylor Nov 21 '17 at 14:16
  • 2
    Is this what you are asking about? https://stats.stackexchange.com/questions/3734/what-intuitive-explanation-is-there-for-the-central-limit-theorem – whuber Nov 21 '17 at 14:38
  • @Nived relative error of what? Do you mean the standard error of the mean of samples from a uniform? https://en.wikipedia.org/wiki/Variance#Basic_properties – Glen_b Nov 22 '17 at 01:57
  • @Glen By relative error, I mean the standard deviation divided by the number of observations. As the error in the sample mean goes as $\sqrt(N)$, the 'relative error' goes as 1/$\sqrt(N)$ – DDDAD Nov 22 '17 at 02:17
  • I don't follow; how does "the error in the sample mean" go as $\sqrt{N}$? What definition of error are you using? – Glen_b Nov 22 '17 at 03:54
  • I've defined error as the standard deviation – DDDAD Nov 22 '17 at 04:58

1 Answer


I don't know of a perfectly intuitive explanation. I just remember that when I first learnt the central limit theorem, I was puzzled by the $1/\sqrt N$ term, even though it actually has a simple explanation that does not require understanding the full theorem.

Consider $N$ independent variables of variance $\sigma^2$. The variance of the sum is the sum of the individual variances: $N\sigma^2$. The standard deviation of the sum is $\sqrt{N\sigma^2}=\sqrt N\sigma$. The average is the sum divided by $N$, so its standard deviation is $\sqrt N\sigma/N=\sigma/\sqrt N$. The standard deviation of the average thus decreases as $1/\sqrt N$: how much the empirical average differs from the true mean shrinks as $1/\sqrt N$.
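You can check this scaling numerically. Here is a small NumPy sketch of my own (the sample sizes and number of repetitions are arbitrary): for each $n$, it averages $n$ uniform draws many times and compares the observed spread of those averages with the predicted $\sigma/\sqrt n$.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = np.sqrt(1 / 12)  # standard deviation of Uniform(0, 1)

for n in [100, 400, 1600]:
    # 20000 independent experiments, each averaging n uniform draws
    means = rng.random((20000, n)).mean(axis=1)
    print(n, means.std(), sigma / np.sqrt(n))
```

Each time $n$ is quadrupled, the observed standard deviation of the average roughly halves, matching $\sigma/\sqrt n$.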

Everything relies on the fact that for independent variables the variance of a sum is the sum of the variances. This is the mathematical statement describing how independent errors add: sometimes constructively, sometimes destructively, with the net effect that the squares sum.
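This additivity itself is easy to verify empirically. A quick sketch (my own, with arbitrary sample sizes): draw two independent uniform samples and compare the variance of their sum with the sum of their variances.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.random(200_000)
y = rng.random(200_000)

# For independent X and Y: Var(X + Y) = Var(X) + Var(Y)
print((x + y).var())          # close to...
print(x.var() + y.var())      # ...this (both near 2/12)
```

With correlated variables the cross term $2\,\mathrm{Cov}(X,Y)$ would spoil the equality; independence is what kills it.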

It works like sound. When you hear two (uncorrelated) noises of the same loudness, the combined amplitude is $\sqrt 2$ times that of one, because constructive and destructive interference alternate. A choir of $N$ people is only $\sqrt N$ times as loud as a single singer.
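The choir analogy can be sketched the same way (again my own illustration, with Gaussian noise standing in for the voices): summing $N$ uncorrelated signals gives an RMS amplitude that grows like $\sqrt N$, not $N$.

```python
import numpy as np

rng = np.random.default_rng(2)

def voice(samples=100_000):
    # One "singer": uncorrelated noise with RMS amplitude 1
    return rng.standard_normal(samples)

for n in [1, 4, 16]:
    total = sum(voice() for _ in range(n))
    rms = np.sqrt((total ** 2).mean())
    print(n, rms)  # grows like sqrt(n): roughly 1, 2, 4
```

If the voices were perfectly correlated (everyone singing the exact same waveform), the amplitudes would add directly and the choir would be $N$ times as loud.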

Benoit Sanchez