If I draw samples (of size n = 30) from a population, calculate each sample's mean, and repeat this N times, then a histogram of those N sample means converges to a normal distribution as N -> inf.
From my understanding, as n increases, that normal distribution has a smaller standard deviation (sigma/sqrt(n)). This makes sense because a larger sample estimates the population mean better than a smaller one does.
Is my understanding correct?
Edit: for clarification, I am using n and N as follows:
- n = number of data points in a sample
- N = number of times a sample (of size n) is drawn
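
To make the setup concrete, here is a minimal NumPy sketch of the experiment I have in mind (the exponential population, the seed, and the particular values of n and N are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Skewed (non-normal) population: exponential with mean 1 and std 1,
# chosen arbitrarily to show the population itself need not be normal.
pop_std = 1.0

N = 100_000  # number of times a sample (of size n) is drawn

for n in (5, 30, 200):  # different sample sizes to compare
    # Draw N samples of size n, then take each sample's mean.
    sample_means = rng.exponential(scale=1.0, size=(N, n)).mean(axis=1)
    print(
        f"n={n:4d}: std of sample means = {sample_means.std():.4f}, "
        f"sigma/sqrt(n) = {pop_std / np.sqrt(n):.4f}"
    )
```

Plotting each sample_means array as a histogram gives the bell shape, and the printed standard deviations shrink like sigma/sqrt(n) as n grows.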