
We know that the standard error is the standard deviation of the sample mean about the true mean. If we assume that our population follows a normal distribution, then the distribution of the sample mean will follow a t-distribution, where every point of the t-distribution along the x-axis is a possible sample mean one could get for a fixed sample size.

So is the standard deviation of the t-distribution equal to the standard error for a given data set? The two terms seem to express the same thing, namely the standard deviation of the sample mean about the true mean.

If I am wrong and have misunderstood something, can someone please explain it to me?

Karolis Koncevičius
  • Your assertions are not correct: when the population follows a Normal distribution, the sample mean also follows a Normal distribution. Moreover, when the population follows *any* distribution with finite variance $\sigma^2,$ then the mean of a sample of size $n$ follows a distribution with variance $\sigma^2/n.$ The Student t distribution plays no role. For an intuitive account of this, see https://stats.stackexchange.com/a/18609/919. – whuber Nov 20 '19 at 16:03
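
A minimal simulation sketch of whuber's point, in Python (the population parameters `mu` and `sigma` and the sample size `n` are illustrative assumptions, not values from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population parameters, chosen only for illustration.
mu, sigma = 10.0, 2.0
n = 25            # fixed sample size
reps = 100_000    # number of simulated samples

# Draw many samples of size n from a Normal population and record each sample mean.
sample_means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)

# The spread of the sample means matches sigma / sqrt(n), i.e. the sampling
# distribution of the mean is Normal with variance sigma^2 / n.
print("empirical SD of sample means:", sample_means.std(ddof=1))
print("theoretical sigma / sqrt(n): ", sigma / np.sqrt(n))
```

The two printed values should agree closely, confirming that the sample mean's standard deviation is $\sigma/\sqrt{n}$ rather than anything involving the Student t distribution.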

1 Answer


I guess you are half-right. To get the standard error, you divide the sample standard deviation by the square root of N. You seem to be considering the sample size only: samples with the same N can have different standard deviations, and their standard errors will then differ as well.
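
As a sketch of that point (the two samples below are made-up illustrations, not data from the question):

```python
import numpy as np

# Two hypothetical samples with the same size N but different spreads,
# to show that equal N does not imply equal standard error.
a = np.array([9.8, 10.1, 10.0, 9.9, 10.2])
b = np.array([5.0, 15.0, 8.0, 12.0, 10.0])

def standard_error(x):
    # Sample standard deviation (ddof=1) divided by the square root of N.
    return x.std(ddof=1) / np.sqrt(len(x))

print(standard_error(a))  # small spread -> small standard error
print(standard_error(b))  # same N, larger spread -> larger standard error
```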

Kang Inkyu