
I am learning about different intervals and trying to get my head around them.

Given that a sample mean from a sample of size $n$ has a pdf $f(x)$, the 95% prediction interval $[a,b]$ is the region in which there is a 95% chance of observing the next sample mean. This satisfies $\int_a^b f(x)\,dx = 0.95$.

The 95% confidence interval $[u,v]$ is a region in which 95% of future sample means would lie. This satisfies $\int_u^v f(x)\,dx = 0.95$.
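As stated, the two integrals above are literally the same computation. Here is a minimal numerical sketch of that shared integral, assuming (purely for illustration) a normal population with hypothetical values $\mu = 50$, $\sigma = 10$, $n = 25$, so that the sample mean's pdf $f$ is the Normal($\mu$, $\sigma/\sqrt{n}$) density:

```python
# Minimal numerical sketch of the integral shared by both definitions above.
# The population parameters (mu, sigma) and sample size n are hypothetical;
# f(x) is taken to be the Normal(mu, sigma/sqrt(n)) pdf of the sample mean,
# exact for a normal population (approximate otherwise, by the CLT).
from scipy import stats
from scipy.integrate import quad

mu, sigma, n = 50.0, 10.0, 25           # hypothetical population and sample size
se = sigma / n ** 0.5                   # standard error of the sample mean
f = stats.norm(loc=mu, scale=se)        # distribution of the sample mean, pdf f(x)

# Central interval [a, b] capturing 95% of the sampling distribution
a, b = f.ppf(0.025), f.ppf(0.975)

# Check: the integral of f over [a, b] is 0.95
area, _ = quad(f.pdf, a, b)
print(f"[a, b] = [{a:.3f}, {b:.3f}], integral = {area:.4f}")  # ~0.95
```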

The equations for both are the same, but I'm not sure whether they are conceptually equivalent. Could someone please explain this in more detail?

Hugh
  • That is not the standard definition of a $95\%$ confidence interval, which is more along the lines of: applying the same methodology to repeated independent samples of the same size, $95\%$ of the confidence intervals so constructed would include the population mean. I also have doubts about your description of the prediction interval, which does not seem to depend on the actual value of the sample mean observed. A difference is that the prediction interval takes account of the uncertainty of the next sample mean compared with the population mean, in addition to this sample's. – Henry Oct 29 '14 at 21:54 (see the simulation sketch after these comments)
  • See http://stats.stackexchange.com/questions/16493/difference-between-confidence-intervals-and-prediction-intervals for more – Henry Oct 29 '14 at 21:55
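
As a rough illustration of the distinction Henry describes, here is a simulation sketch of the two standard definitions. It assumes a normal population with known $\sigma$ (purely for simplicity); the values of $\mu$, $\sigma$, $n$, and the number of replications are hypothetical choices:

```python
# Simulation sketch of the standard definitions from the comments above.
# Assumes a normal population with known sigma (for simplicity only);
# mu, sigma, n, and reps are hypothetical choices.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 50.0, 10.0, 25, 100_000
z = 1.959964                          # 97.5% standard normal quantile
se = sigma / np.sqrt(n)               # sd of one sample mean

xbar = rng.normal(mu, se, reps)       # observed sample means
xbar_next = rng.normal(mu, se, reps)  # the "next" sample means

# Confidence interval for mu: xbar +/- z*se.
# Coverage = fraction of intervals containing the fixed population mean mu.
ci_cover = np.mean(np.abs(xbar - mu) <= z * se)

# Prediction interval for the next sample mean: xbar +/- z*se*sqrt(2),
# since Var(Xbar_next - Xbar) = 2*sigma^2/n. It is wider than the CI
# because it must absorb the next sample's randomness as well.
pi_cover = np.mean(np.abs(xbar_next - xbar) <= z * se * np.sqrt(2))

print(f"CI coverage of mu:               {ci_cover:.4f}")  # ~0.95
print(f"PI coverage of next sample mean: {pi_cover:.4f}")  # ~0.95
```

Both coverages come out near $0.95$, but the intervals answer different questions: the confidence interval targets the fixed population mean, while the prediction interval (wider by a factor of $\sqrt{2}$ in this setting) targets the random next sample mean.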

0 Answers