I am learning about different intervals and trying to get my head around them.
Given that a sample mean from a sample of size $n$ has a pdf $f(x)$, the 95% prediction interval $[a,b]$ is the region where there is a 95% chance of observing the next sample mean. This satisfies $\int_a^b f(x)\, dx = 0.95$.
The 95% confidence interval $[u,v]$ is a region in which 95% of future sample means would lie. This satisfies $\int_u^v f(x)\, dx = 0.95$.
The equations for both are the same, but I'm not sure whether they are conceptually equivalent. Could someone please explain this in more detail?
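To make the question concrete, here is a small simulation sketch I put together (the values of `mu`, `sigma`, and `n` are made up, and I assume normal data with known $\sigma$). It computes, for many repeated samples, a 95% confidence interval targeting the fixed population mean and a 95% prediction interval targeting the *next* sample mean:

```python
# Sketch comparing the two intervals, assuming N(mu, sigma) data with
# known sigma. mu, sigma, n are arbitrary illustration values.
import random
import math

random.seed(0)
mu, sigma, n = 10.0, 2.0, 25
z = 1.96                       # 95% standard-normal quantile
se = sigma / math.sqrt(n)      # standard error of one sample mean

trials = 20000
ci_hits = pi_hits = 0
for _ in range(trials):
    xbar = sum(random.gauss(mu, sigma) for _ in range(n)) / n
    next_xbar = sum(random.gauss(mu, sigma) for _ in range(n)) / n

    # Confidence interval: targets the fixed parameter mu.
    if xbar - z * se <= mu <= xbar + z * se:
        ci_hits += 1

    # Prediction interval: targets the random next sample mean.
    # Var(next_xbar - xbar) = 2*sigma^2/n, hence the sqrt(2) factor.
    half = z * se * math.sqrt(2)
    if xbar - half <= next_xbar <= xbar + half:
        pi_hits += 1

print(f"CI coverage of mu:        {ci_hits / trials:.3f}")
print(f"PI coverage of next mean: {pi_hits / trials:.3f}")
```

Both coverages come out near 0.95, yet the intervals have different widths and different targets, which is what makes me suspect the two definitions above are not really the same thing.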