I was reading a blog post by the statistician William Briggs, and the following claim interested me, to say the least.
What do you make of it?
"What is a confidence interval? It is an equation, of course, that will provide you an interval for your data. It is meant to provide a measure of the uncertainty of a parameter estimate. Now, strictly according to frequentist theory—which we can even assume is true—the only thing you can say about the CI you have in hand is that the true value of the parameter lies within it or that it does not. This is a tautology, therefore it is always true. Thus, the CI provides no measure of uncertainty at all: in fact, it is a useless exercise to compute one."
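To make the frequentist reading concrete, here is a minimal simulation of my own (not from Briggs' post) of the standard 95% z-interval for a normal mean with known variance; the true mean, variance, sample size, and number of repetitions are all assumed for illustration. Over repeated samples, roughly 95% of the computed intervals cover the true mean, even though any single realized interval either contains it or does not, which is the distinction the quote turns on.

```python
import numpy as np

rng = np.random.default_rng(0)

mu_true = 5.0           # true parameter (fixed; unknown in practice)
sigma = 2.0             # known standard deviation, for simplicity
n = 30                  # sample size per experiment
n_experiments = 10_000  # number of repeated samples
z = 1.96                # two-sided 95% normal critical value

covered = 0
for _ in range(n_experiments):
    sample = rng.normal(mu_true, sigma, size=n)
    half_width = z * sigma / np.sqrt(n)
    lo, hi = sample.mean() - half_width, sample.mean() + half_width
    covered += lo <= mu_true <= hi

print(f"Coverage over repeated samples: {covered / n_experiments:.3f}")
# Prints roughly 0.95: the 95% describes the procedure's long-run
# coverage, not any single realized interval, which either contains
# mu_true or does not.
```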