
In Casella & Berger, there is a theorem as follows:

Theorem 7.3.20. If $E_\theta W = \tau(\theta)$, $W$ is the best unbiased estimator of $\tau(\theta)$ if and only if $W$ is uncorrelated with all unbiased estimators of $0$.
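
To make the theorem concrete (this example is mine, not from the book): for $X_1,\dots,X_n$ i.i.d. $N(\mu,\sigma^2)$, the sample mean $\bar X$ is the best unbiased estimator of $\mu$, and $U = X_1 - X_2$ is an unbiased estimator of $0$; consistent with the theorem,

$$\operatorname{Cov}(\bar X,\, X_1 - X_2) \;=\; \operatorname{Cov}(\bar X, X_1) - \operatorname{Cov}(\bar X, X_2) \;=\; \frac{\sigma^2}{n} - \frac{\sigma^2}{n} \;=\; 0.$$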

If we assume that $E(X)=0$ means $X$ is orthogonal to the vector $(1,1,1,1,\dots)$, then "uncorrelated with every vector orthogonal to $(1,1,1,1,\dots)$" means lying in the span of $(1,1,1,1,\dots)$. Thus any such estimator $W$ must be constant for all $\mathbf{x}$.
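
To spell out how I am picturing this (a sketch under my own identification of estimators with vectors, which may be exactly where I go wrong): think of an estimator $U$ as the vector of its values $(U(x_1), U(x_2), \dots)$, read "$E(U)=0$" as "$U \perp \mathbf{1}$" where $\mathbf{1}=(1,1,1,1,\dots)$, and read "uncorrelated" as "orthogonal". Then

$$W \;\perp\; \{\,U : U \perp \mathbf{1}\,\} \;=\; \operatorname{span}(\mathbf{1})^{\perp} \quad\Longrightarrow\quad W \in \operatorname{span}(\mathbf{1}),$$

which is how I conclude that $W$ would have to be constant.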

What is wrong here?

EDIT:

The assumption I made comes from the following passage in Blitzstein's Introduction to Probability:

We can think of unconditional expectation as a projection too: $E(Y)$ can be thought of as $E(Y|0)$, the projection of $Y$ onto the space of all constants.
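
My reading of that passage (a sketch in my own notation; treating $L^2$ with the inner product $\langle X, Y\rangle = E(XY)$ is my assumption, not something stated in the book): the projection of $Y$ onto the space of constants is the constant closest to $Y$ in mean square,

$$\arg\min_{c \in \mathbb{R}} E\big[(Y - c)^2\big] \;=\; E(Y), \qquad \text{with } \; E\big[(Y - E(Y))\,c\big] = 0 \;\text{ for every constant } c.$$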

If the assumption I made above is false, can you elaborate on what this passage is trying to say and what the correct interpretation is?

EDIT: Please give feedback in words. I have quoted two statements from two well-known books and, with detailed reasoning, derived a contradictory result from them. How can I add detail or clarity to this question?

Hooman
  • What do you mean by "the line (1,1,1,1,...)"? – Jonny Lomond Feb 26 '22 at 18:54
  • The vector of constants; in $\mathbb{R}^2$, the vector $[1, 1]$. – Hooman Feb 26 '22 at 18:57
  • Everything is predicated on "if we assume...," but that assumption is generally false. – whuber Feb 26 '22 at 21:47
  • If one variable is constant, its correlation with any other variable is undefined, not zero. A way to see this is to note that its variance (SD) is necessarily zero, and the correlation calculation fails because of division by zero. – Nick Cox Feb 27 '22 at 10:16
  • @whuber I added a piece of text. Do you mean it's wrong? If yes, does "taking the mean of a random variable" have any geometric interpretation? – Hooman Feb 27 '22 at 12:41
  • Of course it does. But that's not relevant here. Where do you see any contradiction between the quoted statements? Elaborating on that will help us determine how to answer your question. As I see it, an "estimator of $0$" is a function of a random sample that has an expectation of zero *no matter what the underlying distribution might be.* Two such functions will be a bivariate random variable, so provided they are both nonconstant and have finite variances, they will have a correlation--and that correlation is wholly unrelated to their expectations, anyway. – whuber Mar 05 '22 at 19:47
  • Many related answers have been posted here already, so take a look at this [site search](https://stats.stackexchange.com/search?q=estimator+of+0+best+unbiased+estimator+is%3Aanswer+score%3A2). Reading a few of those posts might help you frame your question clearly and demonstrate it's a new one. The answer at https://stats.stackexchange.com/a/196996/919 looks right on target. – whuber Mar 05 '22 at 19:49

0 Answers