
I checked all the books and online materials I could find for the proof, but all of them contain the same derivation step that I cannot understand.

To prove that the least squares estimator is the BLUE (best linear unbiased estimator) for the linear model $y = X\beta + v$, one assumes that $c = Cy$ is an arbitrary linear unbiased estimator of $\beta$. Using the fact that $c$ is unbiased, we can easily obtain $(CX - I)\beta = 0$.
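Spelling that step out (it uses the standard assumption $\mathbb{E}(v) = 0$, treating $X$ as fixed):
$$
\beta = \mathbb{E}(c) = C\,\mathbb{E}(y) = C\,\mathbb{E}(X\beta + v) = CX\beta,
\quad\text{i.e.}\quad (CX - I)\beta = 0.
$$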

Then all the books and online materials simply conclude that $CX - I$ must be $0$. I don't understand this at all. If the equality $(CX - I)\beta = 0$ had to hold for *any* $\beta$, then certainly we would have $CX - I = 0$. But I don't know any reason why $(CX - I)\beta = 0$ should hold for any $\beta$. Can someone explain this to me?

  • When you are doing regression, *you don't know what $\beta$ is.* If you want your procedure to be BLUE, then, *it must be BLUE regardless of the true value of $\beta$*. It's not enough for it to be BLUE for just some $\beta$ (such as your particular one) because even when you have made your estimate, *you still don't know what the true $\beta$ is.* – whuber Mar 29 '17 at 16:10

1 Answer


The OLS estimator is $$ \hat{\boldsymbol{\beta}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\boldsymbol{Y} $$

The class of linear unbiased estimators is $\tilde{\boldsymbol{\beta}} = \mathbf{C}\boldsymbol{Y}$, for $\mathbf{C} = \mathbf{f}(\mathbf{X})$ (that is, $\mathbf{C}$ is a matrix-valued function of $\mathbf{X}$), such that $$ \begin{align} \mathbb{E}(\tilde{\boldsymbol{\beta}} \mid \mathbf{X}) &= \boldsymbol{\beta} \\ \mathbf{C}\mathbb{E}(\boldsymbol{Y} \mid \mathbf{X}) &= \boldsymbol{\beta} \\ \mathbf{C}\mathbf{X}\boldsymbol{\beta} &=\boldsymbol{\beta} \end{align} $$ where the last step follows from the linear regression model $\mathbb{E}(\boldsymbol{Y} \mid \mathbf{X}) = \mathbf{X}\boldsymbol{\beta}$.

Hence, it follows that $$ \mathbf{C}\mathbf{X} = \boldsymbol{\iota}_K $$ where $\boldsymbol{\iota}_K$ is the identity matrix of size $K$.
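To spell out the final step (the one the comments below turn on): unbiasedness is required for *every* possible value of $\boldsymbol{\beta}$, not just the true one, so $(\mathbf{C}\mathbf{X} - \boldsymbol{\iota}_K)\boldsymbol{\beta} = \mathbf{0}$ must hold for all $\boldsymbol{\beta} \in \mathbb{R}^K$. Taking $\boldsymbol{\beta} = \mathbf{e}_j$, the $j$-th standard basis vector, for $j = 1, \dots, K$ gives $$ (\mathbf{C}\mathbf{X} - \boldsymbol{\iota}_K)\,\mathbf{e}_j = \mathbf{0}, $$ which says the $j$-th column of $\mathbf{C}\mathbf{X} - \boldsymbol{\iota}_K$ is zero. Since every column is zero, $\mathbf{C}\mathbf{X} = \boldsymbol{\iota}_K$.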

  • Yes, $CX\beta$ should be equal to $\beta$, but why should $CX$ be equal to $I$? – Alan Feb 17 '14 at 18:48
  • @Alan From the [definition of the identity matrix](http://en.wikipedia.org/wiki/Identity_matrix). $\boldsymbol{\iota}$ is the identity matrix if $\forall$ matrices $\mathbf{a}$, $\boldsymbol{\iota}\mathbf{a} = \mathbf{a}$. – tchakravarty Feb 17 '14 at 18:50
  • My question was: why should $(CX - I)\beta = 0$ hold for any $\beta$? Can't $\beta$ be a fixed parameter vector? – Alan Feb 17 '14 at 18:56
  • @Alan What difference does that make? – tchakravarty Feb 17 '14 at 19:17
  • I suppose $(CX - I)\beta = 0$ should hold only for the true parameter $\beta$. Then you CANNOT conclude that $CX - I = 0$. For example, suppose the true $\beta = 0$, $X = 1$, and $C = 2$. Then $(CX - I)\beta = 0$, but $CX = 2$, not $1$. – Alan Feb 17 '14 at 19:32
  • No, it holds for any unbiased estimator $\tilde{\beta}$. Since we are restricting ourselves to the class of unbiased estimators, that's sufficient. – jbowman Feb 17 '14 at 20:21
  • Yes, this should hold for any unbiased estimator $\tilde{\beta}$; a different $C$ gives a different estimator. But the $\beta$ in $(CX-I)\beta=0$ is the true parameter vector, not an estimator. – Alan Feb 17 '14 at 21:08
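As a quick numerical illustration of the point under discussion, here is a minimal sketch in Python/NumPy using the scalar example from the comments above ($X = 1$, $C = 2$): an estimator with $CX \neq I$ can happen to be unbiased at one particular value of $\beta$ (here $\beta = 0$) while being biased at every other value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Alan's counterexample: y = X*beta + v with X = 1 and C = 2.
# The estimator c = C*y has E[c] = C*X*beta = 2*beta, so it is
# unbiased only when the true beta happens to be 0.
X, C = 1.0, 2.0
for beta in (0.0, 1.0, -3.0):
    v = rng.normal(size=100_000)   # noise with E[v] = 0
    y = X * beta + v
    c = C * y
    print(f"true beta = {beta:+.1f}, mean of c = {c.mean():+.3f}")
```

The sample mean of $c$ tracks $2\beta$, so $c$ is unbiased only at the single point $\beta = 0$. Requiring unbiasedness for *every* $\beta$, as the definition of a linear unbiased estimator does, is exactly what rules out such a $C$ and forces $CX = I$.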