
In *Parameter estimation and inference in the linear mixed effects model* (page 1923), the variance of the prediction error is given as

\begin{equation} \begin{aligned} \text{var}(\tilde{u} - u) & = \sigma^2G - \text{var}(\tilde{u}) \\ & = \text{var}(u) - \text{var}(\tilde{u}), \end{aligned} \end{equation}

where $\tilde{u} = GZ^\top H^{-1}(y - X\hat{\beta})$ is the best linear unbiased predictor (BLUP) of the random effects vector $u$. Here $G$ and $H$ are covariance matrices, $Z$ is a design matrix, $y$ is the vector of observations, and $\hat{\beta}$ is the maximum likelihood (ML) estimate of $\beta$.
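For reference, the setup is presumably the standard linear mixed model (this just restates the usual definitions, with the notation assumed to match the reference):

\begin{equation} y = X\beta + Zu + \varepsilon, \qquad \text{var}(u) = \sigma^2 G, \qquad \text{var}(\varepsilon) = \sigma^2 R, \end{equation}

so that $\text{var}(y) = \sigma^2\left(ZGZ^\top + R\right) = \sigma^2 H$ with $H = ZGZ^\top + R$ (often $R = I$), and $\text{cov}(y, u) = \sigma^2 ZG$.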

By definition,

\begin{equation} \text{var}(\tilde{u} - u) = \text{var}(u) + \text{var}(\tilde{u}) - 2\text{cov}(\tilde{u}, u), \end{equation}

so equating the two expressions for $\text{var}(\tilde{u} - u)$ gives $2\,\text{cov}(\tilde{u}, u) = 2\,\text{var}(\tilde{u})$, i.e. $\text{cov}(\tilde{u}, u) = \text{var}(\tilde{u})$. How can one show directly that $\text{cov}(\tilde{u}, u) = \text{var}(\tilde{u})$?
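For what it's worth, a quick Monte Carlo check seems consistent with this identity. The following is a minimal sketch assuming a random-intercept model with $R = I$ and known variance components (so $\hat{\beta}$ is the GLS estimate); the dimensions and parameter values are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative random-intercept model: m groups, n observations per group,
# variance components treated as known, R = I, var(u) = sigma^2 * G.
m, n = 10, 5
sigma, sigma_u = 1.0, 2.0
N = m * n

X = np.column_stack([np.ones(N), rng.normal(size=N)])  # fixed-effects design
Z = np.kron(np.eye(m), np.ones((n, 1)))                # random-intercept design
G = (sigma_u**2 / sigma**2) * np.eye(m)                # so var(u) = sigma^2 G
H = Z @ G @ Z.T + np.eye(N)                            # so var(y) = sigma^2 H
Hinv = np.linalg.inv(H)
beta = np.array([1.0, -0.5])

u_all, u_tilde_all = [], []
for _ in range(20000):
    u = rng.normal(scale=sigma_u, size=m)
    y = X @ beta + Z @ u + rng.normal(scale=sigma, size=N)
    # GLS estimate of beta (coincides with ML when H is known)
    beta_hat = np.linalg.solve(X.T @ Hinv @ X, X.T @ Hinv @ y)
    u_tilde = G @ Z.T @ Hinv @ (y - X @ beta_hat)      # BLUP of u
    u_all.append(u)
    u_tilde_all.append(u_tilde)

u_all, u_tilde_all = np.array(u_all), np.array(u_tilde_all)

# Compare cov(u_tilde_1, u_1) with var(u_tilde_1) across the replicates.
print(np.cov(u_tilde_all[:, 0], u_all[:, 0])[0, 1])
print(np.var(u_tilde_all[:, 0], ddof=1))
```

The two printed values agree up to Monte Carlo error.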

JLee
  • I strongly doubt your reference makes such an invalid *general* assertion about variances: are you sure you transcribed it correctly? Try as I might, I cannot find anything like it on p. 1923. – whuber Dec 31 '19 at 20:16
  • On page 1923 (part of Lemma 1) it is stated that $\text{var}(\tilde{u} - u) = \sigma^2G - \text{var}(\tilde{u})$, and $\text{var}(u) = \sigma^2G$ (see page 1922, Equation (6)). – JLee Dec 31 '19 at 20:27
  • That's crucial contextual information, because it completely changes what you are asking! – whuber Dec 31 '19 at 21:07
  • Okay yes, sorry. I have edited the question to add that piece of information. – JLee Dec 31 '19 at 21:16

1 Answer


We have that $$\mbox{cov}(u, \tilde u) = E \Bigl [ \bigl \{u - E(u) \bigr \} \, \bigl \{ \tilde u - E(\tilde u)\bigr \} \Bigr ].$$

But $E(\tilde u) = u$ and $E(u) = \tilde u$. Note that expectations are here taken with respect to the posterior of the random effects, not the prior. Hence, $$\mbox{cov}(u, \tilde u) = E \Bigl [ \bigl \{\tilde u - E(\tilde u) \bigr \} \, \bigl \{ \tilde u - E(\tilde u)\bigr \} \Bigr ] = \mbox{var}(\tilde u).$$
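For comparison, the identity can also be checked directly under the marginal (prior) distribution; this is a sketch assuming $\text{var}(y) = \sigma^2 H$ and $\text{cov}(y, u) = \sigma^2 ZG$, treating $\beta$ as fixed: $$\mbox{cov}(\tilde u, u) = GZ^\top H^{-1}\,\mbox{cov}(y, u) = \sigma^2 GZ^\top H^{-1} ZG,$$ $$\mbox{var}(\tilde u) = GZ^\top H^{-1}\,\mbox{var}(y)\, H^{-1} ZG = \sigma^2 GZ^\top H^{-1} H H^{-1} ZG = \sigma^2 GZ^\top H^{-1} ZG,$$ so the two expressions coincide. The same algebra goes through with $\beta$ replaced by the GLS/ML estimate $\hat\beta$, because $y - X\hat\beta = My$ with $M = I - X(X^\top H^{-1}X)^{-1}X^\top H^{-1}$, $MX = 0$, and $MHM^\top = MH$.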

Dimitris Rizopoulos
  • Using $E[\tilde{u}] = u$ and $E[u] = \tilde{u}$, should you not obtain $\mbox{cov}(u, \tilde{u}) = E\Big[\{ E(\tilde{u}) - \tilde{u}\}\{\tilde{u} - E(\tilde{u})\}\Big]$? – JLee Jan 01 '20 at 21:40