Consider the linear model $Y = X \beta + e$, $e \sim N(0, V(\theta))$, where $Y$ is an $n \times 1$ response vector, $X$ is an $n \times p$ full-rank design matrix, and $V(\theta)$ is the covariance matrix. I drop $\theta$ below for notational convenience.
Let $B$ be an $n \times (n-p)$ matrix whose columns form an orthonormal basis of $C(X)^{\perp}$. REML maximizes the likelihood of $B^T Y$, which can be written as
$$ L(\theta) = (2 \pi)^{-(n-p)/2} |B^T V B|^{-1/2} \text{exp} \{- \frac{1}{2} Y^T B (B^T V B)^{-1} B^T Y \} \tag{1} $$ where $|A|$ is the determinant of matrix $A$.
This can be shown to be equivalent to
$$L(\theta) = (2 \pi)^{-(n-p)/2} |X^T X|^{1/2} |V|^{-1/2} |X^T V^{-1} X|^{-1/2} \text{exp} \{- \frac{1}{2} Y^T V^{-1} (I - Q) Y \} \tag{2} $$ where $Q = X (X^T V^{-1} X)^{-1}X^T V^{-1}$.
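As a sanity check (not a proof), the two expressions (1) and (2) can be compared numerically on random data. This is a sketch using NumPy/SciPy; the dimensions $n = 8$, $p = 3$ and the random constructions of $X$, $V$, and $Y$ are my own choices:

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)
n, p = 8, 3
X = rng.standard_normal((n, p))          # full-rank design (almost surely)
A = rng.standard_normal((n, n))
V = A @ A.T + n * np.eye(n)              # symmetric positive definite covariance
y = rng.standard_normal(n)

B = null_space(X.T)                      # orthonormal basis of C(X)^perp, n x (n-p)
Vinv = np.linalg.inv(V)
Q = X @ np.linalg.inv(X.T @ Vinv @ X) @ X.T @ Vinv

# Likelihood (1), based on B^T Y
M = B.T @ V @ B
L1 = ((2 * np.pi) ** (-(n - p) / 2)
      * np.linalg.det(M) ** -0.5
      * np.exp(-0.5 * y @ B @ np.linalg.inv(M) @ B.T @ y))

# Likelihood (2), based on V, X, and Q
L2 = ((2 * np.pi) ** (-(n - p) / 2)
      * np.linalg.det(X.T @ X) ** 0.5
      * np.linalg.det(V) ** -0.5
      * np.linalg.det(X.T @ Vinv @ X) ** -0.5
      * np.exp(-0.5 * y @ Vinv @ (np.eye(n) - Q) @ y))

print(np.isclose(L1, L2))
```

The two values agree to floating-point precision, consistent with the claimed equivalence.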
I am able to prove that $$I - Q = V B (B^T V B)^{-1} B^T \tag{3} $$ by showing that both sides are the same projection operator onto $C(VB)$ along $C(X)$, which establishes the equality of the exponential factors.
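Identity (3) can also be checked numerically; a small sketch (the random $X$ and $V$ below are my own constructions, not part of the model):

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(1)
n, p = 8, 3
X = rng.standard_normal((n, p))          # full-rank design (almost surely)
A = rng.standard_normal((n, n))
V = A @ A.T + n * np.eye(n)              # symmetric positive definite

B = null_space(X.T)                      # orthonormal basis of C(X)^perp
Vinv = np.linalg.inv(V)
Q = X @ np.linalg.inv(X.T @ Vinv @ X) @ X.T @ Vinv
rhs = V @ B @ np.linalg.inv(B.T @ V @ B) @ B.T

print(np.allclose(np.eye(n) - Q, rhs))   # I - Q matches V B (B^T V B)^{-1} B^T
```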
However, I don't know how to prove
$$|B^T V B| = |X^T X|^{-1} |V| |X^T V^{-1} X| \tag{4} $$
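For what it is worth, identity (4) does hold numerically; a quick check with random matrices (again, the specific $n$, $p$, $X$, $V$ are my own choices):

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(2)
n, p = 8, 3
X = rng.standard_normal((n, p))          # full-rank design (almost surely)
A = rng.standard_normal((n, n))
V = A @ A.T + n * np.eye(n)              # symmetric positive definite

B = null_space(X.T)                      # orthonormal basis of C(X)^perp
Vinv = np.linalg.inv(V)

lhs = np.linalg.det(B.T @ V @ B)
rhs = (np.linalg.det(V) * np.linalg.det(X.T @ Vinv @ X)
       / np.linalg.det(X.T @ X))
print(np.isclose(lhs, rhs))
```

So the identity is not in doubt; what I am missing is an algebraic proof.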
Any suggestions would be appreciated.