Based on the question Residual Sum of squares in Weighted regression, a fast way to compute $$(\mathbf{y-X\hat{\boldsymbol\beta}})^{'}\mathbf{C}^{-1}(\mathbf{y-X\hat{\boldsymbol\beta}})$$ is to transform the problem into a regular (unweighted) linear regression through the following procedure:
1) The Cholesky decomposition of $\mathbf{C}$ is $\mathbf{C=RR^{'}}$, so $\mathbf{C^{-1}=R^{'-1}R^{-1}}$.
2) The estimator of ${\boldsymbol\beta}$ can be written as $$\hat{\boldsymbol\beta}=(\mathbf{X^{'}C^{-1}X})^{-1}\mathbf{X^{'}C^{-1}y}=(\mathbf{X^{'}R^{'-1}R^{-1}X})^{-1}\mathbf{X^{'}R^{'-1}R^{-1}y}$$
3) Computing $\mathbf{A=R^{-1}X}$ and $\mathbf{B=R^{-1}y}$ with backsolve (i.e. solving the triangular systems $\mathbf{RA=X}$ and $\mathbf{RB=y}$), we get $$\hat{\boldsymbol\beta}=(\mathbf{A^{'}A})^{-1}\mathbf{A^{'}B},$$ so the weighted RSS equals the residual sum of squares (RSS) of the unweighted regression of $\mathbf{B}$ on $\mathbf{A}$.
In step 3) the backsolve operation is much faster than explicitly inverting a matrix, and the unweighted regression estimator $\hat{\boldsymbol\beta}$ can be computed very quickly with the $lm$ function, as in the sketch below.
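To fix ideas, here is a minimal R sketch of steps 1)-3) with toy stand-ins for $\mathbf{X}$, $\mathbf{y}$ and $\mathbf{C}$ (note that R's chol() returns the upper-triangular factor $\mathbf{U}$ with $\mathbf{C=U^{'}U}$, i.e. $\mathbf{R=U^{'}}$ in the notation above, hence the transposed backsolves):

```r
set.seed(1)
n <- 100; p <- 3
X <- cbind(1, matrix(rnorm(n * (p - 1)), n))   # toy design
y <- rnorm(n)                                  # toy response
M <- matrix(rnorm(n * n), n)
C <- crossprod(M) + diag(n)                    # toy positive definite covariance

U <- chol(C)                                   # upper triangular, C = U'U
A <- backsolve(U, X, transpose = TRUE)         # A = (U')^{-1} X, triangular solve only
B <- backsolve(U, y, transpose = TRUE)         # B = (U')^{-1} y

fit <- lm.fit(A, B)                            # unweighted regression of B on A
RSS <- sum(fit$residuals^2)                    # equals (y - X bhat)' C^{-1} (y - X bhat)

# slow direct check of the same quantity
bhat <- solve(crossprod(X, solve(C, X)), crossprod(X, solve(C, y)))
r    <- y - X %*% bhat
all.equal(RSS, drop(t(r) %*% solve(C, r)))
```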
My question is related to the case when $\mathbf{C}$ is a positive definite symmetric matrix of the form
$$\pmatrix{A & 0 & 0 & E \\ 0 & B & 0 & F \\ 0 & 0 & C & G \\ E^\prime & F^\prime & G^\prime & D}$$
where the blocks $A,B,C,D$ are positive definite symmetric matrices. Inverting this matrix can be done using the Schur complement, as in the question Inverse of block covariance matrix. However, using the Schur complement to find $\mathbf{C^{-1}}$ and then computing $(\mathbf{y-X\hat{\boldsymbol\beta}})^{'}\mathbf{C}^{-1}(\mathbf{y-X\hat{\boldsymbol\beta}})$ where $\hat{\boldsymbol\beta}=(\mathbf{X^{'}C^{-1}X})^{-1}\mathbf{X^{'}C^{-1}y}$ seems far less efficient than the Cholesky decomposition procedure.
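To spell out what I mean (the symbols $\mathbf{K}$, $\mathbf{U}$ and $\mathbf{S}$ below are just shorthand introduced here), the partition I have in mind is
$$\mathbf{C}=\pmatrix{\mathbf{K} & \mathbf{U}\\ \mathbf{U}^\prime & D},\qquad \mathbf{K}=\pmatrix{A & 0 & 0\\ 0 & B & 0\\ 0 & 0 & C},\qquad \mathbf{U}=\pmatrix{E\\ F\\ G},$$
with the Schur complement $\mathbf{S}=D-\mathbf{U}^\prime\mathbf{K}^{-1}\mathbf{U}$ appearing in the block expression for $\mathbf{C}^{-1}$.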
Is there a way to further simplify the Cholesky decomposition procedure for finding the RSS, given that the matrix $\mathbf{C}$ has the form above? For instance, is there an efficient way to compute the Cholesky decomposition of the sparse matrix $\mathbf{C}$?
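For concreteness, the kind of thing I have in mind is a sketch along the following lines, using the Matrix package's sparse Cholesky (CHOLMOD with a fill-reducing permutation). All block sizes, blocks, $\mathbf{X}$ and $\mathbf{y}$ below are made-up toy values, and the off-diagonal blocks are scaled down so that the toy matrix stays positive definite:

```r
library(Matrix)
set.seed(1)

# made-up block sizes; Eb, Fb, Gb stand for E, F, G above (plain "F" clashes with FALSE)
nb <- 200; nd <- 10; n <- 3 * nb + nd; p <- 4
mkspd <- function(m) { M <- matrix(rnorm(m * m), m); crossprod(M) / m + diag(m) }
A  <- mkspd(nb); B <- mkspd(nb); Cc <- mkspd(nb); D <- mkspd(nd)
Eb <- matrix(rnorm(nb * nd), nb) / nb
Fb <- matrix(rnorm(nb * nd), nb) / nb
Gb <- matrix(rnorm(nb * nd), nb) / nb
Z  <- matrix(0, nb, nb)

Cblock <- rbind(cbind(A,     Z,     Z,     Eb),
                cbind(Z,     B,     Z,     Fb),
                cbind(Z,     Z,     Cc,    Gb),
                cbind(t(Eb), t(Fb), t(Gb), D))
Csp <- forceSymmetric(Matrix(Cblock, sparse = TRUE))   # stores only the nonzero blocks

X <- cbind(1, matrix(rnorm(n * (p - 1)), n))           # toy design
y <- matrix(rnorm(n))                                  # toy response, n x 1

ch   <- Cholesky(Csp, perm = TRUE, LDL = FALSE)        # sparse Cholesky, fill-reducing ordering
CiX  <- as.matrix(solve(ch, X))                        # C^{-1} X via sparse triangular solves
Ciy  <- as.matrix(solve(ch, y))                        # C^{-1} y
bhat <- solve(crossprod(X, CiX), crossprod(X, Ciy))    # (X' C^{-1} X)^{-1} X' C^{-1} y
r    <- y - X %*% bhat
RSS  <- drop(crossprod(r, as.matrix(solve(ch, r))))    # (y - X bhat)' C^{-1} (y - X bhat)
```

Whether something along these lines, or a hand-written block Cholesky that exploits the arrowhead structure directly (e.g. via the Schur complement of the block-diagonal part), is the most efficient route is exactly what I am unsure about.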