
I know that the computation of the parameter estimates in standard OLS can be made more efficient (and more numerically stable) using a QR decomposition:

i.e. if $X=QR$ and we are using the model $Y=X\beta+\epsilon$,

then it is true that $R\hat{\beta}=Q^TY$, so $\hat{\beta}$ can be computed by back-substitution against the triangular matrix $R$.
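For concreteness, here is a small sketch of the QR approach in NumPy/SciPy (the data and true coefficients below are made up for illustration):

```python
import numpy as np
from scipy.linalg import solve_triangular

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=100)

# Reduced QR decomposition: X = QR with R upper triangular
Q, R = np.linalg.qr(X)

# Solve R beta = Q^T y by back-substitution (no normal equations formed)
beta_qr = solve_triangular(R, Q.T @ y)

# Agrees with the normal-equations solution (X^T X)^{-1} X^T y
beta_ne = np.linalg.solve(X.T @ X, X.T @ y)
assert np.allclose(beta_qr, beta_ne)
```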

My question is, is there an equivalent QR decomposition trick for the generalised least squares estimator:

$\hat{\beta}=(X^TWX)^{-1}X^TWY$

JDoe2

1 Answer


If by "QR decomposition trick" you mean "can one use a QR decomposition to fit weighted least squares?", then the answer is yes, provided one knows the value of $W$ and $W$ is positive definite.

If $W$ is positive definite, take a Cholesky factor of $W$, call it $\text{chol}(W)$, so that $\text{chol}(W)^T\text{chol}(W)=W$. Then define $\tilde{X}=\text{chol}(W)X$ and $\tilde{Y}=\text{chol}(W)Y$, and note that $\tilde{X}^T\tilde{X}=X^TWX$ and $\tilde{X}^T\tilde{Y}=X^TWY$. So taking the QR decomposition of $\tilde{X}$ and solving the resulting triangular system, exactly as in the OLS case, gives the weighted estimator.

In practice the positive definite restriction on $W$ is not too restrictive. In generalized linear models, for example, $W$ is a diagonal matrix of weights calculated from the previous iteration of the weighted least squares solution. Those weights are strictly positive by construction, so this sidesteps the positive-definiteness issue.
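A minimal sketch of the steps above in NumPy/SciPy, using a diagonal $W$ as in the IRLS example (the data and weights are invented for illustration):

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(1)
n, p = 200, 3
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(size=n)

# A positive definite weight matrix (diagonal here, as in IRLS)
w = rng.uniform(0.5, 2.0, size=n)
W = np.diag(w)

# Upper-triangular Cholesky factor C with C^T C = W
# (scipy.linalg.cholesky returns the upper factor by default)
C = cholesky(W)

# Transformed problem: ordinary least squares on (X_tilde, y_tilde)
X_t, y_t = C @ X, C @ y

# Now apply the standard QR trick to the transformed data
Q, R = np.linalg.qr(X_t)
beta_qr = solve_triangular(R, Q.T @ y_t)

# Same estimate as the closed-form (X^T W X)^{-1} X^T W y
beta_gls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
assert np.allclose(beta_qr, beta_gls)
```

For a diagonal $W$ the Cholesky factor is just $\text{diag}(\sqrt{w_i})$, so in practice one often scales the rows of $X$ and $y$ by $\sqrt{w_i}$ directly instead of forming $W$ explicitly.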

Lucas Roberts