
How to calculate a variance-covariance matrix of coefficients for multivariate (multiple) linear regression?

Something like (equation below), but for the multivariate case.

More specifically, I'm interested in equations for the diagonal terms.

[Image: the coefficient-variance formula for the univariate case]

This question is different from:

How to derive variance-covariance matrix of coefficients in linear regression

because it asks about the multivariate regression case.

Michael D
    Does this answer your question? [How to derive variance-covariance matrix of coefficients in linear regression](https://stats.stackexchange.com/questions/68151/how-to-derive-variance-covariance-matrix-of-coefficients-in-linear-regression) – Pluviophile May 19 '20 at 08:56
  • @Pluviophile, No. On the above link, it's a univariate case. My question is on the multivariate case. – Michael D May 19 '20 at 09:15

1 Answer


The general equation is the same for both the univariate and multivariate cases. Assuming homoskedastic, uncorrelated errors, $V[Y] = \sigma^2 I$,

$$\begin{aligned} V[\hat{\beta}] &= V[(X^{T}X)^{-1}X^{T}Y]\\ &= (X^{T}X)^{-1}X^{T}V[Y]X(X^{T}X)^{-1}\\ &= \sigma^2(X^{T}X)^{-1} \end{aligned}$$
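As a numerical sanity check, the formula $\sigma^2(X^{T}X)^{-1}$ can be computed directly with numpy. This is a minimal sketch on simulated data (the sample size, true coefficients, and noise level are all illustrative assumptions), with $\sigma^2$ replaced by the usual unbiased residual-variance estimate:

```python
import numpy as np

# Hypothetical simulated data: n observations, k = 3 predictors plus intercept.
rng = np.random.default_rng(42)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 0.5, -2.0, 0.3])  # illustrative true coefficients
y = X @ beta_true + rng.normal(scale=1.5, size=n)

# OLS estimate and its covariance matrix, sigma^2 (X^T X)^{-1},
# with sigma^2 estimated by the unbiased residual variance RSS / (n - k - 1).
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat
s2 = resid @ resid / (n - k - 1)
cov_beta = s2 * np.linalg.inv(X.T @ X)

# The standard errors of the coefficients are the square roots of the diagonal.
se = np.sqrt(np.diag(cov_beta))
```

The diagonal of `cov_beta` gives the coefficient variances the question asks about; the off-diagonal entries are the covariances between coefficient estimates.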

Unfortunately, this inverse does not simplify nicely in general. Examining $X^{T}X$ (assuming the model includes an intercept), we see

$$X^{T}X = \begin{bmatrix} n & \sum_{i=1}^{n}x_{i1} & \sum_{i=1}^{n}x_{i2} & \ldots& \sum_{i=1}^{n}x_{ik}\\ \sum_{i=1}^{n} x_{i1} & \sum_{i=1}^{n} x_{i1}^2 & \sum_{i=1}^{n} x_{i1}x_{i2} & \ldots & \sum_{i=1}^{n}x_{i1}x_{ik} \\ \vdots & \vdots & \vdots & & \vdots\\ \sum_{i=1}^{n}x_{ik} & \sum_{i=1}^{n} x_{ik}x_{i1} & \sum_{i=1}^{n} x_{ik}x_{i2} & \ldots & \sum_{i=1}^{n} x_{ik}^2 \end{bmatrix}$$

which is symmetric but has no clean closed-form inverse for $k > 1$. Therefore the simple formulas for the univariate case do not carry over to the multivariate case.
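To see that the matrix formula does reduce to the familiar univariate expressions, here is a sketch (with hypothetical data and an assumed known $\sigma^2$) checking that, with a single predictor, the diagonal of $\sigma^2(X^{T}X)^{-1}$ equals the textbook variances $\sigma^2/S_{xx}$ for the slope and $\sigma^2(1/n + \bar{x}^2/S_{xx})$ for the intercept:

```python
import numpy as np

# Hypothetical data: n = 20 observations, one predictor plus intercept.
rng = np.random.default_rng(0)
n = 20
x = rng.normal(size=n)
sigma2 = 2.0  # assumed known error variance, for illustration only

X = np.column_stack([np.ones(n), x])          # design matrix with intercept
cov_beta = sigma2 * np.linalg.inv(X.T @ X)    # sigma^2 (X^T X)^{-1}

# Textbook univariate formulas recovered from the diagonal:
xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)
var_slope = sigma2 / Sxx
var_intercept = sigma2 * (1 / n + xbar**2 / Sxx)

print(np.allclose(cov_beta[1, 1], var_slope))      # True
print(np.allclose(cov_beta[0, 0], var_intercept))  # True
```

The agreement is exact (up to floating-point error), because for $k = 1$ the $2 \times 2$ matrix $X^{T}X$ can be inverted by hand; it is precisely this hand inversion that becomes intractable for larger $k$.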

David Luke Thiessen