
Suppose I want to make a linear fit to a dataset with vector input and output by minimizing the squared error. The squared-error objective would be $$E = \frac{1}{2}\sum_i \left\| W\vec{x}^{(i)} - \vec{y}^{(i)} \right\|^2$$ and I have to find the weight matrix $W$ that minimizes it. Can't I treat this as a collection of linear regression problems? For example, the squared error for the $j$'th component of the output, where $\vec{w}_j$ is the $j$'th row of $W$, would be $$\frac{1}{2} \sum_i \left( \vec{w}_j \cdot \vec{x}^{(i)} - y^{(i)}_j \right)^2$$ and each component's objective is independent of the others. So can I treat these as simple linear regression problems and find the rows of the weight matrix from the one-step formula of linear regression that uses the pseudoinverse of the input matrix?
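As a quick numerical check of this decomposition, here is a minimal sketch in R with made-up toy data (here the rows of `X` are the inputs $\vec{x}^{(i)}$, so what gets computed is $W^\top$):

```r
set.seed(1)
X <- matrix(rnorm(50 * 3), nrow = 50)                    # 50 inputs of dimension 3
W_true <- matrix(rnorm(3 * 2), nrow = 3)                 # true weights (transposed), 3 x 2
Y <- X %*% W_true + matrix(rnorm(50 * 2, sd = 0.1), nrow = 50)

# One-step solution for all output components at once:
# W^T = (X'X)^{-1} X'Y, i.e. the pseudoinverse of X applied to Y
Wt_joint <- solve(t(X) %*% X, t(X) %*% Y)

# The same thing, one output component (column of Y) at a time
Wt_cols <- sapply(seq_len(ncol(Y)), function(j) solve(t(X) %*% X, t(X) %*% Y[, j]))

all.equal(Wt_joint, Wt_cols, check.attributes = FALSE)   # TRUE
```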

I think I can, but I see other people use methods like gradient descent or particle swarm optimization to reach the minimum for this sort of problem with vector outputs, so I'm confused.


1 Answer


This is called the general linear model (not to be confused with the generalized linear model), also known as the multivariate linear regression model:

$$ \mathbf{ Y = X B + \varepsilon } $$

where $\mathbf{Y}$, $\mathbf{X}$, $\mathbf{B}$, and $\boldsymbol{\varepsilon}$ are all matrices. It is fitted with ordinary least squares, the same as linear regression. See also Linear model vs general linear model and Why do we need multivariate regression (as opposed to a bunch of univariate regressions)? and other related questions.
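Concretely, assuming $\mathbf{X}$ has full column rank, the ordinary least squares estimate is

$$ \hat{\mathbf{B}} = (\mathbf{X}^\top \mathbf{X})^{-1} \mathbf{X}^\top \mathbf{Y} $$

Because this expression acts on $\mathbf{Y}$ column by column, the $j$'th column of $\hat{\mathbf{B}}$ is exactly the univariate least-squares solution for the $j$'th column of $\mathbf{Y}$, which is precisely the component-wise decomposition you describe.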

In R you can use the lm function for this; just provide the dependent variable as a matrix. You can find an example of usage in the documentation.
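For instance, here is a minimal sketch with simulated data (the sizes and coefficient values are made up for illustration):

```r
set.seed(42)
X <- matrix(rnorm(100 * 2), ncol = 2)                      # 100 obs, 2 predictors
B <- matrix(c(1, -2, 0.5, 3, -1, 2), nrow = 2)             # true 2 x 3 coefficients
Y <- X %*% B + matrix(rnorm(100 * 3, sd = 0.1), ncol = 3)  # 3-column response

fit <- lm(Y ~ X)  # lm fits each column of Y by ordinary least squares
coef(fit)         # one column of estimates per response column

# Equivalent to fitting each response column separately:
fit1 <- lm(Y[, 1] ~ X)
all.equal(coef(fit)[, 1], coef(fit1))  # TRUE
```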

Tim