I have a problem that I believe some machine learning technique should be able to solve, but I am new to machine learning and have no idea where to start.
So, I have multiple multivariate parameter vectors $\mathbf x$ and corresponding output vectors $\mathbf b$.
Each $\mathbf b$ was generated from the corresponding $\mathbf x$ by an unknown matrix $\mathbf A$, i.e. $\mathbf{Ax = b}$.
The $\mathbf x$ and $\mathbf b$ data I have are noisy; nevertheless, I would like to estimate the matrix $\mathbf A$ from them.
So the problem presumably amounts to solving
$$\operatorname*{arg\,min}_{\mathbf A} \| \mathbf{AX - B} \|_F$$
where the columns of $\mathbf X$ are the parameter vectors, the columns of $\mathbf B$ are the corresponding output vectors, and $\|\cdot\|_F$ denotes the Frobenius norm.
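For concreteness, here is a minimal NumPy sketch of the setup I have in mind. The dimensions, the noise level, and the use of `np.linalg.lstsq` are my own assumptions, not something I know to be the right approach:

```python
import numpy as np

# Assumed dimensions for illustration: A is m x n, X stacks the
# parameter vectors as columns (n x k), B the outputs (m x k).
rng = np.random.default_rng(0)
m, n, k = 5, 3, 100

A_true = rng.normal(size=(m, n))                 # unknown matrix to recover
X = rng.normal(size=(n, k))                      # parameter vectors as columns
B = A_true @ X + 0.1 * rng.normal(size=(m, k))   # noisy outputs

# Candidate approach: arg min_A ||A X - B||_F. Since lstsq solves
# systems of the form M y = c, transpose to X^T A^T = B^T.
A_hat, *_ = np.linalg.lstsq(X.T, B.T, rcond=None)
A_hat = A_hat.T

print(np.linalg.norm(A_hat - A_true))            # hopefully small
```

Is this transposed least-squares formulation the right direction, or is there a more standard technique for this?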
Can anyone guide me on how to estimate the matrix $\mathbf A$?