I'm messing around with the SVD to find a best-fit solution. The way I understand it (never having taken a stats class, only linear algebra) is that the SVD captures the variation in the data through its projection onto however many dimensions make up the data. So if A is some data matrix whose columns are the dimensions, then \begin{equation} A = U \Sigma V^*\;\; \end{equation}
If the data matrix A is made up of independent dimensions and has some corresponding output vector b, then you can assume some best-fit x satisfying \begin{equation} Ax = b \;\; , \end{equation} and multiplying both sides by the pseudoinverse, \begin{equation} V \Sigma^{-1} U^* (Ax) = V \Sigma^{-1} U^* (b) \;\; , \end{equation} eliminates A on the left, so \begin{equation} x = V \Sigma^{-1} U^* b \;\; . \end{equation}
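In case it helps, here's a minimal sketch of how I read those equations in NumPy (assuming A is tall with full column rank, so $\Sigma^{-1}$ exists on the thin SVD):

```python
import numpy as np

def svd_least_squares(A, b):
    # Thin SVD: A = U @ diag(s) @ Vt, where U is m x n, s has n entries, Vt is n x n
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # x = V Sigma^{-1} U^* b  (assumes no singular value is zero)
    return Vt.T @ ((U.T @ b) / s)
```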
I hope the equations are correct; I referred to my Python code, some of which is my own and some of which uses imported libraries.
My issue: if I make a tall, skinny A matrix with two columns (say x1 and x2) spanning a large range, and write an output vector b so that $b(x_1, x_2) = 7x_1 + 4x_2$, then my best-fit x is (7, 4) exactly. So if I dot my data matrix A with the vector (7, 4), it projects the inputs, sums them together, and gives the correct output.
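This is roughly what my working two-column test looks like (the ranges and sample size here are just placeholders; `svd_least_squares` is the sketch above):

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(-100, 100, 500)
x2 = rng.uniform(-100, 100, 500)

A = np.column_stack([x1, x2])     # tall, skinny data matrix
b = 7 * x1 + 4 * x2               # b(x1, x2) = 7*x1 + 4*x2

x_fit = svd_least_squares(A, b)
print(x_fit)                      # -> [7. 4.] (up to floating point)
print(np.allclose(A @ x_fit, b))  # -> True: A dotted with (7, 4) reproduces b
```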
But if I make a three-column matrix (x1, x2, x3) where $b(x_1, x_2, x_3) = 7x_1 + 4x_2 + 2x_3^2$, it does not find a best-fit x that solves for the output. I do get a vector that attempts to guess the outputs, but it fails. Does the SVD method I'm using to find the best-fit x only work for linear inputs? Is there a way to use this best-fit method for polynomials? A sketch of the failing case follows.
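Here is the failing three-column case as I understand it, again with made-up ranges and reusing `svd_least_squares` from above:

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(-100, 100, 500)
x2 = rng.uniform(-100, 100, 500)
x3 = rng.uniform(-100, 100, 500)

A = np.column_stack([x1, x2, x3])
b = 7 * x1 + 4 * x2 + 2 * x3**2      # quadratic in x3

x_fit = svd_least_squares(A, b)
print(x_fit)                          # a "best guess" vector, but not exact
print(np.linalg.norm(A @ x_fit - b))  # large residual: A @ x_fit can only ever be a
                                      # linear combination of the columns x1, x2, x3
```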