
I'm messing around with the SVD to find a best-fit solution. The way I understand it (having never taken a stats class, only linear algebra) is that the SVD captures the data's variation by projecting it onto however many dimensions make up the data. So if A is some data matrix and the columns of A are the dimensions, then \begin{equation} A = U \Sigma V^*\;\; \end{equation}

If the data matrix A has independent columns and some corresponding output vector b, then you can assume some best-fit x with \begin{equation} Ax = b \;\; , \end{equation} and multiplying both sides by $V \Sigma^{-1} U^*$, \begin{equation} V \Sigma^{-1} U^* (Ax) = V \Sigma^{-1} U^* (b) \;\; , \end{equation} eliminates A, so \begin{equation} x = V \Sigma^{-1} U^* (b) \;\; . \end{equation}
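In case it helps, here is a minimal NumPy sketch of the solve described above (the data matrix and coefficients (7, 4) are the ones from my two-column example below; the random inputs are just an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.uniform(-10, 10, size=(50, 2))   # tall, skinny data matrix (x1, x2)
b = 7 * A[:, 0] + 4 * A[:, 1]            # b(x1, x2) = 7*x1 + 4*x2

# Thin SVD: A = U @ diag(s) @ Vt, so x = V Sigma^{-1} U* b
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt.T @ ((U.T @ b) / s)

print(x)  # approximately [7, 4]
```

With a tall full-column-rank A, $V \Sigma^{-1} U^*$ is the pseudoinverse, which gives the least-squares solution.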

I hope the equations are correct; I referred to my Python code, which is partly my own and partly imported libraries.

My issue is this: if I make a tall, skinny A matrix with two columns (say x1 and x2) over a large range, and write an output vector b so that $b(x1,x2) = 7x1 + 4x2$, then my best-fit x is (7, 4) exactly. So if I dot my data matrix A with the vector (7, 4), it projects the inputs, sums them, and gives the correct output.

If I make a three-column matrix (x1, x2, x3) where $b(x1,x2,x3) = 7x1 + 4x2 + 2x3^2$, it does not find a best-fit x that solves for the output. I do get a vector that attempts to guess the outputs, but it fails. Does the SVD method I'm using to find the best-fit x only work for linear inputs? Is there a way to use this best-fit method for polynomials?

Krits
  • You are calculating linear regression. To handle polynomials you extend your inputs with nonlinear transformations of your original inputs, e.g. $x4=x3^2$ – seanv507 Jan 26 '21 at 06:19
  • Sorry, I had to edit my question, if you noticed. So my x3 column vector cannot be squared? Like, y = x^2 cannot be approximated with my use of the SVD? – Krits Jan 26 '21 at 06:29
  • No, your SVD solution is calculating *linear* regression: it will find the coefficients on your inputs that best approximate b. So for your three-column matrix you would need to add $x4=x3^2$, and the coefficients would be $x=(7,4,0,2)$ – seanv507 Jan 26 '21 at 11:11
  • Wow, it totally worked. Thank you! My best-fit vector came out as x = 7,4,0,3. Is there a quick intuitive reason why that worked? I can't really picture why. – Krits Jan 26 '21 at 17:53
  • maybe this explanation will help https://stats.stackexchange.com/questions/92065/why-is-polynomial-regression-considered-a-special-case-of-multiple-linear-regres/92087#92087 – bogovicj Jan 26 '21 at 19:38
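A minimal NumPy sketch of seanv507's suggestion: augment the data matrix with the nonlinear feature $x4 = x3^2$, then run the same SVD solve. The regression is still linear in the *coefficients*, so it can fit the squared term once it appears as a column (the random inputs are an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-10, 10, size=(50, 3))            # columns x1, x2, x3
b = 7 * X[:, 0] + 4 * X[:, 1] + 2 * X[:, 2] ** 2  # b = 7*x1 + 4*x2 + 2*x3^2

# Augment with the nonlinear feature x4 = x3^2
A = np.column_stack([X, X[:, 2] ** 2])

# Same SVD-based least-squares solve as before
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt.T @ ((U.T @ b) / s)

print(x)  # approximately [7, 4, 0, 2]
```

The model is nonlinear in the inputs but linear in the unknown coefficients, which is all the least-squares machinery requires.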

0 Answers