
We consider the least squares problem in the case where we have only one independent variable $x_i$ and one dependent variable $y_i$, with $n$ observations.

In the case of the linear fit, we want to estimate $y_i$ with a function $f(x_i,\mu) = \mu_0 + \mu_1 x_i$ by minimizing $\sum_i (y_i - f(x_i,\mu))^2$.

The solutions can be expressed in a simple form:

  • $\mu_1 = \dfrac{\operatorname{cov}(x,y)}{\operatorname{var}(x)}$ (both computed over $i = 1,\dots,n$)
  • $\mu_0 = \overline{y} - \mu_1\,\overline{x}$
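For concreteness, here is a minimal numerical sketch of these two formulas in Python/NumPy (the toy data is made up for illustration; `numpy.polyfit` is used only as a sanity check):

```python
import numpy as np

# toy data (made up for illustration)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# closed-form simple linear regression:
# mu_1 = cov(x, y) / var(x),  mu_0 = mean(y) - mu_1 * mean(x)
# (bias=True and np.var both use the population normalization 1/n,
#  so the n's cancel in the ratio)
mu_1 = np.cov(x, y, bias=True)[0, 1] / np.var(x)
mu_0 = y.mean() - mu_1 * x.mean()

# sanity check: np.polyfit returns coefficients highest degree first
slope, intercept = np.polyfit(x, y, deg=1)
print(mu_0, mu_1)          # closed form
print(intercept, slope)    # should match
```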

In the case of the quadratic fit, we have instead $f(x_i,\mu) = \mu_0 + \mu_1 x_i + \mu_2 x_i^2$.

Is there a way to express $\mu_0$, $\mu_1$ and $\mu_2$ in a similarly simple form?

Cloud Skywalker
To be frank, it's easier if you simply go to multiple linear regression; the solutions can be derived fairly readily and written quite simply; let $x_1=x$ and $x_2=x^2$ and [polynomial regression](http://en.wikipedia.org/wiki/Polynomial_regression#Matrix_form_and_calculation_of_estimates) is [multiple regression](http://en.wikipedia.org/wiki/Linear_regression#Least-squares_estimation_and_related_techniques). The parameter estimates in multiple regression are discussed in numerous posts on site. – Glen_b Jun 11 '15 at 11:20
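To illustrate the comment's suggestion, a minimal sketch of treating the quadratic fit as multiple regression in Python/NumPy (toy data made up for illustration):

```python
import numpy as np

# toy data (made up for illustration)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.0, 1.8, 4.9, 10.2, 17.1, 26.0])

# design matrix with columns 1, x, x^2 (i.e. x_1 = x, x_2 = x^2)
X = np.column_stack([np.ones_like(x), x, x**2])

# least-squares solution of X @ mu ≈ y
mu, *_ = np.linalg.lstsq(X, y, rcond=None)
print(mu)  # [mu_0, mu_1, mu_2]

# equivalent via polyfit (coefficients come highest degree first)
print(np.polyfit(x, y, deg=2)[::-1])
```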

1 Answer


Minimize $\sum_i (y_i-(\mu_0 + \mu_1 x_i + \mu_2 x_i^2))^2$: setting the partial derivatives with respect to $\mu_0$, $\mu_1$ and $\mu_2$ to zero yields the normal equations for the three coefficients.
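Written out for the quadratic case, these normal equations are:

$$
\begin{aligned}
\mu_0\, n + \mu_1 \sum_i x_i + \mu_2 \sum_i x_i^2 &= \sum_i y_i \\
\mu_0 \sum_i x_i + \mu_1 \sum_i x_i^2 + \mu_2 \sum_i x_i^3 &= \sum_i x_i y_i \\
\mu_0 \sum_i x_i^2 + \mu_1 \sum_i x_i^3 + \mu_2 \sum_i x_i^4 &= \sum_i x_i^2 y_i
\end{aligned}
$$

This is a $3\times 3$ linear system in $\mu_0, \mu_1, \mu_2$; solving it (e.g. by Cramer's rule) does give closed-form expressions, though they are considerably bulkier than in the linear case.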

There is a great series of videos on Khan Academy showing how to do this for simple linear regression; after working through it, you will understand how to derive the normal equations you wrote down. The rest is plain algebra.

If you don't want to do the math (which I nevertheless highly recommend doing), take a look at this paper.

jannic