We consider the least squares problem in the case where there is a single independent variable $x_i$ and a single dependent variable $y_i$, with $n$ observations.
In the case of the linear fit, we want to estimate $y_i$ with a function $f(x_i,\mu) = \mu_0 + \mu_1 x_i$ by minimizing $\sum_{i=1}^{n} (y_i - f(x_i,\mu))^2$.
The solution can be expressed in a simple closed form (checked numerically in the sketch after the list):
- $\mu_1 = \frac{\operatorname{cov}(x,y)}{\operatorname{var}(x)}$, where the covariance and variance are taken over $i = 1..n$
- $\mu_0 = \overline{y} - \mu_1 \overline{x}$
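As a sanity check, here is a minimal sketch (assuming NumPy and synthetic data of my own choosing) comparing these closed-form expressions against a generic least-squares solver:

```python
import numpy as np

# Synthetic data (assumed for illustration): y ≈ 2 + 3x plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=50)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.5, size=50)

# Closed form: slope from cov/var, intercept from the means.
mu1 = np.cov(x, y, bias=True)[0, 1] / np.var(x)
mu0 = y.mean() - mu1 * x.mean()

# Reference: minimize ||A mu - y||^2 with design matrix A = [1, x].
A = np.column_stack([np.ones_like(x), x])
mu_ref, *_ = np.linalg.lstsq(A, y, rcond=None)

print(mu0, mu1)   # closed form
print(mu_ref)     # [mu0, mu1] from the generic solver; should agree
```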
In the case of the quadratic fit, we have instead $f(x_i,\mu) = \mu_0 + \mu_1 x_i + \mu_2 x_i^2$.
Is there a way to express $\mu_0$, $\mu_1$, and $\mu_2$ in a similarly simple form?
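For context, here is a minimal sketch (again assuming NumPy and synthetic data) of how I currently solve the quadratic fit numerically via the normal equations, which is what I would like to replace with closed-form expressions:

```python
import numpy as np

# Synthetic data (assumed for illustration): y ≈ 1 - 2x + 0.5x^2 plus noise.
rng = np.random.default_rng(1)
x = rng.uniform(-5.0, 5.0, size=50)
y = 1.0 - 2.0 * x + 0.5 * x**2 + rng.normal(0.0, 0.3, size=50)

# Design matrix with columns 1, x, x^2; minimize ||A mu - y||^2.
A = np.column_stack([np.ones_like(x), x, x**2])

# Normal equations: (A^T A) mu = A^T y.
mu = np.linalg.solve(A.T @ A, A.T @ y)
print(mu)  # [mu0, mu1, mu2]
```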