
About simple regression:

It's well known that the usual OLS estimators of $\beta_0$ and $\beta_1$ have minimum variance among all linear unbiased estimators.

I wonder whether there are popular biased linear estimators with smaller variance, or unbiased nonlinear estimators, or biased nonlinear estimators with better properties such as smaller variance or mean-squared error.
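To make the question concrete, here is a minimal simulation sketch (my own illustration, not part of the original question) comparing the usual OLS slope with a ridge-shrunken slope in simple regression. The penalty `lam = 5.0` and the small true slope are arbitrary choices made so the bias-variance trade-off is visible:

```python
# Sketch: a biased (ridge) slope estimate can have smaller MSE than OLS.
# All parameter values below are illustrative assumptions, not from the post.
import numpy as np

rng = np.random.default_rng(0)
n, beta0, beta1, sigma = 20, 1.0, 0.5, 2.0
lam = 5.0  # ridge penalty (hypothetical tuning value)

ols_err, ridge_err = [], []
for _ in range(10_000):
    x = rng.normal(size=n)
    y = beta0 + beta1 * x + sigma * rng.normal(size=n)
    xc, yc = x - x.mean(), y - y.mean()        # center to remove the intercept
    b_ols = (xc @ yc) / (xc @ xc)              # OLS slope
    b_ridge = (xc @ yc) / (xc @ xc + lam)      # ridge slope: shrunk toward 0, biased
    ols_err.append((b_ols - beta1) ** 2)
    ridge_err.append((b_ridge - beta1) ** 2)

print("MSE OLS  :", np.mean(ols_err))
print("MSE ridge:", np.mean(ridge_err))
# Ridge trades a little bias for a larger cut in variance, so with a small
# true slope and noisy data its mean-squared error comes out below OLS here.
```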

Recently, I found these!

  1. Related question: OLS is BLUE. But what if I don't care about unbiasedness and linearity?

  2. Related comment: "It is probably the same narrow place of understanding that OLS is just a best linear unbiased estimator (BLUE), and there exist better biased or nonlinear examples like James-Stein estimator or LASSO estimator that minimizes MSE further."

    from Intuition behind why Stein's paradox only applies in dimensions $\ge 3$
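Since that comment points to the James-Stein estimator, here is a minimal sketch (my addition, under the standard setup of the linked question: one observation of a normal mean vector with unit variance) showing the positive-part James-Stein estimate beating the unbiased MLE in dimension $p \ge 3$:

```python
# Sketch: shrinking the raw observation toward zero lowers total MSE.
# p = 5 and the random true mean are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
p = 5
theta = rng.normal(size=p)  # true mean vector (any theta works for p >= 3)

mle_err, js_err = [], []
for _ in range(10_000):
    x = theta + rng.normal(size=p)               # X ~ N(theta, I)
    shrink = max(0.0, 1.0 - (p - 2) / (x @ x))   # positive-part James-Stein factor
    js = shrink * x
    mle_err.append(np.sum((x - theta) ** 2))
    js_err.append(np.sum((js - theta) ** 2))

print("total MSE, MLE        :", np.mean(mle_err))
print("total MSE, James-Stein:", np.mean(js_err))
# In this simulation the biased James-Stein estimate dominates the unbiased
# MLE; the theory says this holds for every theta once p >= 3.
```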

– KH Kim
  • non-linear estimators will only outperform OLS if the nature of the relation between $y$ and $x$ is nonlinear, i.e., the first assumption of OLS, $y = \beta_0 + \beta_1 x$, is invalid. – mzuba May 15 '12 at 08:43
  • I doubt it. "Linear estimator" here refers to the form $\sum_{i=1}^{n} k_i y_i$ given the $x_i$'s. Consider the case where the error terms are mixed normal; I don't think linear estimators would outperform nonlinear estimators in that case (see the simulation sketch after these comments). – KH Kim May 15 '12 at 15:54
    @mzuba, non-linear estimators will outperform OLS if the relationship between $y$ and $x$ is non-linear **in the parameters**. $y = \beta x^2 + \varepsilon$ is still a linear model. – Macro May 15 '12 at 18:38
  • @KHKim, I'm not quite sure I understand the question, but regularized regression coefficient estimators (e.g. LASSO, Ridge Regression) often outperform OLS in terms of mean-squared error despite being biased, although this assumes the underlying model is linear so I'm not sure this is related. – Macro May 15 '12 at 18:40
  • I think that, as Macro is saying, you can improve on least squares if the error terms are not Gaussian or if there are errors in the variables. Particularly in the errors-in-variables case, least squares is doing the wrong thing because it assumes no error in $x$ and minimizes the squared error in $y$. The Gauss–Markov theorem allows a possibly non-Gaussian error term but assumes no error in $x$. If you have error in $x$, nothing even guarantees that least squares will be unbiased! – Michael R. Chernick May 15 '12 at 22:37
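To illustrate KH Kim's mixed-normal point (the sketch referenced in the comment above): below is a minimal simulation, my addition rather than anything from the thread, where the errors are a 90/10 contaminated-normal mixture. Theil-Sen (the median of all pairwise slopes) stands in as the nonlinear competitor; the mixture proportions and outlier scale are arbitrary choices:

```python
# Sketch: with heavy-tailed mixed-normal errors, a nonlinear slope estimator
# can beat the OLS slope in MSE. All settings here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
n, beta0, beta1 = 30, 1.0, 0.5

def theil_sen_slope(x, y):
    # median over all pairwise slopes -- nonlinear in the y_i
    i, j = np.triu_indices(len(x), k=1)
    return np.median((y[j] - y[i]) / (x[j] - x[i]))

ols_err, ts_err = [], []
for _ in range(5_000):
    x = rng.normal(size=n)
    # mixed normal errors: 90% N(0, 1), 10% N(0, 10^2) outliers
    e = np.where(rng.random(n) < 0.9, rng.normal(size=n), 10 * rng.normal(size=n))
    y = beta0 + beta1 * x + e
    xc, yc = x - x.mean(), y - y.mean()
    ols_err.append(((xc @ yc) / (xc @ xc) - beta1) ** 2)
    ts_err.append((theil_sen_slope(x, y) - beta1) ** 2)

print("MSE OLS      :", np.mean(ols_err))
print("MSE Theil-Sen:", np.mean(ts_err))
# The outlier component inflates the variance of the linear OLS estimator,
# while the median-based slope is far less affected, so it wins here.
```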

0 Answers