Orthogonalization via PCA and ridge regression are two common methods to account for multicollinearity in linear regression models. When would you use one over the other?
- This sounds suspiciously like a homework problem. – Zach Jul 19 '12 at 17:10
- This is not a homework problem, and I was hoping for something other than whatever works best empirically. Correct me if I'm wrong, but I believe ridge regression encodes a multivariate mean-0 normal prior on the regression parameters ... assuming that you have reason to believe the regression parameters should follow this prior, is there any a priori reason to prefer one to the other? – nan Jul 19 '12 at 18:11
- @nan, you are correct - ridge regression is equivalent to fitting a Gaussian linear model with a Gaussian prior on the $\beta$s. Your second sentence confuses me though - if you have reason to believe that this is a good prior, then, of course, you do have an a priori reason to prefer ridge regression. Maybe I've misunderstood your query. – Macro Jul 19 '12 at 18:38
- you may also find [this thread](http://stats.stackexchange.com/questions/32471/how-can-you-handle-unstable-beta-estimates-in-linear-regression-with-high-mul) useful. – Macro Jul 19 '12 at 18:47
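For reference, the ridge-as-Gaussian-prior equivalence discussed in the comments can be written out explicitly. With likelihood $y \mid \beta \sim \mathcal{N}(X\beta, \sigma^2 I)$ and prior $\beta \sim \mathcal{N}(0, \tau^2 I)$, the posterior mode (MAP estimate) is

$$\hat\beta_{\text{ridge}} = \arg\min_{\beta} \; \|y - X\beta\|_2^2 + \lambda \|\beta\|_2^2, \qquad \lambda = \sigma^2 / \tau^2,$$

whose closed form $(X^\top X + \lambda I)^{-1} X^\top y$ shows how the penalty stabilizes the inversion when multicollinearity makes $X^\top X$ near-singular.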
1 Answer
When the cross-validated error of one method is lower than the other. I would also look into lasso regression and elastic net regression.
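A minimal sketch of that comparison, assuming scikit-learn and NumPy: generate deliberately collinear synthetic data, then score PCA-then-OLS (orthogonalization), ridge, lasso, and elastic net by cross-validated error. The data-generating setup and the `models` dictionary are illustrative, not from the original post.

```python
# Compare four multicollinearity-handling approaches by cross-validated MSE.
# Hedged sketch: synthetic data and hyperparameters are placeholder choices.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression, Ridge, Lasso, ElasticNet
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
z = rng.normal(size=n)
# Three nearly collinear predictors built from the same latent variable z
X = np.column_stack([z + 0.05 * rng.normal(size=n) for _ in range(3)])
y = 2.0 * z + rng.normal(size=n)

models = {
    "pca+ols": make_pipeline(StandardScaler(), PCA(n_components=1), LinearRegression()),
    "ridge": make_pipeline(StandardScaler(), Ridge(alpha=1.0)),
    "lasso": make_pipeline(StandardScaler(), Lasso(alpha=0.1)),
    "enet": make_pipeline(StandardScaler(), ElasticNet(alpha=0.1, l1_ratio=0.5)),
}
# Higher (less negative) neg_mean_squared_error is better
scores = {name: cross_val_score(m, X, y, cv=5, scoring="neg_mean_squared_error").mean()
          for name, m in models.items()}
best = max(scores, key=scores.get)
print(best, {k: round(-v, 3) for k, v in scores.items()})
```

In practice the hyperparameters (`alpha`, `l1_ratio`, the number of PCA components) would themselves be chosen by nested cross-validation rather than fixed as above.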

Zach