Regularization reduces the magnitudes of the regression coefficients. I read that this helps reduce the variance of the model. Why exactly do smaller values of the coefficients lead to a lower variance model?
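For concreteness, here is a minimal sketch of the effect I'm asking about (using scikit-learn's `Ridge`; the simulated data, sample sizes, and penalty strength are illustrative assumptions, not from any particular source). Across repeated draws from the same data-generating process, the ridge coefficient estimates should vary less than the OLS estimates:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
n, p = 50, 10
beta = rng.normal(size=p)  # fixed "true" coefficients (illustrative)

ols_coefs, ridge_coefs = [], []
for _ in range(500):
    # Draw a fresh dataset from the same data-generating process
    X = rng.normal(size=(n, p))
    y = X @ beta + rng.normal(scale=2.0, size=n)
    ols_coefs.append(LinearRegression().fit(X, y).coef_)
    ridge_coefs.append(Ridge(alpha=10.0).fit(X, y).coef_)

# Sampling variance of each coefficient estimate, averaged over coefficients
print("mean OLS coefficient variance:  ", np.var(ols_coefs, axis=0).mean())
print("mean ridge coefficient variance:", np.var(ridge_coefs, axis=0).mean())
```

The ridge variances should come out noticeably smaller; that is the behavior I'd like to understand.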
There's a logical fallacy in the formulation of this question: it is *regularization* that reduces the variance, not the fact that many coefficients become smaller. In fact, regularization can (and often does) *increase* the magnitudes of some coefficients. – whuber Jan 15 '16 at 18:20
I thought it was also called shrinkage for the same reason -- that it shrinks the coefficients to smaller values. – Minaj Jan 15 '16 at 19:38
@whuber here is a statement from one of the textbooks on regression: "shrinkage methods -- techniques that constrain or regularize the coefficient estimates, or equivalently, that shrink the coefficient estimates towards zero." – Minaj Jan 15 '16 at 19:41
(quotation slightly paraphrased to suit the context here -- but that's the message). – Minaj Jan 15 '16 at 19:52
Shrinkage methods like ridge regression and the lasso shrink some norm of the entire $\beta$ vector. It is possible that the norm of the entire vector could be reduced yet some particular coefficients increase. – jld Jan 15 '16 at 21:50
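To make the last two comments concrete, here is a minimal sketch (again assuming scikit-learn, with a hypothetical two-predictor toy dataset) where ridge shrinks the norm of the whole coefficient vector while *increasing* the magnitude of one coefficient: `y` depends only on `x1`, but `x2` is nearly collinear with `x1`, so the penalty spreads the weight across the pair.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)  # x2 nearly collinear with x1
X = np.column_stack([x1, x2])
y = 2.0 * x1                         # y depends on x1 only

ols = LinearRegression(fit_intercept=False).fit(X, y).coef_
ridge = Ridge(alpha=1.0, fit_intercept=False).fit(X, y).coef_

print("OLS:  ", ols, " L2 norm:", np.linalg.norm(ols))     # ~ (2, 0), norm ~ 2
print("Ridge:", ridge, " L2 norm:", np.linalg.norm(ridge)) # ~ (1, 1), norm ~ 1.4
```

OLS recovers roughly $(2, 0)$, while ridge returns roughly $(1, 1)$: $|\beta_2|$ grows from about 0 to about 1 even though $\lVert\beta\rVert_2$ shrinks from about 2 to about 1.4.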