
For a regression model where you are certain that y depends on some predictors but are agnostic about whether certain other predictors should enter, how should you incorporate this prior information? The elastic net penalizes a weighted sum of the absolute values and the squares of the regression coefficients, but you may want to penalize some coefficients more strongly than others. Has there been research on elastic net (or lasso or ridge) regression with uneven penalties across predictors? Assume that all predictors have been scaled to zero mean and unit variance.
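
To make the setup concrete, here is a minimal sketch of per-predictor penalties implemented by column rescaling: dividing column j by a weight w_j and fitting a plain lasso penalizes |beta_j| with weight w_j, since the substitution x_j/w_j, w_j*beta_j leaves the fit term unchanged while reweighting the penalty. The weights, alpha, and data below are purely illustrative, and I use scikit-learn's `Lasso` only because it is a standard implementation.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 1.0])
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# Illustrative penalty weights: light penalties on the predictors we are
# confident about, full penalties on the doubtful ones.
w = np.array([0.1, 0.1, 1.0, 1.0, 1.0])

# Dividing column j by w_j and fitting an unweighted lasso is equivalent
# to penalizing |beta_j| with weight w_j in the original parametrization.
fit = Lasso(alpha=0.1, fit_intercept=False).fit(X / w, y)
beta = fit.coef_ / w  # map coefficients back to the original scale
print(beta)
```

A weight of zero (no penalty at all on a predictor) cannot be reached by rescaling, since it would require an infinite column scale, but a small weight approximates it.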

Richard Hardy
Fortranner
  • Simply by rescaling the variables, you will change the weights (provided the underlying software does not automatically scale them). Thus, there's no new algorithm needed, nor any new statistical procedures. What kind of research results are you looking for, then? – whuber Jun 10 '19 at 18:13
  • Possible duplicate of [Lasso penalty only applied to subset of regressors](https://stats.stackexchange.com/questions/307099/lasso-penalty-only-applied-to-subset-of-regressors) – EdM Jun 10 '19 at 18:36
  • @whuber If different penalty factors are to be used for different variables, is there a data-driven way, such as cross-validation, to choose them? – Fortranner Jun 10 '19 at 19:27
  • How do you distinguish this from estimation of the parameters in the first place? – whuber Jun 10 '19 at 19:35
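
On the cross-validation question raised in the comments: one data-driven sketch is to score a small grid of candidate weight vectors by K-fold cross-validation, again using the rescaling trick. The candidate weights, alpha, and data below are assumptions for illustration, not a recommended procedure.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import KFold

def cv_error(X, y, w, alpha=0.1, n_splits=5):
    """Mean K-fold CV squared error of a lasso whose penalty on |beta_j|
    carries weight w_j, implemented by rescaling the columns by 1/w_j."""
    errs = []
    for tr, te in KFold(n_splits, shuffle=True, random_state=0).split(X):
        fit = Lasso(alpha=alpha, fit_intercept=False).fit(X[tr] / w, y[tr])
        beta = fit.coef_ / w  # back to the original scale
        errs.append(np.mean((y[te] - X[te] @ beta) ** 2))
    return float(np.mean(errs))

rng = np.random.default_rng(1)
X = rng.standard_normal((150, 4))
y = X @ np.array([1.5, -1.0, 0.0, 0.0]) + 0.5 * rng.standard_normal(150)

# Candidate weight vectors: lightly penalize the first two predictors,
# lightly penalize the last two, or treat all four evenly.
e_good = cv_error(X, y, np.array([0.1, 0.1, 1.0, 1.0]))
e_bad = cv_error(X, y, np.array([1.0, 1.0, 0.1, 0.1]))
e_flat = cv_error(X, y, np.ones(4))
```

Since here the first two predictors are the truly active ones, lightly penalizing them should give the lowest CV error; but whuber's last comment stands, because selecting one weight per predictor by CV is close in spirit to just estimating the coefficients themselves.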

0 Answers