
Can I implement ridge regression in terms of OLS regression? Is it even possible?

I am interested because scikit-learn supports non-negative least squares (NNLS) but not non-negative ridge regression. So I'd like to transform my data so as to be able to call the underlying NNLS function while achieving ridge-regression functionality.
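For what it's worth, the standard data-augmentation trick maps ridge regression onto ordinary least squares: appending rows sqrt(lambda)·I to X and zeros to y makes the least-squares objective equal ||Xb − y||² + lambda·||b||². A minimal sketch of applying that trick to an NNLS solver, using `scipy.optimize.nnls` and made-up names (not scikit-learn's internal API):

```python
import numpy as np
from scipy.optimize import nnls


def ridge_nnls(X, y, lam):
    """Non-negative ridge via the augmented-data trick.

    Solves min_{b >= 0} ||X b - y||^2 + lam * ||b||^2
    by rewriting it as the plain NNLS problem
    min_{b >= 0} ||[X; sqrt(lam) I] b - [y; 0]||^2.
    """
    p = X.shape[1]
    # Stack sqrt(lam)*I under X and pad y with p zeros.
    X_aug = np.vstack([X, np.sqrt(lam) * np.eye(p)])
    y_aug = np.concatenate([y, np.zeros(p)])
    coef, _residual = nnls(X_aug, y_aug)
    return coef
```

When the unconstrained ridge solution happens to be nonnegative already, this should coincide with the closed-form ridge estimate `(X'X + lam*I)^{-1} X'y`, which gives an easy sanity check.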

amoeba
The Baron
  • Yes. See e.g. whuber's answer in http://stats.stackexchange.com/questions/69205. – amoeba Mar 25 '16 at 16:12
  • Also see [here](http://stats.stackexchange.com/questions/137057/phoney-data-and-ridge-regression-are-the-same/137072#137072) – Glen_b Mar 25 '16 at 16:29
  • How about a cross-validated form of ridge that selects the best lambda (the penalty coefficient)? – The Baron Mar 25 '16 at 16:30
  • Please see the answer in this [thread](http://stats.stackexchange.com/questions/203685); I think it addresses all your questions. – usεr11852 Mar 26 '16 at 03:55
  • What's maybe not quite clear from that linked thread is which NNLS solution would give you the correct result. There are two answers there: one proposes `lmridge_nnls = function (X, Y, lambda) nnls(A=crossprod(X)+lambda*diag(ncol(X)), b=crossprod(X,Y))$x` (which I think is the correct solution), and the other proposes `lmridge_nnls_rbind = function (X, Y, lambda) nnls(A=rbind(X,sqrt(lambda)*diag(ncol(X))), b=c(Y,rep(0,ncol(X))))$x` (which I think is not quite correct for the nonnegativity-constrained case; in the unconstrained case the two would be equivalent). – Tom Wenseleers Aug 29 '19 at 19:06
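The two R one-liners in the comment above can be compared numerically. A sketch in Python with `scipy.optimize.nnls` standing in for R's `nnls` (toy data of my own making): in a case where the unconstrained ridge solution is already nonnegative, no constraint is active and both formulations should agree with the closed-form ridge estimate; any disagreement between them could only show up when constraints bind.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 3))
y = X @ np.array([1.0, 0.5, 2.0]) + 0.01 * rng.normal(size=40)
lam = 0.5
p = X.shape[1]

# Formulation 1 (normal-equations form, like lmridge_nnls):
# NNLS on A = X'X + lam*I, b = X'y.
b1, _ = nnls(X.T @ X + lam * np.eye(p), X.T @ y)

# Formulation 2 (augmented-data form, like lmridge_nnls_rbind):
# NNLS on A = [X; sqrt(lam)*I], b = [y; 0].
b2, _ = nnls(np.vstack([X, np.sqrt(lam) * np.eye(p)]),
             np.concatenate([y, np.zeros(p)]))

# Closed-form (unconstrained) ridge estimate for reference.
ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
```

Here `ridge` comes out strictly positive, so both `b1` and `b2` reproduce it; the comment's caveat concerns what happens when some coefficient would be clipped at zero.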

0 Answers