
I have a regression problem where I don't want the coefficients to be negative. Is setting the negative coefficients of OLS to zero the same as constraining the coefficients to be non-negative and solving the resulting quadratic problem with a convex optimiser?

I've thought about this from a geometric perspective. In OLS, $X\beta$ is the projection of $y$ onto the column space of $X$: the solution is the point where the residual $y - X\beta$ is perpendicular to that column space, so $\|y - X\beta\|$ is minimised. If $\beta$ is then altered by replacing its negative entries with zero (call the new vector $\theta$), is $y - X\theta$ still the residual with the smallest norm among all non-negative coefficient vectors?
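For concreteness, here is a minimal sketch one could run to compare the two approaches empirically; it uses `scipy.optimize.nnls` for the constrained fit, and the design matrix, true coefficients, and noise level are made up purely for illustration:

```python
# Sketch: compare (1) clipping negative OLS coefficients to zero with
# (2) non-negative least squares, i.e. the quadratic program
#     min ||y - X b||^2  subject to  b >= 0.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Simulated data; a near-duplicate column makes the two fits diverge visibly.
n, p = 100, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 0] + 0.1 * rng.normal(size=n)
beta_true = np.array([1.0, 0.5, -0.8])          # one truly negative effect
y = X @ beta_true + 0.1 * rng.normal(size=n)

# Approach 1: unconstrained OLS, then set negative coefficients to zero.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
beta_clip = np.clip(beta_ols, 0, None)

# Approach 2: non-negative least squares via quadratic optimisation.
beta_nnls, _ = nnls(X, y)

print("OLS         :", beta_ols)
print("OLS clipped :", beta_clip)
print("NNLS        :", beta_nnls)
print("RSS clipped :", np.sum((y - X @ beta_clip) ** 2))
print("RSS NNLS    :", np.sum((y - X @ beta_nnls) ** 2))
```

Comparing the residual sums of squares of the two fits would show whether clipping actually attains the minimum over the non-negative orthant.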

  • This can be solved with quadratic optimization. – user2974951 Jan 13 '21 at 06:56
  • I fixed your title – kjetil b halvorsen Jan 21 '21 at 22:06
  • See https://stats.stackexchange.com/questions/30565/how-to-obtain-covariance-matrix-for-constrained-regression-fit https://stats.stackexchange.com/questions/41168/constrained-regression-in-r-coefficients-positive-sum-to-1-and-non-zero-interc https://stats.stackexchange.com/questions/87559/least-square-regression-with-l1-regularization-and-non-negativity-constraint – kjetil b halvorsen Jan 21 '21 at 22:23
