
How to prove the equivalence between constrained form and Lagrange form for lasso and ridge regression?

Given the lasso in constrained form: $$\underset{\beta}{\min}{\left(\frac{1}{2N}||y-x\beta||_2^2\right)} \quad \text{subject to} \quad ||\beta||_1 \leq t$$ and in Lagrange form: $$\underset{\beta}{\min}{\left(\frac{1}{2N}||y-x\beta||_2^2\right)} + \lambda||\beta||_1 $$ I have gone through a lot of material and tried to understand how these two forms are equivalent, but I still struggle to give a relatively rigorous proof. I guess the proof for ridge regression is similar to the one for the lasso, so I only post the equations for the lasso. Any comments that help would be appreciated.
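Partial attempt: one direction seems to follow without any duality argument. If $\hat\beta$ minimizes the Lagrange form for some $\lambda \geq 0$, then $\hat\beta$ also solves the constrained problem with $t = ||\hat\beta||_1$, since for any $\beta$ with $||\beta||_1 \leq t$, $$\frac{1}{2N}||y-x\beta||_2^2 \geq \frac{1}{2N}||y-x\hat\beta||_2^2 + \lambda\left(||\hat\beta||_1 - ||\beta||_1\right) \geq \frac{1}{2N}||y-x\hat\beta||_2^2.$$ The converse (going from a given $t$ to some $\lambda \geq 0$) is the part where I believe convexity and Lagrangian duality/KKT conditions are needed.

As a numerical sanity check (not a proof), here is a minimal sketch assuming `numpy` and `cvxpy` are available: solve the Lagrange form for an arbitrary $\lambda$, set $t = ||\hat\beta_\lambda||_1$, and confirm that the constrained form with that $t$ recovers the same solution.

```python
# Numerical check of the equivalence (assumes numpy and cvxpy are installed):
# solve the Lagrange form for a chosen lambda, read off t = ||beta_hat||_1,
# then solve the constrained form with that t and compare the two solutions.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
N, p = 100, 10
x = rng.standard_normal((N, p))
y = x @ rng.standard_normal(p) + 0.1 * rng.standard_normal(N)

lam = 0.1  # arbitrary penalty level

# Lagrange (penalized) form
b_pen = cp.Variable(p)
cp.Problem(cp.Minimize(cp.sum_squares(y - x @ b_pen) / (2 * N)
                       + lam * cp.norm1(b_pen))).solve()

# Constrained form, with t taken from the penalized solution
t = np.linalg.norm(b_pen.value, 1)
b_con = cp.Variable(p)
cp.Problem(cp.Minimize(cp.sum_squares(y - x @ b_con) / (2 * N)),
           [cp.norm1(b_con) <= t]).solve()

# The two estimates should agree up to solver tolerance
print(np.max(np.abs(b_pen.value - b_con.value)))
```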

FantasticAI
  • This is purely mathematics; it has nothing to do with the lasso or any other statistical estimator. It is the fundamental theory of finding the extreme points of a function under constraints using multipliers. That is the literature you should look up. – Alecos Papadopoulos Mar 11 '20 at 02:46
  • I really can't follow the logic of finding the extreme points under constraints. Could you please provide an intuitive example? Really appreciated – FantasticAI Mar 11 '20 at 03:14

0 Answers