
Given a ridge regression variant where $y \in \mathbb{R}^n$, $s \in \mathbb{R}^d$, $X \in \mathbb{R}^{d \times n}$, and $C \in \mathbb{R}$ is a regularization parameter:

$w_{crr}= \arg\min_w \frac{1}{2}\|w\|^2 + C\|y - w^\top X\|^2 + (s^\top w)^2$

$w_{rr}= \arg\min_w \frac{1}{2}\|w\|^2 + C\|y - w^\top X\|^2$

How does this objective differ geometrically from the basic ridge regression objective, and how will the extra term $(s^\top w)^2$ affect the optimization process?

christopher
  • Adopting the notation I used at https://stats.stackexchange.com/a/164546/919, consider the augmented design matrix $$X^{*} = \pmatrix{X\\\nu I\\s^\prime}.$$ – whuber Sep 06 '20 at 16:03
  • @whuber I read your other answer! So how does augmenting with $s'$ change the interpretation of ridge regression? As you mentioned, adding $\nu I$ can help remove collinearity. – christopher Sep 06 '20 at 20:16
  • I don't think it changes the interpretation at all: it just changes the relative weights on the coefficients of $w.$ – whuber Sep 06 '20 at 20:43
  • @whuber Interesting, and how will that change in weights affect our optimization problem, especially if we view it in terms of overfitting? – christopher Sep 06 '20 at 20:50
  • The point of expressing Ridge Regression in terms of $X^{*}$ is that it demonstrates there is no change in the mathematical nature of the optimization: it's still a least squares solution. – whuber Sep 06 '20 at 21:35
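The augmented-matrix identity described in the comments can be checked numerically. The sketch below (my own illustration, not from the thread; names like `A`, `w_closed`, `w_aug` are mine) writes the penalized objective as an ordinary least-squares problem on a stacked system and compares its solution with the one obtained from setting the gradient of the original objective to zero. Here `A = X^\top` (rows are observations), and the $\nu I$ block uses $\nu = 1/\sqrt{2}$ so the ridge term matches the $\frac{1}{2}\|w\|^2$ in the question.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, C = 20, 5, 3.0
A = rng.normal(size=(n, d))   # design matrix with observations as rows (X^T above)
y = rng.normal(size=n)
s = rng.normal(size=d)

# Setting the gradient of (1/2)||w||^2 + C||y - A w||^2 + (s^T w)^2 to zero
# gives the linear system (I + 2C A^T A + 2 s s^T) w = 2C A^T y.
w_closed = np.linalg.solve(np.eye(d) + 2 * C * A.T @ A + 2 * np.outer(s, s),
                           2 * C * A.T @ y)

# The same minimizer via plain least squares on the augmented system:
# stack sqrt(C)*A, (1/sqrt(2))*I, and s^T; pad the response with zeros.
M = np.vstack([np.sqrt(C) * A, np.eye(d) / np.sqrt(2), s.reshape(1, -1)])
b = np.concatenate([np.sqrt(C) * y, np.zeros(d + 1)])
w_aug, *_ = np.linalg.lstsq(M, b, rcond=None)

print(np.allclose(w_closed, w_aug))  # True: both solve the same normal equations
```

This is exactly whuber's point: the extra row $s^\prime$ only adds one more "pseudo-observation" to the least-squares problem, so the optimization remains ordinary least squares.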

0 Answers