When performing linear regression, glmnet apparently standardizes the dependent variable ($y$) vector to have unit variance before it runs the regression, and then unstandardizes the resulting intercept and coefficients. I assume the standardization is achieved by dividing each $y_i$ by the standard deviation of the $y$ vector.
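In other words, I am assuming glmnet uses the "1/N" variance formula mentioned in the vignette quoted below, i.e. it fits on $\tilde{y}_i = y_i / \hat{\sigma}_y$, where
$$\hat{\sigma}_y = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(y_i - \bar{y}\right)^2}.$$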
If I run glmnet with a pre-standardized $y$, how do I unstandardize the resulting equation?
(Note that I am currently running my program and glmnet on pre-standardized $x$ variables, so I don't have to worry about reversing the $x$-variable standardization that glmnet also performs.)
I thought that I could simply unstandardize by multiplying each coefficient and the intercept by the standard deviation of the $y$ vector. This does not work: the "unstandardized" equation does not match the result I get when I run glmnet on the same non-standardized $y$. The only time multiplying by the standard deviation works is when I run glmnet with $\lambda = 0$ (which effectively reduces the fit to an ordinary least squares fit).
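Here is a minimal sketch of the comparison I'm describing (the data and the value `lam = 0.1` are made up purely for illustration):

```r
library(glmnet)

set.seed(1)
n <- 100; p <- 5
x <- scale(matrix(rnorm(n * p), n, p))   # x is pre-standardized, as described above
y <- rnorm(n)

# standard deviation of y using the "1/N" variance formula
sd_y <- sqrt(mean((y - mean(y))^2))

lam <- 0.1
fit_std <- glmnet(x, y / sd_y, lambda = lam)   # fit on pre-standardized y
fit_raw <- glmnet(x, y,        lambda = lam)   # fit on the original y

# my attempted unstandardization: multiply intercept and coefficients by sd_y
coef(fit_std) * sd_y   # does not match coef(fit_raw) unless lambda = 0
coef(fit_raw)
```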
I am recreating glmnet in another language as an exercise. When I run my program and glmnet on pre-standardized $y$, I get the same result. I do not get the same result when $y$ is not pre-standardized.
My information on standardization comes from the glmnet vignette:
"Note that for “family=gaussian”, glmnet standardizes y to have unit variance before computing its lambda sequence (and then unstandardizes the resulting coefficients); if you wish to reproduce/compare results with other software, best to supply a standardized y first (Using the “1/N” variance formula)."