Questions tagged [lars]

LARS stands for Least Angle Regression. It is a penalized estimation and feature selection technique for multiple regression.

LARS is closely related to the LASSO, which constrains the sum of the absolute values of the regression coefficients to be no more than a specified amount. The LARS algorithm can be understood as re-estimating the regression model step by step while slowly relaxing the LASSO constraint.

The result is analogous to a forward selection algorithm in that the first variable included is the one most strongly associated with the response, and as the constraint is relaxed, additional variables are included in descending order of their strength of association.
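The forward-selection intuition above can be sketched in a few lines of Python. This toy example (hypothetical data; all names illustrative) uses incremental forward stagewise selection, a close relative of LARS rather than the full LARS algorithm: at each step, the predictor most correlated with the current residual is nudged forward, so the most strongly associated variable enters first.

```python
# Toy sketch of the forward-selection intuition behind LARS, using
# incremental forward stagewise selection (a close relative of LARS,
# not the full LARS algorithm). Pure Python, made-up data.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def standardize(v):
    # center to mean 0 and scale to unit (population) standard deviation
    n = len(v)
    m = sum(v) / n
    c = [x - m for x in v]
    s = (sum(x * x for x in c) / n) ** 0.5
    return [x / s for x in c]

def entry_order(X, y, eps=0.01, iters=2000):
    """Return predictor indices in the order they enter the stagewise fit."""
    Xs = [standardize(col) for col in X]
    r = standardize(y)            # current residual
    beta = [0.0] * len(Xs)        # coefficients grow in small steps
    order = []
    for _ in range(iters):
        # pick the predictor most correlated with the current residual
        j = max(range(len(Xs)), key=lambda k: abs(dot(Xs[k], r)))
        if j not in order:
            order.append(j)
        step = eps * (1 if dot(Xs[j], r) > 0 else -1)
        beta[j] += step
        r = [ri - step * xij for ri, xij in zip(r, Xs[j])]
    return order

# Toy data: y depends strongly on x0, weakly on x1, and not on x2.
x0 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
x1 = [2.0, 1.0, 4.0, 3.0, 6.0, 5.0]
x2 = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]
y  = [3.1, 5.8, 9.2, 12.1, 15.0, 17.9]   # roughly 3 * x0 plus noise

print(entry_order([x0, x1, x2], y))  # index 0 (x0) enters first
```

The full LARS algorithm refines this by moving along the equiangular direction among all active predictors rather than nudging one coefficient at a time, which is what makes its path piecewise linear and (with a small modification) identical to the LASSO path.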

51 questions
69 votes, 5 answers

What problem do shrinkage methods solve?

The holiday season has given me the opportunity to curl up next to the fire with The Elements of Statistical Learning. Coming from a (frequentist) econometrics perspective, I'm having trouble grasping the uses of shrinkage methods like ridge…
Charlie
44 votes, 5 answers

Using LASSO from lars (or glmnet) package in R for variable selection

Sorry if this question comes across a little basic. I am looking to use LASSO variable selection for a multiple linear regression model in R. I have 15 predictors, one of which is categorical (will that cause a problem?). After setting my $x$ and $y$…
James
27 votes, 2 answers

Advantages of doing "double lasso" or performing lasso twice?

I once heard a method of using the lasso twice (like a double-lasso) where you perform lasso on the original set of variables, say S1, obtain a sparse set called S2, and then perform lasso again on set S2 to obtain set S3. Is there a methodological…
Bstat
15 votes, 2 answers

LASSO/LARS vs general to specific (GETS) method

I have been wondering, why are LASSO and LARS model selection methods so popular even though they are basically just variations of step-wise forward selection (and thus suffer from path dependency)? Similarly, why are General to Specific (GETS)…
13 votes, 2 answers

Exact definition of deviance measure in glmnet package, with cross-validation?

For my current research I'm using the Lasso method via the glmnet package in R on a binomial dependent variable. In glmnet the optimal lambda is found via cross-validation and the resulting models can be compared with various measures, e.g.…
Jo Wmann
12 votes, 1 answer

R - Lasso Regression - different Lambda per regressor

I want to do the following: 1) OLS regression (no penalization term) to get beta coefficients $b_{j}^{*}$; $j$ stands for the variables used to regress. I do this by lm.model = lm(y ~ 0 + x); betas = coefficients(lm.model). 2) Lasso regression…
Dom
10 votes, 1 answer

LASSO regularisation parameter from LARS algorithm

In their seminal paper 'Least Angle Regression', Efron et al. describe a simple modification of the LARS algorithm which allows one to compute full LASSO regularisation paths. I have implemented this variant successfully and usually plot the output path…
Quantuple
8 votes, 1 answer

LARS - LASSO with weights

I am interested in solving the following problem $$ \min_{\boldsymbol{\beta}} \left( \mathbf{y}-\mathbf{X}\boldsymbol{\beta} \right)^T W \left( \mathbf{y}-\mathbf{X}\boldsymbol{\beta} \right) + \lambda \left|\boldsymbol{\beta}\right|_1…
Meenakshi
5 votes, 1 answer

What is the rationale behind LARS-OLS hybrid, i.e. using OLS estimate on the variables chosen by LARS?

I need some help to understand the relationship between the ranking of the variables from the LARS algorithm and the use of OLS to estimate the final model chosen by the LARS. I understand that the LARS algorithm is less greedy than forward…
Guest
5 votes, 1 answer

Why is there no intercept in the lars output for LASSO in Stata?

This is my first time using lars, so this question is probably obvious. When I run lars on my data I get an output with a model and coefficients assigned to predictors, but there is no intercept. I thought part of how LASSO worked was shrinking…
EvKohl
5 votes, 0 answers

What is least angle regression?

Conceptually, I don't understand what Least Angle Regression (LARS) is and why it solves LASSO (pdf). We know that LASSO is: $$\arg \min_x {\left\| A x - y \right\|}_{2}^{2} + \lambda {\left\| x \right\|}_{1}$$ From my…
user3591466
4 votes, 1 answer

Classification with Least Angle (LARS)-type algorithm?

I am currently working on the LARS (Least Angle Regression) method. I know it is a regression method, but I wonder if, like LASSO or Ridge techniques (e.g. the package glmnet in R), it can be modified in order to achieve classification rather than…
Anoikis
4 votes, 1 answer

Why, under the joint least squares direction, is it possible for some coefficients to decrease in LARS regression?

I think I understand how LARS regression works. It basically adds features to the model when they are more correlated with the residuals than the current model. And then, after adding the features to the model, it will increase the coefficients in…
makansij
4 votes, 1 answer

Computational complexity of the lasso (lars vs coordinate descent)

The lasso can be computed with the LARS or coordinate descent algorithm. What are their computational complexities, and when is one quicker than the other?
Donbeo
4 votes, 1 answer

Feature selection with k-fold cross-validated least angle regression

I am using the least angle regression (LARS) to extract the most important predictors ($x_1, x_2,...,x_p$) for my response variable ($y$). I have seven predictors ($x_1,x_2,...,x_7$) for each response variable. I did 10-fold cross validation by…
Biostat