Questions tagged [selectiveinference]
For questions related to the CRAN R package selectiveInference and high-dimensional inference for LASSO-regularized regression models.
9 questions
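Several of the questions below concern p-values for lasso-selected coefficients. A minimal sketch of how the selectiveInference package is typically used for this (simulated data; the lambda value is illustrative, not a recommendation):

```r
# Post-selection inference for the lasso at a fixed lambda.
library(glmnet)
library(selectiveInference)

set.seed(1)
n <- 100; p <- 10
x <- matrix(rnorm(n * p), n, p)
y <- 2 * x[, 1] + rnorm(n)

# glmnet scales its penalty by n, so pass lambda/n to coef()
# but the unscaled lambda to fixedLassoInf().
lambda <- 2
gfit <- glmnet(x, y, standardize = FALSE)
beta <- coef(gfit, x = x, y = y, s = lambda / n, exact = TRUE)[-1]

# Selective p-values and confidence intervals for the active set,
# conditioning on the selection event
out <- fixedLassoInf(x, y, beta, lambda)
out
```

`fixedLassoInf()` also accepts `family = "binomial"` for the lasso-penalized logistic case raised in several questions below; it does not currently cover quasi-likelihood families.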
22 votes, 3 answers
Using regularization when doing statistical inference
I know about the benefits of regularization when building predictive models (bias vs. variance, preventing overfitting). But I'm wondering if it is a good idea to also do regularization (lasso, ridge, elastic net) when the main purpose of the…
asked by user162381
16 votes, 2 answers
Testing coefficients for significance in Lasso logistic regression
[A similar question was asked here with no answers]
I have fit a logistic regression model with L1 regularization (Lasso logistic regression) and I would like to test the fitted coefficients for significance and get their p-values. I know Wald's…
asked by Pablo
4 votes, 2 answers
Is post-selection inference a problem when robust tests are used?
It's pretty well acknowledged that error control via p-values fails when models are selected based on the data rather than decided a priori. I've always viewed this as an issue of marginal vs. conditional distributions, such that:
$$P(error) =…
asked by Josh Pritsker
4 votes, 0 answers
How much of a problem is inference after model selection when few models are manually compared?
tl;dr: I found a better model than the one I first thought of while inspecting the data and performed a few steps of variable selection/model fine-tuning. I assume that this is a (mild) case of inference after model selection.
I performed an…
asked by jkd
3 votes, 0 answers
Inference for a quasibinomial GLM with LASSO penalty using the selectiveInference package
I would like to carry out inference on a binomial LASSO model, but take into account the fact that my data are overdispersed and use the quasibinomial family instead.
The R package selectiveInference, which does inference for LASSO models, only seems…
asked by Tom Wenseleers
2 votes, 1 answer
Is Individual Coefficient Significance with Ridge or Lasso Possible When the Number of Variables Exceeds the Number of Observations?
First, to introduce you to my situation: I have a dataset containing n = 16 observations and p = 17 variables. My variable set contains 16 independent variables (14 variables I'm interested in and two serving as control variables) and one outcome…
asked by Frank_Crunch
2 votes, 0 answers
Hypothesis Testing on Coefficients after LASSO Variable Selection (in R)
Is it possible to run a hypothesis test for a significant difference in the estimated coefficients of the same independent variable in two different subsets of one population, after you have performed a LASSO selection process on each subset…
asked by Green90
1 vote, 0 answers
Effect size after using elastic net and selective inference
I've employed elastic net to fit a logistic model with predictors that displayed high degrees of correlation among themselves. I wanted to be able to see which predictors significantly influenced the model's predictions, so I employed the…
asked by GCO
1 vote, 1 answer
Bias in p-values of MM-type estimators or Cochran's Q penalized regression
There are a number of linear regression methods designed to limit the influence of outliers on estimates:
For example, Cochran's Q penalized regression as described in [1] will do an initial linear regression, then downweight data points which…
asked by par