
I'm training my model with 20 features and getting an accuracy of 0.95. However, if I filter features by p-value at a significance level of 0.05, I'm left with only 7 features, which brings my accuracy down to 0.90.

Why is accuracy going down? Isn't a high p-value supposed to mean the feature is not very important? Also, I'm confused about whether I should check p-values or the Variance Inflation Factor to choose my best features.
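A minimal sketch of the workflow described above, on synthetic data rather than the actual dataset (the column names, model, and 0.05 threshold are assumptions for illustration). It fits a logistic regression with statsmodels to get p-values, compares held-out accuracy with all features versus only the low-p features, and also computes VIFs, which measure collinearity among predictors rather than predictive usefulness:

```python
import pandas as pd
import statsmodels.api as sm
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Synthetic stand-in for the 20-feature dataset (assumption, not the real data).
X, y = make_classification(n_samples=2000, n_features=20, n_informative=7,
                           random_state=0)
X = pd.DataFrame(X, columns=[f"x{i}" for i in range(20)])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# p-values from a logistic regression fit (statsmodels needs an explicit intercept).
logit = sm.Logit(y_tr, sm.add_constant(X_tr)).fit(disp=0)
pvals = logit.pvalues.drop("const")
keep = pvals[pvals < 0.05].index          # features passing the 0.05 threshold

def accuracy(cols):
    clf = LogisticRegression(max_iter=1000).fit(X_tr[cols], y_tr)
    return accuracy_score(y_te, clf.predict(X_te[cols]))

print("all 20 features:", accuracy(X.columns))
print(f"{len(keep)} low-p features:", accuracy(keep))

# VIF quantifies how strongly each predictor is explained by the others.
vif = pd.Series([variance_inflation_factor(X_tr.values, i)
                 for i in range(X_tr.shape[1])], index=X_tr.columns)
print(vif.sort_values(ascending=False).head())
```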

  • Related: [Why is accuracy not the best measure for assessing classification models?](https://stats.stackexchange.com/q/312780/1352) – Stephan Kolassa Jan 21 '18 at 18:18
  • Additionally, No. P-values do not at all (or, if you're feeling generous, only weakly) indicate whether a predictor is useful in a predictive model. They simply measure a different thing (*if* the true parameter is zero, what is the probability we would observe a parameter estimate this extreme). If variable selection is needed, there is technology available to do so, but p-values are not one of those technologies. – Matthew Drury Jan 21 '18 at 19:53
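The last comment notes that dedicated variable-selection techniques exist without naming one. As one possible example (an assumption on my part, not necessarily what the commenter had in mind), here is a sketch using L1-penalized logistic regression with cross-validation, which drives the coefficients of unhelpful predictors to exactly zero:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegressionCV

# Synthetic stand-in data, as in the sketch above.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=7,
                           random_state=0)

# Cs sets the grid of inverse regularization strengths searched by 5-fold CV.
lasso_logit = LogisticRegressionCV(penalty="l1", solver="liblinear",
                                   Cs=10, cv=5).fit(X, y)

# Features whose coefficients survived the L1 penalty at the selected strength.
selected = np.flatnonzero(lasso_logit.coef_.ravel() != 0)
print("features kept by the L1 path:", selected)
```

In practice the features would usually be standardized first so the penalty treats them comparably.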

0 Answers