
What are some reasons that regression parameter $SE$s can decrease when predictors are removed from the model?

Generally, adding predictors reduces the model's error variance (MSE), which should shrink $SE(\hat{\beta})$.

However, sometimes the $SE$s get bigger. What could cause this (in a linear or logistic model)?

Michael Webb
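
One common cause is collinearity among the predictors. Here is a minimal simulation sketch using numpy and statsmodels (the data-generating numbers are made up for illustration): adding a predictor `x2` that is nearly collinear with `x1` always lowers the SSE, yet $SE(\hat{\beta}_1)$ grows substantially.

```python
import numpy as np
import statsmodels.api as sm

# Simulated data: y depends only on x1; x2 is nearly a copy of x1.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)  # highly collinear with x1
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

# Fit with x1 alone, then with both collinear predictors.
m1 = sm.OLS(y, sm.add_constant(x1)).fit()
m2 = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()

print("SE(beta_1), x1 only:  ", m1.bse[1])  # small
print("SE(beta_1), x1 and x2:", m2.bse[1])  # much larger
print("SSE, x1 only:         ", m1.ssr)     # SSE never increases...
print("SSE, x1 and x2:       ", m2.ssr)     # ...when a predictor is added
```
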
  • Adding predictors surely reduces SSE, but I'm not sure it reduces MSE. So sometimes excluding predictors from the model reduces MSE, which should shrink $SE(\hat\beta)$. – user158565 Oct 30 '18 at 15:10
  • Because the question about SEs is (almost) equivalent to the question of the significance of the parameters, you are asking why the significance of a parameter can decrease when variables are added. From this perspective, https://stats.stackexchange.com/questions/27257/ looks like a duplicate. – whuber Oct 30 '18 at 15:42
  • Handwavingly, I'd expect this: I'd think of adding predictor variables as effectively reducing the size of your training data. In general, to make equally valid inferences when you add more features, you need more data. As you add more features and keep the size of the data constant, you can be less sure that the relationships you're seeing are real and not spurious. – gazza89 Oct 30 '18 at 17:39
  • Searching on [***multicollinearity***](https://stats.stackexchange.com/search?tab=votes&q=multicollinearity) will locate many relevant posts. – Glen_b Oct 31 '18 at 01:38
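
Tying the comments together: the standard OLS variance decomposition makes both effects explicit,

$$\widehat{\operatorname{Var}}(\hat{\beta}_j) \;=\; \frac{\hat{\sigma}^2}{(n-1)\,\widehat{\operatorname{Var}}(x_j)} \cdot \frac{1}{1-R_j^2}, \qquad \hat{\sigma}^2 = \frac{SSE}{n-p},$$

where $R_j^2$ comes from regressing $x_j$ on the remaining predictors, and $1/(1-R_j^2)$ is the variance inflation factor. Adding a predictor always lowers SSE but also lowers the degrees of freedom $n-p$, so the MSE $\hat{\sigma}^2$ can move either way; and if the new predictor is correlated with $x_j$, then $R_j^2$ rises and the SE can grow, which is the multicollinearity effect the last comment points to.
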

0 Answers