
Can I conclude I have outliers when no coefficients are significant but the F-test is significant in a linear regression?

 Call:
 lm(formula = co[, 12] ~ co[, 2] + co[, 3] + co[, 5] + 
     co[, 7] + co[, 8] + co[, 9] + co[, 11] + co[, 2] * 
     co[, 11])

 Residuals:
     Min      1Q  Median      3Q     Max 
 -5.0542 -1.6286 -0.2886  1.1572 24.6327 

 Coefficients:
                     Estimate Std. Error t value Pr(>|t|)
 (Intercept)       3.7720074  9.0737734   0.416    0.679
 co[, 2]          -0.3272521  0.3601090  -0.909    0.366
 co[, 3]          -0.1776742  0.8922504  -0.199    0.843
 co[, 5]          -0.0507871  0.2318475  -0.219    0.827
 co[, 7]          -0.0415144  0.0633719  -0.655    0.514
 co[, 8]          -0.0001058  0.0005771  -0.183    0.855
 co[, 9]          -1.0555060  7.6677272  -0.138    0.891
 co[, 11]         -0.4332281  1.4533123  -0.298    0.766
 co[, 2]:co[, 11]  0.1683719  0.1859764   0.905    0.368

 Residual standard error: 3.74 on 92 degrees of freedom
 Multiple R-squared:  0.1731,    Adjusted R-squared:  0.1011 
 F-statistic: 2.407 on 8 and 92 DF,  p-value: 0.02095
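
As an aside on the title question: the residual summary above already hints at a possible outlier (a maximum residual of 24.63 against a residual standard error of 3.74), so it is worth checking influence directly rather than inferring it from the F-test/t-test pattern. A minimal sketch, assuming the model above is refit into an object (`fit` is a hypothetical name):

    # Refit the model from the question and inspect influential points
    fit <- lm(co[, 12] ~ co[, 2] + co[, 3] + co[, 5] + co[, 7] +
              co[, 8] + co[, 9] + co[, 11] + co[, 2] * co[, 11])
    plot(fit, which = 4)                         # Cook's distance plot
    which(cooks.distance(fit) > 4 / nobs(fit))   # common rule-of-thumb cutoff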

The VIF test to check for multicollinearity is shown below:

           co[, 2]          co[, 3]          co[, 5]
        131.188424         1.264675         1.684113
           co[, 7]          co[, 8]          co[, 9]
          1.638692         1.356281         1.145669
          co[, 11] co[, 2]:co[, 11]
         49.944243       180.188574
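
For reference, a named VIF vector in this layout is what the car package produces; a minimal sketch, assuming the same fitted model object `fit` as above:

    library(car)   # assumed; vif() is not in base R
    vif(fit)       # variance inflation factor for each model term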
  • I think this could happen if you have a highly significant constant term but no relationship with any of the predictor variables. – Michael R. Chernick May 05 '17 at 23:58
  • Actually, not even the constant term is significant. – Eric May 05 '17 at 23:59
  • Can you show me an example with a significant F value and no significant relationship between the dependent variable and the regressors? – Michael R. Chernick May 06 '17 at 00:03
  • Please check above now. – Eric May 06 '17 at 00:06
  • I would suspect (multi)collinearity. When the predictor variables have linear relationships with one another, the individual standard errors are inflated, but the overall model fit can still be strong. – Matt Tyers May 06 '17 at 00:17
  • The answer is "no". It's quite easy to get significant F statistics (include at least one predictor that's highly related to the response) while making all the individual coefficients not significant (e.g. by including predictors highly correlated with the ones related to y). This is discussed in many posts on site. – Glen_b May 06 '17 at 00:18
  • https://stats.stackexchange.com/questions/3549/why-is-it-possible-to-get-significant-f-statistic-p-001-but-non-significant-r – SmallChess May 06 '17 at 00:22
  • I hope that is true, but when I run the VIF test to check for multicollinearity, the values all look safe (around 1 or 2) except for the interaction variable and its two components. I added the VIF test result above. Please check. – Eric May 06 '17 at 00:24
  • According to the post you directed me to, it seems like the problem is the correlation coming from the interaction variable and its components. Am I right? – Eric May 06 '17 at 17:46
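
A minimal sketch of the point raised by Matt Tyers and Glen_b, using simulated data (all variable names here are illustrative, not from the question): two nearly collinear predictors can each be individually non-significant while the overall F-test is highly significant.

    set.seed(1)
    n  <- 100
    x1 <- rnorm(n)
    x2 <- x1 + rnorm(n, sd = 0.05)   # x2 is almost a copy of x1
    y  <- x1 + rnorm(n)              # y depends on the shared signal

    summary(lm(y ~ x1 + x2))
    # Typical result: neither t-test is significant (standard errors are
    # inflated by the collinearity), yet the F-test strongly rejects,
    # because x1 and x2 jointly explain y.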
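And a sketch of the interaction point in the last comment: a raw product term is usually highly correlated with its components, which is exactly where the large VIFs appear above. Mean-centering co[, 2] and co[, 11] before forming the interaction typically shrinks those VIFs without changing the interaction's t-statistic (the centered variable names below are illustrative, and the car package is assumed for vif()):

    library(car)
    x2c  <- scale(co[, 2],  scale = FALSE)    # mean-center co[, 2]
    x11c <- scale(co[, 11], scale = FALSE)    # mean-center co[, 11]

    fit_c <- lm(co[, 12] ~ x2c + co[, 3] + co[, 5] + co[, 7] +
                co[, 8] + co[, 9] + x11c + x2c:x11c)
    vif(fit_c)   # VIFs for x2c, x11c and x2c:x11c should drop sharply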

0 Answers