I've run a hierarchical multiple linear regression on a particular outcome variable, with one covariate in block 1 and five related but independent predictors in block 2. The ANOVAs for both models are significant (P = 0.01 and P = 0.026), and the covariate is significant in the first model (P = 0.01). However, in the second model, even though there is an R-squared change of 21.9%, the covariate becomes non-significant (P = 0.197) and none of the five predictors are significant either (all P ≥ 0.07).
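For concreteness, here's a rough sketch of the kind of two-block comparison I mean, written in Python/statsmodels rather than the software I actually used; the file name and the column names (outcome, covariate, p1-p5) are just placeholders, not my real variables:

```python
# Rough sketch of the two-block (hierarchical) regression.
# "mydata.csv" and the column names are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mydata.csv")

# Block 1: covariate only
m1 = smf.ols("outcome ~ covariate", data=df).fit()

# Block 2: covariate plus the five predictors
m2 = smf.ols("outcome ~ covariate + p1 + p2 + p3 + p4 + p5", data=df).fit()

# R-squared change when block 2 is added, and the F-test for that change
print("R2 change:", m2.rsquared - m1.rsquared)
print("F-test (F, p, df_diff):", m2.compare_f_test(m1))

print(m1.summary())
print(m2.summary())
```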
I've been searching the web to figure out why this might have happened, and multicollinearity seems to be the most commonly cited reason. So I ran collinearity diagnostics, and there's no evidence of multicollinearity - the tolerances for the predictors are all comfortably above 0.1, the VIFs are well below 10, and the pairwise correlations between predictors are all below 0.504.
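This is roughly how I checked the VIFs and tolerances (again just a Python/statsmodels sketch with the same placeholder names, not my actual analysis script):

```python
# Rough sketch of the collinearity check (placeholder file and column names).
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("mydata.csv")
predictors = ["covariate", "p1", "p2", "p3", "p4", "p5"]
X = sm.add_constant(df[predictors])  # include the intercept when computing VIFs

for i, name in enumerate(predictors, start=1):  # column 0 is the constant
    vif = variance_inflation_factor(X.values, i)
    print(f"{name}: VIF = {vif:.2f}, tolerance = {1 / vif:.2f}")
```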
So can anyone explain what my problem is here, and how best to continue my analysis and report the data? If possible, could it be explained in layman's terms? I'm just a lowly psychology student and I get confused by statistical jargon. I actually aced all the regression questions on my statistics exams and had never found it difficult until now - apparently using real-world data makes things much trickier! Thanks in advance!