
I have a linear regression model of the form Y ~ X1 + X2 + X3 + X4. The intercept, X2, and X4 are statistically significant. The multiple R-squared is 0.04696 and the adjusted R-squared is 0.0300.

I am trying to find the proportion of the variance in Y explained by the part common to X1, X2, X3, and X4. I figured I could subtract the unique variance (i.e., the sum of all squared semipartial correlations) from the R-squared and be left with the shared variance.

Here are my correlations:

|    | zero-order | partial | semipartial |
|----|-----------:|--------:|------------:|
| X1 |      0.062 |   0.074 |       0.073 |
| X2 |     -0.156 |  -0.173 |      -0.171 |
| X3 |     -0.077 |  -0.083 |      -0.081 |
| X4 |     -0.024 |  -0.142 |      -0.140 |

So I calculate the shared variance as 0.04696 - [(0.073)^2 + (-0.171)^2 + (-0.081)^2 + (-0.140)^2] = 0.04696 - 0.060731 = -0.013771. I was under the impression that the sum of squared semipartial correlations could not exceed the regression's R-squared, yet here it does.
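For reference, the arithmetic above can be reproduced with a short snippet (the R-squared and semipartial values are taken directly from the table; nothing else is assumed):

```python
# Decomposition attempted in the question, using the reported values.
r_squared = 0.04696
semipartials = {"X1": 0.073, "X2": -0.171, "X3": -0.081, "X4": -0.140}

# "Unique" variance: sum of squared semipartial correlations.
unique = sum(sr ** 2 for sr in semipartials.values())

# "Shared" variance as the leftover part of R-squared.
shared = r_squared - unique

print(round(unique, 6), round(shared, 6))  # → 0.060731 -0.013771
```

The negative result is exactly the puzzle: the squared semipartials already sum to more than the model's R-squared.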

(1) Should I subtract only the squared semipartial correlations of the significant predictors (i.e., X2 and X4)? (2) Is this method the correct way to calculate the proportion of variance in the outcome explained by the part common to all the predictors?

  • Please see Venn diagram [here](https://stats.stackexchange.com/a/73876/3277), it answers your title question. – ttnphns Sep 17 '18 at 10:49
  • And the example there. You have a suppressional situation, in particular, X4 is the suppressor because its semipartial corr is considerably greater than the zero-order r. – ttnphns Sep 17 '18 at 11:47
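The suppression situation the comment describes can be illustrated with a hypothetical two-predictor example (the correlations below are made up for illustration, not taken from the question's data): X2 has zero correlation with Y but is correlated with X1, the textbook suppressor pattern.

```python
# Hypothetical correlations: X2 is uncorrelated with Y but correlated with X1.
r_y1, r_y2, r_12 = 0.5, 0.0, 0.5

# Two-predictor R^2 from the correlation matrix (standard closed form).
r2_full = (r_y1**2 + r_y2**2 - 2 * r_y1 * r_y2 * r_12) / (1 - r_12**2)

# Squared semipartial of each predictor: the drop in R^2 when it is removed.
sr2_x1 = r2_full - r_y2**2   # removing X1 leaves X2 alone
sr2_x2 = r2_full - r_y1**2   # removing X2 leaves X1 alone

# X2's semipartial (~0.289) exceeds its zero-order r (0.0): a suppressor,
# and the sum of squared semipartials (~0.417) exceeds R^2 (~0.333).
print(r2_full, sr2_x1 + sr2_x2, sr2_x2 ** 0.5)
```

With a suppressor present, the squared semipartials can sum to more than R-squared, so subtracting them leaves a negative "shared" component, just as in the question.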
