When working with many input variables, we are often concerned about multicollinearity. There are a number of measures used to detect, think about, and/or communicate multicollinearity. Some common recommendations are:
- The multiple $R^2_j$ for a particular variable, i.e., the $R^2$ from regressing the $j$-th predictor on all of the other predictors
- The tolerance, $1-R^2_j$, for a particular variable
- The variance inflation factor, $\text{VIF}_j=\frac{1}{\text{tolerance}}$, for a particular variable
- The condition number of the design matrix as a whole:
  $$\sqrt{\frac{\lambda_{\max}(X'X)}{\lambda_{\min}(X'X)}},$$
  where $\lambda_{\max}$ and $\lambda_{\min}$ are the largest and smallest eigenvalues of $X'X$
(There are some other options discussed in the Wikipedia article, and here on SO in the context of R.)
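To make the definitions above concrete, here is a minimal sketch in Python/NumPy. The function name `collinearity_measures` and the choice to include an intercept in each auxiliary regression are my own; note also that conventions differ on whether $X$ should be centered and/or scaled before computing the condition number.

```python
import numpy as np

def collinearity_measures(X):
    """R^2_j, tolerance, and VIF for each column of X, plus the
    condition number sqrt(max eigenvalue / min eigenvalue) of X'X."""
    n, p = X.shape
    r2 = np.empty(p)
    for j in range(p):
        y = X[:, j]                           # regress column j ...
        Z = np.delete(X, j, axis=1)           # ... on the other columns
        Z = np.column_stack([np.ones(n), Z])  # with an intercept (my choice)
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2[j] = 1 - (resid @ resid) / np.sum((y - y.mean()) ** 2)
    tolerance = 1 - r2
    vif = 1 / tolerance
    eig = np.linalg.eigvalsh(X.T @ X)         # eigenvalues of X'X
    condition_number = np.sqrt(eig.max() / eig.min())
    return r2, tolerance, vif, condition_number
```

For example, with two nearly collinear predictors:

```python
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.1 * rng.normal(size=200)   # nearly a copy of x1
x3 = rng.normal(size=200)
r2, tol, vif, cond = collinearity_measures(np.column_stack([x1, x2, x3]))
# r2[0] and r2[1] are close to 1, so their VIFs are large,
# and the condition number is large as well.
```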
The fact that the first three are a perfect function of each other suggests that the only possible net advantage among them would be psychological. On the other hand, the first three allow you to examine variables individually, which might be an advantage, but I have heard that the condition number method is considered best.
- Is this true? Best for what?
- Is the condition number a perfect function of the $R^2_j$'s? (I would think it would be.)
- Do people find that one of them is easiest to explain? (I've never tried to explain these numbers outside of class; I just give a loose, qualitative description of multicollinearity.)