In his book "Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences", J. Cohen says that redundancy among the independent variables of a multiple regression model increases the standard errors of those variables' coefficients. He shows this with an example, but I can't find a good explanation. Why is this true? I'd also be happy with a mathematical explanation.
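A minimal sketch of the standard variance formula for an OLS slope, which makes the claim explicit under the usual assumptions (the symbols $\sigma^2$, $\mathrm{SST}_j$ and $R_j^2$ below are notation for this sketch, not Cohen's):

```latex
% Variance of the j-th OLS slope estimate (standard textbook result):
%   \sigma^2  : error variance
%   SST_j     : \sum_i (x_{ij} - \bar{x}_j)^2, total variation in x_j
%   R_j^2     : R^2 from regressing x_j on the other predictors
\[
\operatorname{Var}\!\left(\hat{\beta}_j\right)
  = \frac{\sigma^2}{\mathrm{SST}_j \,\bigl(1 - R_j^2\bigr)}
\]
% As x_j becomes redundant with the other predictors, R_j^2 \to 1, so the
% factor 1/(1 - R_j^2) (the variance inflation factor) grows without bound,
% and with it the standard error \sqrt{\operatorname{Var}(\hat{\beta}_j)}.
```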
- Suppose you did an OLS regression against two explanatory variables which were essentially identical except for tiny rounding issues. You could increase the coefficient for one and decrease the coefficient for the other without any noticeable change in the fit for the dependent variable. So you could not rely on the coefficients you actually see to any extent, though their sum might be meaningful (a simulation sketch of this appears after the comments). – Henry Apr 20 '20 at 20:07
- See [here](https://stats.stackexchange.com/a/149442/7071) and [here](https://stats.stackexchange.com/a/86277/7071). – dimitriy Apr 20 '20 at 20:22
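Below is a hypothetical simulation of the situation described in Henry's comment, assuming `numpy` and `statsmodels` are available; the variable names, sample size, and noise scale are invented for illustration.

```python
# Hypothetical simulation: two nearly identical predictors give individually
# unstable coefficients with large standard errors, while their shared
# component (roughly their sum effect) is estimated precisely.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200

x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)   # x2 is x1 plus tiny "rounding" noise
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# Regression with both redundant predictors: note the inflated standard errors
# on the two slope coefficients.
X_both = sm.add_constant(np.column_stack([x1, x2]))
fit_both = sm.OLS(y, X_both).fit()
print(fit_both.summary())

# Regression on the (nearly) common component alone: the single slope is
# close to 2 + 3 = 5 and has a small standard error.
X_common = sm.add_constant(x1)
fit_common = sm.OLS(y, X_common).fit()
print(fit_common.summary())
```

Comparing the two summaries shows the point of the comment: the fit of the first model is essentially as good as the second, but the individual coefficients of the redundant predictors are poorly determined, which is exactly what the inflated standard errors report.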