Let's say I fit a logistic regression with 2 predictors (x1, x2) and a binary outcome (y). When I fit two separate regressions, y~x1 and y~x2, the coefficients for the predictors are significantly different from 0, meaning that the CIs for their odds ratios don't contain 1.
However, when I fit a logistic regression with both predictors together, y~x1+x2, the coefficients are no longer significantly different from 0. Why does this happen? How can it be explained?
Here is the particular example: I have variables x1 and x2, and I use x1 and the ratio x2/x1 as predictors (in the code below, data2[,1] is y, data2[,2] is x1, and data2[,3] is x2).
summary(glm(data2[,1] ~ data2[,2], family=binomial(link="logit"),
    na.action=na.pass))
gives me:
(Intercept) -1.05526 0.29529 -3.574 0.000352 ***
data2[, 2] 0.24653 0.09686 2.545 0.010917 *
summary(glm(data2[,1] ~ I(data2[,3]/data2[,2]), family=binomial(link="logit"),
    na.action=na.pass))
gives me:
(Intercept) 0.43475 0.39187 1.109 0.26725
I(data2[, 3]/data2[, 2]) -0.06889 0.02554 -2.698 0.00699 **
while regression with both variables,
summary(glm(data2[,1] ~ data2[,2] + I(data2[,3]/data2[,2]),
    family=binomial(link="logit"), na.action=na.pass))
gives me:
(Intercept) -0.2278 0.5740 -0.397 0.692
data2[, 2] 0.1580 0.1065 1.484 0.138
I(data2[, 3]/data2[, 2]) -0.0456 0.0284 -1.606 0.108
The correlation between the two predictors is -0.504 and is significantly different from 0.
Does a correlation that is significantly different from 0 lead to this situation? If so, how can this be justified?
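For what it's worth, the pattern seems reproducible in a small simulation. Below is a minimal pure-NumPy sketch (not my data; the variable names, sample size, and true coefficients are all made up) that generates two predictors with correlation about -0.5 and fits logistic regressions by Newton-Raphson. The standard error of x1's coefficient comes out larger in the joint fit than in the single-predictor fit:

```python
import numpy as np

def fit_logit(X, y, iters=25):
    """Logistic regression via Newton-Raphson; returns (coefficients, standard errors)."""
    X = np.column_stack([np.ones(len(y)), X])          # prepend intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))            # fitted probabilities
        H = (X.T * (p * (1 - p))) @ X                  # Fisher information X' W X
        beta += np.linalg.solve(H, X.T @ (y - p))      # Newton step
    se = np.sqrt(np.diag(np.linalg.inv(H)))            # asymptotic standard errors
    return beta, se

rng = np.random.default_rng(42)
n = 2000
x1 = rng.normal(size=n)
# x2 is built so that corr(x1, x2) is roughly -0.5
x2 = -0.5 * x1 + np.sqrt(1 - 0.5**2) * rng.normal(size=n)
eta = -1.0 + 0.4 * x1 - 0.4 * x2                       # hypothetical true model
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))

_, se_single = fit_logit(x1[:, None], y)               # y ~ x1 alone
_, se_joint = fit_logit(np.column_stack([x1, x2]), y)  # y ~ x1 + x2

print(se_single[1], se_joint[1])                       # SE for x1 inflates in the joint fit
```

The intuition, as far as I understand it, is variance inflation: when the predictors are correlated, the information about each coefficient overlaps, so the joint model's standard errors grow (roughly by a factor of 1/sqrt(1 - r^2)), and individually significant predictors can become jointly non-significant.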