
Will the signs of the coefficients (Estimate) from lm and glm always be the same? ^

According to the toy example below, it seems the answer is yes. Can you provide a case where they might differ? (If it matters: in my real data the outcome is binary, hence the use of mpg > 20.)

# dummy data
d <- mtcars

# fit lm, glm, glm_bi
fit_lm <- lm(mpg > 20 ~ cyl + disp, data = d)
fit_glm <- glm(mpg > 20 ~ cyl + disp, data = d)
fit_glm_bi <- glm(mpg > 20 ~ cyl + disp, family = binomial, data = d)

# Are the signs always the same?
# lm compared to glm
all.equal(sign(coef(fit_lm)),
          sign(coef(fit_glm)))
# output
# [1] TRUE

# lm compared to glm(family = binomial)
all.equal(sign(coef(fit_lm)),
          sign(coef(fit_glm_bi)))

# output
# [1] TRUE

^ This sounds very much like a dupe; I found this similar post: Sign of coefficients in linear regression vs. the sign of correlation. Let me know if it is.

zx8754
  • Once you realize that many GLMs effectively *weight* the data according to the inverse variance of the response, then you see that a more meaningful comparison would be between a GLM and a weighted OLS solution. That also shows how to construct counterexamples to your conjecture. – whuber Jun 08 '16 at 14:31
  • Do you ask only about GLM for Gaussian family, or *any* GLM model compared to LM? – Tim Jun 08 '16 at 14:34
  • @Tim sorry, I should have mentioned, for glm it is binomial logit link. I will update the post when I get a chance. – zx8754 Jun 08 '16 at 15:25
  • @whuber thank you, do you mind posting as answer with more detail in layman terms (if at all possible)? My stats knowledge is limited. – zx8754 Jun 08 '16 at 15:31

1 Answer


@whuber:

Once you realize that many GLMs effectively weight the data according to the inverse variance of the response, then you see that a more meaningful comparison would be between a GLM and a weighted OLS solution. That also shows how to construct counterexamples to your conjecture.
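
To make this concrete, below is a minimal R sketch (my illustration, not part of the original comment) of the two pieces of that argument. Part 1 checks that a converged binomial GLM coincides with a weighted least-squares fit on its IRLS working response; Part 2 shows that reweighting alone can flip the sign of a slope. The formula am ~ wt + hp and the four-point toy data frame are arbitrary illustrative choices.

# Part 1: a converged binomial GLM is a weighted least-squares fit.
# (am ~ wt + hp is used instead of the question's formula simply because
#  it converges cleanly and avoids possible separation warnings)
fit <- glm(am ~ wt + hp, family = binomial, data = mtcars)
w   <- fit$weights                                           # final IRLS weights, mu * (1 - mu)
z   <- fit$linear.predictors + (mtcars$am - fitted(fit)) / w # IRLS working response
wls <- lm(z ~ wt + hp, data = mtcars, weights = w)
all.equal(coef(fit), coef(wls), tolerance = 1e-6)
# should be TRUE, up to the IRLS convergence tolerance

# Part 2: weights alone can flip a coefficient's sign.
# Toy data chosen so the flip is easy to verify by hand: the heavily
# weighted last point drags the weighted slope below zero.
toy <- data.frame(x = c(0, 1, 2, 3),
                  y = c(0, 0, 1, 0),
                  w = c(1, 1, 1, 10))
coef(lm(y ~ x, data = toy))               # slope is +0.1
coef(lm(y ~ x, data = toy, weights = w))  # slope is negative

Because the binomial working weights mu * (1 - mu) downweight observations whose fitted probabilities sit near 0 or 1, a high-leverage point in that region can pull the ordinary lm (linear probability) coefficient in one direction while being largely ignored by glm(family = binomial), which is how sign disagreements between the two fits can arise.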

zx8754