
I have a weighted least squares model and I wanted to calculate $R^2$ manually, but my result doesn't match the R summary. Why is that?

> head(weights(m))
[1] 0.973411 0.408694 0.414370 0.426554 0.357159 0.407986
> yhat = predict(m)
> y = as.numeric(model.frame(m)[,1])
> SStot = sum((y - mean(y))^2)
> SSres = sum((y - yhat)^2)
> rsq = 1 - (SSres / SStot)
> rsq
[1] 0.065211
> summary(m)$r.squared
[1] 0.0433978

The same formulas give me the correct $R^2$ when using ordinary least squares.
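For a weighted fit, `summary.lm` does not use the unweighted formula above: it computes a weighted residual sum of squares and a weighted model sum of squares around the *weighted* mean of the fitted values, then reports $R^2 = \mathrm{MSS}/(\mathrm{MSS}+\mathrm{RSS})$. The sketch below reproduces that calculation by hand; the simulated data and weights are invented purely for illustration, not taken from the model in the question.

```r
# Sketch: reproduce summary.lm's weighted R^2 by hand.
# The data-generating process and weights here are made up for the example.
set.seed(1)
x <- rnorm(100)
y <- 2 * x + rnorm(100, sd = 1 + abs(x))   # heteroscedastic noise
w <- 1 / (1 + abs(x))^2                    # inverse-variance-style weights

m <- lm(y ~ x, weights = w)

f <- fitted(m)
r <- resid(m)
wm  <- sum(w * f) / sum(w)        # weighted mean of the fitted values
mss <- sum(w * (f - wm)^2)        # weighted model sum of squares
rss <- sum(w * r^2)               # weighted residual sum of squares
rsq <- mss / (mss + rss)

all.equal(rsq, summary(m)$r.squared)   # TRUE
```

With ordinary least squares (all weights equal to 1) this reduces to the familiar $1 - \mathrm{SS}_{res}/\mathrm{SS}_{tot}$, which is why the manual formula only agrees with `summary()` in the OLS case.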

Robert Kubrick
    Could you point out where you are manually computing a *weighted* $R^2$? I don't see any use of the weights at all in your example. – whuber Aug 19 '15 at 18:45
  • I don't. I thought the formula was the same. So if weights have to be used, I'm not clear why. $R^2$ quantifies the explained variance. Why would we care about how the observations were weighted? – Robert Kubrick Aug 19 '15 at 19:17
  • If you don't care about how the observations are weighted, then why are you specifying weights in your model? Ultimately, the correct formula to use depends on what the weights mean. What do you intend them to represent? – whuber Aug 19 '15 at 21:25
  • I care about the weights only to calculate the coefficients. Once my estimates have been calculated I want to relate different models (WLS or OLS) using the same formula. – Robert Kubrick Aug 20 '15 at 12:02
  • For reference, this [thread](http://stats.stackexchange.com/questions/83826/is-a-weighted-r2-in-robust-linear-model-meaningful-for-goodness-of-fit-analys) answers my question. – Robert Kubrick Oct 21 '15 at 20:49
  • Thank you for finding that, Robert. It clarifies your question, too. I am glad you found an answer here. – whuber Oct 21 '15 at 21:21

0 Answers