
I understand that coefficient correlations can arise in the presence of correlated covariates, essentially indicating that our inference for those parameters draws on information that cannot be disentangled, which in turn alters the shape of our confidence regions.

I'm also aware that, in the classic linear regression model, the intercept and slope have a correlation that is fixed given the data, as mentioned in this question.

Parameter correlations would therefore seem to be a byproduct of the information-sharing restrictions imposed by the combination of the data and the model we're using, with no relationship to any "real" feature of the phenomenon being studied. Is that correct? Is there any situation where inference on those correlations would be of primary interest?
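
To make the question concrete, here is a minimal numpy sketch (the design, sample size, seed, and $\sigma^2$ are made up for illustration). It computes the sampling covariance of the OLS estimator, $\operatorname{Cov}(\hat\beta)=\sigma^2(X^\top X)^{-1}$, for two strongly correlated covariates; note that this matrix depends only on the design $X$ and the error variance, not on the true coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical design: an intercept column plus two correlated covariates.
n = 200
z = rng.normal(size=n)
x1 = z + 0.3 * rng.normal(size=n)
x2 = z + 0.3 * rng.normal(size=n)          # x1 and x2 are strongly correlated
X = np.column_stack([np.ones(n), x1, x2])

sigma2 = 1.0                               # assumed error variance

# Sampling covariance of the OLS estimator: Cov(beta_hat) = sigma^2 (X'X)^{-1}.
# It is a function of the design X and sigma^2 only, not of the true beta.
cov_beta = sigma2 * np.linalg.inv(X.T @ X)
d = np.sqrt(np.diag(cov_beta))
corr_beta = cov_beta / np.outer(d, d)

print(corr_beta.round(3))   # strong negative correlation between the estimates
                            # of the two collinear coefficients
```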

  • This question may provide some insight for a special case: https://stats.stackexchange.com/questions/171125/correlation-between-ols-estimators-for-intercept-and-slope/171134#171134 – Christoph Hanck Sep 21 '21 at 08:43
  • 1
    This question appears predicated on a terminological confusion. Parameters are *not* correlated, nor can they be, because they are just numbers. The correlations to which you refer are among their *estimates.* Those estimates depend (fundamentally) on the choices of explanatory variables and on the number of observations. Thus, there is no form of "inference on those correlations" that has meaning for the parameters themselves. – whuber Sep 21 '21 at 13:13
  • @whuber that makes perfect sense and I would gladly accept it as an answer. – overdisperse Nov 08 '21 at 06:09
  • 1
    Thank you--but I believe the [answer in the thread you referenced](https://stats.stackexchange.com/a/127336/919) already addresses this point clearly and adequately: " $\beta_0$ and $\beta_1$ are not random variables and thus have no covariance. They are fixed (and unobserved) values." – whuber Nov 08 '21 at 14:14
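
To make whuber's point concrete: the correlation among the *estimates* is wholly determined by how the model is parameterized, not by the phenomenon. Below is a minimal sketch (covariate distribution and $\sigma^2$ are made up for illustration) showing that simply centering the covariate drives the intercept-slope estimator correlation from nearly $-1$ to essentially zero, without changing the fitted relationship at all:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical simple-regression design with a covariate whose mean is far from zero.
n = 100
x = rng.normal(loc=5.0, size=n)
sigma2 = 1.0

def estimator_corr(x):
    """Intercept-slope correlation implied by Cov(beta_hat) = sigma^2 (X'X)^{-1}."""
    X = np.column_stack([np.ones_like(x), x])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])

print(estimator_corr(x))             # close to -1: x-bar is large relative to sd(x)
print(estimator_corr(x - x.mean()))  # ~0 (up to floating point) after centering
```

A reparameterization changes these correlations arbitrarily, which is exactly why they describe the estimation procedure rather than any feature of the process being studied.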
