
This is a pretty basic question, but one I am having a hard time finding an answer to. How do you calculate the likelihood of a simple linear model? Like, say, $$y=\beta_0+\beta_1x+e$$ I am working on model selection via AIC and would like to have a better sense of how it works.

edit: So I think I figured it out. When calculating the likelihood of a linear regression, you are actually calculating the likelihood of the residuals. If the residuals are normally distributed, the least-squares and maximum-likelihood parameter estimates are mathematically equivalent, but with a different residual distribution this will no longer be the case. Is this correct?
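To make this concrete, if the errors are i.i.d. normal, the log-likelihood evaluated at the fitted coefficients should be $$\log L = -\frac{n}{2}\log(2\pi) - \frac{n}{2}\log(\hat\sigma^2) - \frac{1}{2\hat\sigma^2}\sum_{i=1}^{n}\left(y_i-\hat\beta_0-\hat\beta_1x_i\right)^2,$$ where $\hat\sigma^2=\frac{1}{n}\sum_i\hat e_i^2$ is the ML estimate of the error variance; plugging that in simplifies it to $-\frac{n}{2}\left(\log(2\pi)+\log(\hat\sigma^2)+1\right)$, and then $\text{AIC}=2k-2\log L$ with $k=3$ (two coefficients plus the variance). Here is a minimal Python sketch of that calculation, assuming normal errors and that `x` and `y` are NumPy arrays (the function name and the use of `np.polyfit` are just my own illustrative choices):

```python
import numpy as np

def loglik_and_aic(x, y):
    """Gaussian log-likelihood and AIC of y = b0 + b1*x + e (minimal sketch)."""
    n = len(y)
    # Least-squares fit; under normal errors this is also the ML fit
    b1, b0 = np.polyfit(x, y, deg=1)
    resid = y - (b0 + b1 * x)
    sigma2_hat = np.sum(resid**2) / n      # ML estimate of the error variance
    loglik = -0.5 * n * (np.log(2 * np.pi) + np.log(sigma2_hat) + 1)
    k = 3                                  # b0, b1 and sigma^2
    return loglik, 2 * k - 2 * loglik      # AIC = 2k - 2 log L
```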

Iceberg Slim
    This question has answers at http://stats.stackexchange.com/questions/12562 (giving the formula for Gaussian errors), http://stats.stackexchange.com/questions/68596 (Cauchy errors), and http://stats.stackexchange.com/questions/133799 (connection between least squares and log likelihood). It's plausible that you are looking for more guidance or details, but I can't tell. Please feel at liberty to edit your question to differentiate it from those others; that will help guide potential respondents. – whuber Mar 23 '15 at 19:50
  • Yes, I think that's correct. – Richard Hardy Mar 23 '15 at 22:59

0 Answers