
What is the loss function of the Gaussian maximum likelihood estimator in the classical linear regression model?

I see a question asking this, but it seems that a loss function is never mentioned when talking about MLE...

Here is the question: [screenshot of the question]

ask
    Are you talking about the mean or the variance or both simultaneously? Which parameters are known or unknown affects the likelihood function. What Momo might be thinking is that in simple linear regression the least squares estimates of the slope and intercept are also maximum likelihood, but that doesn't seem to be your question. – Michael R. Chernick Dec 18 '16 at 20:54
  • I don't think you can really say that a loss function determines a maximum likelihood estimate. It really means that you look at the likelihood function and find the highest peak. This can often be done using calculus. – Michael R. Chernick Dec 18 '16 at 20:58
  • I have added the question... – ask Dec 18 '16 at 21:34

1 Answer


In Maximum Likelihood Estimation the loss function is always the negative log-likelihood, $-\log \text{P}(y)$, whether the model is a regression or not.

It is a proper and local loss function. When used in the regression setting, the density $\text{P}(y)$ is replaced with the density conditional on $X$:

$-\log \text{P}(y | X)$.
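In other words, writing the loss over a sample of $n$ observations (notation assumed here, not taken from the original answer), the maximum likelihood estimate is the minimizer of the summed per-observation loss:

$$\hat{\theta} = \arg\min_{\theta} \sum_{i=1}^{n} \left[ -\log \text{P}(y_i \mid x_i, \theta) \right].$$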

When the model is Gaussian, such as in the basic linear regression setting, maximum likelihood coincides with Ordinary Least Squares (OLS), which minimizes squared-error loss.

This is due to the $e^{-\frac{(y-X\beta)^2}{2\sigma^2}}$ factor in the Gaussian density. Taking $-\log$ of the full density gives

$$-\log \text{P}(y \mid X) = \frac{1}{2}\log(2\pi\sigma^2) + \frac{(y-X\beta)^2}{2\sigma^2},$$

and because the first term does not depend on $\beta$, minimizing this loss over $\beta$ is the same as minimizing the squared error.
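A quick numerical check of this equivalence (a minimal sketch, not part of the original answer; the simulated data, sample size, and the use of `scipy.optimize.minimize` are assumptions for illustration):

```python
# Sketch: minimizing the Gaussian negative log-likelihood in beta
# recovers the OLS solution. All data here is simulated for illustration.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # intercept + 2 covariates
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=1.5, size=n)

def neg_log_likelihood(params):
    """Gaussian negative log-likelihood in (beta, log sigma)."""
    beta, log_sigma = params[:-1], params[-1]
    sigma2 = np.exp(2 * log_sigma)
    resid = y - X @ beta
    return 0.5 * n * np.log(2 * np.pi * sigma2) + 0.5 * np.sum(resid**2) / sigma2

# Maximum likelihood: minimize the negative log-likelihood numerically.
mle_beta = minimize(neg_log_likelihood, x0=np.zeros(p + 1), method="BFGS").x[:p]

# Ordinary least squares closed form: (X'X)^{-1} X'y.
ols_beta = np.linalg.solve(X.T @ X, X.T @ y)

print(np.allclose(mle_beta, ols_beta, atol=1e-4))  # expected: True
```

Because $\sigma$ enters the loss only through a term that does not involve $\beta$, the coefficient estimates from the two approaches agree regardless of the noise level.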

Cagdas Ozgenc