
I've read a number of different explanations trying to understand the likelihood function, and I understand the purpose of it, but some statements sound contradictory.

Consider observed data X, model parameters M, likelihood function L(M|X), and probability P(X|M).

I keep seeing it written that L(M|X) = P(X|M). At the same time, the same sources say that the likelihood is not the same thing as the probability and that L is not a probability density function (pdf).

What does it mean to say they are not the same thing but equal? How could L be equal to a pdf, but not be a pdf?

Reference to one of the places I've read this: http://www.stat.cmu.edu/~larry/=stat705/Lecture6.pdf


1 Answer


We say that the likelihood is not the same thing as the probability because $\mathcal{L}(M \mid X) \neq P(M \mid X)$.

But it is true that $\mathcal{L}(M \mid X) = P(X \mid M)$: the two expressions take exactly the same values. So why do we say that the likelihood is not a pdf? Because the likelihood is read as a function of $M$ with the data $X$ held fixed, rather than as a function of $X$ with $M$ fixed. As a function of $M$, it does not obey the laws of probability: its integral over the parameter space need not equal 1 and can be greater than 1.
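Here is a minimal numerical sketch of that point, assuming an exponential model and a single hypothetical observation $x = 0.5$ (the function names and values are only illustrative). The same expression integrates to 1 over the data space but not over the parameter space.

```python
# A minimal sketch (assumes NumPy and SciPy are available).
# One draw x from a hypothetical Exponential(rate) model: as a function of x
# (rate fixed) the density integrates to 1; as a function of the rate
# (x fixed) the same expression -- the likelihood -- does not.
import numpy as np
from scipy.integrate import quad

def exp_density(x, rate):
    """Exponential pdf: rate * exp(-rate * x), for x >= 0, rate > 0."""
    return rate * np.exp(-rate * x)

x_obs = 0.5  # a hypothetical observed data point

# Integrate over the data space with the parameter fixed: a pdf, so ~1.
area_over_x, _ = quad(lambda x: exp_density(x, rate=2.0), 0, np.inf)

# Integrate the likelihood over the parameter space with the data fixed:
# this need not be 1 (here it equals 1 / x_obs**2 = 4).
area_over_rate, _ = quad(lambda r: exp_density(x_obs, rate=r), 0, np.inf)

print(f"integral over x (pdf):           {area_over_x:.3f}")     # ~1.000
print(f"integral over rate (likelihood): {area_over_rate:.3f}")  # ~4.000
```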

If you want to define a pdf on the parameter space, you need to use Bayes' theorem: $P(M \mid X) = \frac{\mathcal{L}(M \mid X) \, P(M)}{P(X)}$, where $P(M)$ is a prior on the parameters and $P(X)$ is the normalizing constant.
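Continuing the illustrative exponential example above, a short grid-based sketch of this normalization (the prior choice and grid are assumptions, not part of the question):

```python
# A minimal sketch of turning the likelihood into a pdf on the parameter
# space via Bayes' theorem, evaluated on a grid.
import numpy as np

x_obs = 0.5                               # the same hypothetical observation
rates = np.linspace(1e-3, 20, 10_000)     # grid over the parameter space
drate = rates[1] - rates[0]

likelihood = rates * np.exp(-rates * x_obs)  # L(rate | x_obs) = P(x_obs | rate)
prior = np.exp(-rates)                       # an assumed Exponential(1) prior P(M)
unnormalized = likelihood * prior            # numerator of Bayes' theorem
evidence = np.sum(unnormalized) * drate      # P(X), the normalizing constant
posterior = unnormalized / evidence          # P(M | X): a genuine pdf

print(f"integral of likelihood: {np.sum(likelihood) * drate:.3f}")  # not 1
print(f"integral of posterior:  {np.sum(posterior) * drate:.3f}")   # ~1.000
```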

For a more complete view, have a look at the question "What is the difference between 'likelihood' and 'probability'?"
