
Say the data are assumed to be exponentially distributed with mean $\zeta$. The likelihood function then includes the factor $\exp\{-\sum_i x_i/\zeta\}$. For a big data set, this is potentially a really small value. How does one then deal with things like the likelihood ratio test, in which we need to take the ratio of such really small numbers?

I've tried computing this in R and it doesn't work properly: the ratio just returns NaN.
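
For what it's worth, here is a minimal sketch in R that reproduces the problem (the simulated data and the values of $\zeta$ are my own, purely for illustration):

```r
set.seed(1)
x <- rexp(5000, rate = 1/2)   # simulated sample with mean zeta = 2

# likelihood under the mean parameterisation: zeta^(-n) * exp(-sum(x)/zeta)
lik <- function(zeta, x) zeta^(-length(x)) * exp(-sum(x) / zeta)

lik(2, x)              # underflows to 0
lik(2, x) / lik(3, x)  # 0 / 0, so R returns NaN
```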

Qausi

1 Answer


The problem comes from using the likelihood rather than the log-likelihood: since$$\ell(\zeta|x_1,\ldots,x_n)=\log L(\zeta|x_1,\ldots,x_n)=-n\log(\zeta)-\frac{1}{\zeta}\sum_i x_i$$you can handle large sample sizes this way, at no loss from an inference point of view: the log transform is monotone, so the maximum likelihood estimator is unchanged, and the likelihood ratio becomes a difference of log-likelihoods, $\log\{L(\zeta_0)/L(\hat\zeta)\}=\ell(\zeta_0)-\ell(\hat\zeta)$, which does not underflow.
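
A minimal R sketch of the log-scale computation (simulated data as in the question's sketch; the null value $\zeta_0 = 3$ is purely illustrative):

```r
set.seed(1)
x <- rexp(5000, rate = 1/2)   # simulated sample with mean zeta = 2

# log-likelihood under the mean parameterisation
loglik <- function(zeta, x) -length(x) * log(zeta) - sum(x) / zeta

zeta_hat <- mean(x)                               # MLE of the mean
lrt <- -2 * (loglik(3, x) - loglik(zeta_hat, x))  # LRT statistic for H0: zeta = 3
pchisq(lrt, df = 1, lower.tail = FALSE)           # asymptotic chi-square p-value
```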

Xi'an