
Similar to this thread, I have a problem with small probabilities in likelihoods, but I believe that thread does not apply to my case.

The likelihood is a product of probabilities, so the log-likelihood is a sum of log probabilities. In R I use `optim` to optimize a log-likelihood. It then happens that, for 'bad' parameter choices, the likelihood contribution (the probability) for some observation(s) is numerically zero in R, even though it is in fact larger than zero by an infinitesimally small amount, so its log is minus infinity. The sum of the log-likelihood contributions is then minus infinity (due to the few observations giving this contribution), which correctly indicates a bad fit but makes further optimization impossible.

What is the standard way of dealing with this problem?

tomka
  • See https://stats.stackexchange.com/search?q=log+overflow+likelihood. – whuber Jun 11 '19 at 18:28
  • @whuber I'd like to come back to this question one more time. I have read all the threads, but somehow my problem seems not to be addressed. They seem to discuss the problem of the log of sums of exponential functions. My problem is that $\sum_i \log p(y_i|\theta)$ is minus infinity because, for some $i$, $p(y_i|\theta)$ numerically underflows to zero. Is this question really addressed, and where can I find the solution? – tomka Feb 20 '20 at 19:23
  • I agree yours is a slightly different problem. I had to deal with it in a code example recently; my solution is in the function `Lambda` at https://stats.stackexchange.com/a/449216/919 (and briefly discussed in the comments beneath that post). One way to conceive of your problem is to express $p(y_i\mid\theta)=\exp(f(y_i\mid\theta))$ for the log probability $f.$ Your case is where $f$ is very negative and you get underflow. By casting your problem in those terms, all the other solutions apply. – whuber Feb 20 '20 at 19:33
  • 1
    @whuber thanks I think this will solve it – tomka Feb 20 '20 at 20:23
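The workaround whuber describes — working with the log-density $f(y_i\mid\theta)$ directly rather than forming the probability and then taking its log — can be sketched as follows. This is only an illustration (in Python rather than R, with a standard normal density standing in for the actual model), but the same idea applies to R's `dnorm(..., log = TRUE)` and friends:

```python
import math

y = 40.0  # an extreme observation under a standard N(0, 1) model (illustration only)

# Naive route: compute the density, then take its log.
# exp(-800.9) underflows to 0.0 in double precision, so the log is -inf.
p = math.exp(-0.5 * y * y) / math.sqrt(2.0 * math.pi)
naive = math.log(p) if p > 0 else float("-inf")

# Stable route: compute the log-density directly; the tiny probability
# is never formed, so nothing underflows.
logp = -0.5 * y * y - 0.5 * math.log(2.0 * math.pi)

print(naive)  # -inf
print(logp)   # approximately -800.92, finite and usable by an optimizer
```

With the stable form, the summed log-likelihood stays finite even for very bad parameter values, so a numerical optimizer like `optim` can still compare candidate parameters and move away from them.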

0 Answers