
I am calculating the log-likelihood of a multivariate Gaussian distribution, and I am getting a positive log-likelihood.

Density function

$$f(x_1,...,x_n)=\frac{1}{\sqrt{(2\pi)^n|\Sigma|}}\text{exp}{\{-\frac{1}{2}(x-\mu)^\text{T} \Sigma^{-1}(x-\mu)\}}$$ where $\Sigma$ is positive definite and $|\Sigma|$ is its determinant.

Log-likelihood $$\text{log}(L)= - \frac{1}{2}\text{log}(|\Sigma|) - \frac{1}{2} (x-\mu)^{\text{T}} \Sigma ^{-1} (x-\mu) - \frac{n}{2}\text{log}(2\pi)$$

Suppose

x = [0.0000000 0.9411407 0.9971905 0.9987736 0.9988251]

Mean: $\mu=\vec{0}$

Covariance matrix: $\Sigma$,

K=
            [,1]       [,2]      [,3]       [,4]        [,5]
[1,] 0.842916377 0.61430825 0.2378030 0.04889546 0.005339994
[2,] 0.614308254 0.84291638 0.6143083 0.23780289 0.048895481
[3,] 0.237802952 0.61430825 0.8429164 0.61430817 0.237802952
[4,] 0.048895461 0.23780289 0.6143082 0.84291638 0.614308339
[5,] 0.005339994 0.04889548 0.2378030 0.61430834 0.842916377

The covariance matrix is positive definite, so we can compute its Cholesky decomposition: chol(K)
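A convenient side effect of the Cholesky factor $L$ (with $K = LL^\text{T}$) is that it gives the log-determinant directly, since $\log|K| = 2\sum_i \log L_{ii}$. A minimal sketch in Python with only the standard library (the post itself uses R's `chol`; the `cholesky` helper here is my own):

```python
import math

# Covariance matrix K as given in the post
K = [
    [0.842916377, 0.61430825, 0.2378030, 0.04889546, 0.005339994],
    [0.614308254, 0.84291638, 0.6143083, 0.23780289, 0.048895481],
    [0.237802952, 0.61430825, 0.8429164, 0.61430817, 0.237802952],
    [0.048895461, 0.23780289, 0.6143082, 0.84291638, 0.614308339],
    [0.005339994, 0.04889548, 0.2378030, 0.61430834, 0.842916377],
]

def cholesky(A):
    """Lower-triangular L with L L^T = A; raises if A is not positive definite."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = A[i][i] - s
                if d <= 0:
                    raise ValueError("matrix is not positive definite")
                L[i][j] = math.sqrt(d)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

L = cholesky(K)  # succeeding here is itself the positive-definiteness check
log_det = 2 * sum(math.log(L[i][i]) for i in range(len(K)))
print(log_det)   # |K| < 1 for this matrix, so log|K| is negative
```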

I ignore the last term in the log-likelihood (the one with $2\pi$), since it is a normalizing constant.
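Note that for $n=5$ this "constant" is not small: $-\tfrac{n}{2}\log(2\pi) \approx -4.59$. Dropping it does not change where the likelihood is maximized, but it shifts every reported log-likelihood value upward by that amount. A quick check:

```python
import math

n = 5
const = -n / 2 * math.log(2 * math.pi)  # the normalizing term dropped in the post
print(const)  # ≈ -4.5947
```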

Calculating the log-likelihood:
First term: $- \frac{1}{2}\text{log}(|\Sigma|)=2.642797$.
Second term: $- \frac{1}{2}(x-\mu)^{\text{T}} \Sigma ^{-1} (x-\mu)=-1.857256$.
First + second = 0.7855412
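The whole calculation can be sketched end to end in plain Python (the post uses R; `cholesky` and `chol_solve` below are my own helper names, and solving $Kz = d$ via the Cholesky factor avoids forming $\Sigma^{-1}$ explicitly):

```python
import math

# Data from the post; mu is the zero vector
x  = [0.0000000, 0.9411407, 0.9971905, 0.9987736, 0.9988251]
mu = [0.0] * 5
K = [
    [0.842916377, 0.61430825, 0.2378030, 0.04889546, 0.005339994],
    [0.614308254, 0.84291638, 0.6143083, 0.23780289, 0.048895481],
    [0.237802952, 0.61430825, 0.8429164, 0.61430817, 0.237802952],
    [0.048895461, 0.23780289, 0.6143082, 0.84291638, 0.614308339],
    [0.005339994, 0.04889548, 0.2378030, 0.61430834, 0.842916377],
]

def cholesky(A):
    """Lower-triangular L with L L^T = A (assumes A is positive definite)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(A[i][i] - s) if i == j else (A[i][j] - s) / L[j][j]
    return L

def chol_solve(L, b):
    """Solve K z = b via forward substitution (L y = b) then back substitution (L^T z = y)."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    z = [0.0] * n
    for i in reversed(range(n)):
        z[i] = (y[i] - sum(L[k][i] * z[k] for k in range(i + 1, n))) / L[i][i]
    return z

n = 5
L = cholesky(K)
d = [xi - mi for xi, mi in zip(x, mu)]
quad = sum(di * zi for di, zi in zip(d, chol_solve(L, d)))  # (x-mu)^T K^{-1} (x-mu), always > 0
log_det = 2 * sum(math.log(L[i][i]) for i in range(n))      # log|K|
loglik = -0.5 * log_det - 0.5 * quad - n / 2 * math.log(2 * math.pi)
print(quad, log_det, loglik)
```

If the per-term values above are right, the first two terms do sum to about 0.79, but the full log-likelihood is roughly $0.7855 - 4.5947 \approx -3.81$ once the $2\pi$ term is kept; the positive reported value comes from dropping that constant. As the comments below explain, though, even a genuinely positive log-likelihood would not be an error.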

Why is it positive (0.7855412)? I thought a log-likelihood should be a negative number. What is wrong? Thanks!

  • In case it is not perfectly clear that the duplicate is the same question, please note that (1) a likelihood is, [by definition, a probability density;](http://stats.stackexchange.com/questions/2641) and (2) a density exceeding $1$ is equivalent to its log being positive. BTW, your power of $1/\pi$ is too large, the power of $1/2$ is too small (it's wrong to ignore either), and you are missing a factor of $-1/2$ in the argument of the exponential: see http://en.wikipedia.org/wiki/Multivariate_normal_distribution. So possibly your results in this particular case are due only to those errors. – whuber Jun 09 '15 at 17:10
  • Probability is between zero and one and the log of probability is always non-positive. Meanwhile, density is non-negative but can be above one; consequently, its log can be either positive or negative (or zero). – Richard Hardy Jun 09 '15 at 17:10
  • @whuber What do you mean by 1/$\pi$ is too large? Which term are you referring to? I should have typed power of 1/2, now added. But, it's in there in my calculation in R. Either way, I should not be seeing positive log-likelihood, why am I seeing it? – user13985 Jun 09 '15 at 17:37
  • @RichardHardy So, it's ok to have log-likelihood to be greater than 0? I am confused. – user13985 Jun 09 '15 at 17:39
  • Your errors are still in your code (and they haven't entirely been fixed up in the formulas, either: your log density is not equal to the log of the density due to several mistakes). Make a careful comparison with the Wikipedia formula. And yes, a log-likelihood can be greater than $0$ (and often is), exactly as the duplicate thread explains. – whuber Jun 09 '15 at 18:05
  • @whuber Why is it ok to have density greater than 1 (log-like greater than 0)? The thread does not give concrete examples where I can **see** it. – user13985 Jun 09 '15 at 18:31
  • My argument: density refers to the probability of getting something (head in coin toss). Multivariate normal has density less than 1, $0 \le p(x) \le 1$, for all x. Likelihood is multiplying n densities, so, it's even smaller. Taking log of likelihood should be negative. I don't see anything wrong with my argument. – user13985 Jun 09 '15 at 18:49
  • Are you actually linking to the duplicate? The question itself cites a concrete example on Wikipedia. I also posted an answer that starts off with two graphs of PDFs that exceed $1$. I can't think of anything more concrete than that! – whuber Jun 09 '15 at 22:33
  • @whuber Sorry, I didn't know you meant the link in the post. I was looking at the links in the comment. – user13985 Jun 10 '15 at 01:10
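The concrete example asked for in the comments is easy to construct: a density is not a probability, and any normal distribution with a small enough standard deviation exceeds 1 near its mean, so its log-density is positive there. For instance, $N(0, 0.1^2)$:

```python
import math

sigma = 0.1
pdf_at_mean = 1 / (sigma * math.sqrt(2 * math.pi))  # N(0, 0.1^2) density at x = 0
print(pdf_at_mean)            # ≈ 3.989, comfortably above 1
print(math.log(pdf_at_mean))  # ≈ 1.384, a positive log-density
```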

0 Answers