Let's say I have a Poisson distribution, for which the probability of a single bin is defined as:
$P_i=\frac{m_i^{n_i}}{e^{m_i}n_i!}$
where $m_i$ represents the model value of bin $i$ (real, $m_i > 0$) and $n_i$ its observed value (integer, the number of counts in bin $i$). The cumulative likelihood for the whole data set is then:
$L=\prod\limits^{N}_{i=1} \frac{m_i^{n_i}}{e^{m_i}n_i!}$
where $N$ is the total number of bins.
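For concreteness, here is how I evaluate $L$ numerically. I work in log form, since the raw product of small probabilities underflows quickly; the bin values below are made up purely for illustration:

```python
import numpy as np
from math import lgamma, factorial

# Made-up example data: model values m_i and observed counts n_i per bin
m = np.array([2.3, 5.1, 4.0, 1.2])  # model values m_i (real, > 0)
n = np.array([3, 4, 5, 1])          # observed counts n_i (integers)

# log L = sum_i [ n_i*log(m_i) - m_i - log(n_i!) ], using lgamma(n+1) = log(n!)
log_L = np.sum(n * np.log(m) - m - np.array([lgamma(k + 1) for k in n]))

# Direct product for comparison (fine for a few bins, underflows for many)
L = np.prod(m**n * np.exp(-m) / np.array([factorial(k) for k in n]))
```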
Now, as I understand it, $P_i$ denotes the probability that the observation $n_i$ is drawn from the model value $m_i$.
(1) So a larger likelihood value $L$ means a better fit between the model and the data (is this correct?)
What I don't get is the need for a maximum likelihood ratio, defined as
(2) the ratio of the probability of drawing $n_i$ counts from model $m_i$ to the probability of drawing $n_i$ counts from a model equal to $n_i$ itself (is this correct?)
whose cumulative form is:
$LR= \prod\limits^{N}_{i=1} \frac{\frac{m_i^{n_i}}{e^{m_i}n_i!}}{\frac{n_i^{n_i}}{e^{n_i}n_i!}} = \prod\limits^{N}_{i=1} \left(\frac{m_i}{n_i}\right)^{n_i}e^{n_i-m_i}$
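Numerically, the $n_i!$ terms cancel in the ratio, so $\log LR$ can be computed directly, and it should equal $\log L$ evaluated at $m$ minus $\log L$ evaluated at $n$. A quick check with the same made-up bin values:

```python
import numpy as np
from math import lgamma

m = np.array([2.3, 5.1, 4.0, 1.2])  # model values m_i (made-up)
n = np.array([3.0, 4.0, 5.0, 1.0])  # observed counts n_i (all > 0 here;
                                    # a bin with n_i = 0 would need special
                                    # handling, contributing just -m_i)

# log LR = sum_i [ n_i*log(m_i/n_i) + (n_i - m_i) ]
log_LR = np.sum(n * np.log(m / n) + (n - m))

def poisson_log_like(model, counts):
    """Poisson log-likelihood: sum_i [ n_i*log(m_i) - m_i - log(n_i!) ]."""
    return np.sum(counts * np.log(model) - model
                  - np.array([lgamma(k + 1) for k in counts]))
```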
(3) Won't maximizing this value $LR$ give me the same results as maximizing $L$?
Finally:
(4) How can I calculate the p-value associated with a Poisson likelihood $L$ or a Poisson likelihood ratio $LR$ for a model($m$)-observation($n$) analysis?
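For reference, my current attempt at (4) uses Wilks' theorem: $-2\log LR$ should be approximately $\chi^2$-distributed, with degrees of freedom equal to the number of bins minus the number of fitted model parameters. I am not sure this is valid here (hence the question), and `n_params = 0` below is just an assumption for a fully fixed model:

```python
import numpy as np
from scipy.stats import chi2

m = np.array([2.3, 5.1, 4.0, 1.2])  # model values m_i (made-up)
n = np.array([3.0, 4.0, 5.0, 1.0])  # observed counts n_i (made-up)

# Deviance: -2 log LR against the saturated model (m_i = n_i)
deviance = -2.0 * np.sum(n * np.log(m / n) + (n - m))

# Wilks-style approximation (asymptotic only, and questionable for
# small counts): deviance ~ chi2 with N - n_params degrees of freedom
n_params = 0                 # assumed: the model has no fitted parameters
dof = len(n) - n_params
p_value = chi2.sf(deviance, dof)
```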