
On page 298 of Rosenthal's Probability and Statistics, it says:

We are only interested in likelihood ratios $\frac{L(\theta_1 | s)}{L(\theta_2 | s)}$ for $\theta_1, \theta_2 \in \Omega$ when it comes to determining inferences for $\theta$ based on the likelihood function. This implies that any function that is a positive multiple of $L(\cdot |s)$, i.e. $L^*(\cdot |s) = cL(\cdot |s)$ for some fixed $c>0$, can serve equally well as a likelihood function.

Question: I'm not sure why being interested in the ratios allows us to use any positive multiple of a likelihood function. For example: If I use $L(\theta_1 | s)$ and $L(\theta_2 | s)$, such that for given data $s_0$, I get $L(\theta_1 | s_0) = 0.1$ and $L(\theta_2 | s_0) = 0.3$, then the distribution associated with $\theta_2$ is more likely to have generated the data.

But if instead of $L(\theta_1 | s)$ I use $cL(\theta_1 | s)$ with $c = 5$, then $\frac{cL(\theta_1 | s_0)}{L(\theta_2 | s_0)} = \frac{0.5}{0.3} = \frac{5}{3}$, and the distribution associated with $\theta_1$ is more likely to have generated the data.

Snowball
  • In your example the denominator is also scaled. The answer above gives good examples of why “scaling” (maybe a better word here) can be done in your likelihood-ratio case and in other uses of the likelihood. – Single Malt Sep 12 '20 at 18:02
  • @SingleMalt Can you elaborate on "in your example the denominator is also scaled"? Do you mean that when they say you can multiply by some $c>0$, they mean it must be done for both numerator AND denominator in the likelihood ratio? In other words, EVERY likelihood function $f_\theta(s)$ must be scaled? – Snowball Sep 12 '20 at 18:14
  • The conclusion in the quotation is straightforward: according to laws of arithmetic, multiplying $L$ by any positive $c$ does not change the likelihood ratio. Your example following "if instead" makes no sense because you haven't consistently scaled $L$ in your comparison. – whuber Sep 12 '20 at 18:23
  • Yes, for a likelihood ratio both would be scaled as per the comment by @whuber. – Single Malt Sep 12 '20 at 18:42
  • Thanks. I think I know where my confusion was. I think I misunderstood the definition of the likelihood function. – Snowball Sep 12 '20 at 18:45
  • See Rosenthal, Example 6.1.2. The likelihood function is $$L(\theta\mid 4)=\binom{10}{4}\theta^4(1-\theta)^6$$ Put $\theta_1=0.4$, $\theta_2=0.6$. The likelihood ratio is $$\frac{\binom{10}{4}0.4^4(1-0.4)^6}{\binom{10}{4}0.6^4(1-0.6)^6}=\frac{0.4^4(1-0.4)^6}{0.6^4(1-0.6)^6}=2.25$$ Here $c$ is $1/\binom{10}{4}$. In other words, you can cancel $\binom{10}{4}$ because it is a constant factor which _does not depend on $\theta$_. – Sergio Sep 12 '20 at 21:13
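
The comments above resolve the confusion: scaling the likelihood means scaling it at every $\theta$, so the same constant $c$ appears in both the numerator and the denominator of the ratio and cancels. As a minimal sketch (not part of the original thread; the helper names `likelihood` and `kernel` are just illustrative), the following Python snippet checks this numerically for Sergio's binomial example, and also shows that scaling only the numerator, as in the question, does change the ratio:

```python
from math import comb

def likelihood(theta, x=4, n=10):
    """Binomial likelihood L(theta | x) for n trials with x successes."""
    return comb(n, x) * theta**x * (1 - theta)**(n - x)

def kernel(theta, x=4, n=10):
    """Same likelihood with the constant comb(n, x) dropped, i.e. c = 1/comb(n, x)."""
    return theta**x * (1 - theta)**(n - x)

theta1, theta2 = 0.4, 0.6

# Consistent scaling: the constant multiplies both values, so it cancels
# and both ratios equal 2.25.
print(likelihood(theta1) / likelihood(theta2))  # 2.25
print(kernel(theta1) / kernel(theta2))          # 2.25

# Inconsistent scaling (the mistake in the question): multiplying only the
# numerator by c = 5 changes the ratio, so this is not a valid comparison.
c = 5
print(c * likelihood(theta1) / likelihood(theta2))  # 11.25, not 2.25
```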

0 Answers