On page 298 of Rosenthal's *Probability and Statistics*, it says:
> We are only interested in likelihood ratios $\frac{L(\theta_1 | s)}{L(\theta_2 | s)}$, for $\theta_1, \theta_2 \in \Omega$, when it comes to determining inferences for $\theta$ based on the likelihood function. This implies that any function that is a positive multiple of $L(\cdot | s)$, i.e., $L^*(\cdot | s) = cL(\cdot | s)$ for some fixed $c > 0$, can serve equally well as a likelihood function.
**Question:** I'm not sure why restricting attention to ratios allows us to use any positive multiple of the likelihood function. For example: if, for given data $s_0$, I get $L(\theta_1 | s_0) = 0.1$ and $L(\theta_2 | s_0) = 0.3$, then the distribution associated with $\theta_2$ is more likely to have generated the data.
But if, instead of $L(\theta_1 | s)$, I use $cL(\theta_1 | s)$ with $c = 5$, then $\frac{cL(\theta_1 | s_0)}{L(\theta_2 | s_0)} = \frac{0.5}{0.3} = \frac{5}{3} > 1$, and now the distribution associated with $\theta_1$ appears more likely to have generated the data.
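To make the arithmetic in my example concrete, here is a small sketch (the values $0.1$, $0.3$, and $c = 5$ are the hypothetical ones from above). It computes the original ratio, the ratio when *both* likelihood values are multiplied by $c$ (i.e., the whole function $L(\cdot|s)$ is rescaled), and the ratio in my second computation, where $c$ multiplies only $L(\theta_1 | s_0)$:

```python
# Hypothetical likelihood values from the example above.
L1 = 0.1   # L(theta_1 | s_0)
L2 = 0.3   # L(theta_2 | s_0)
c = 5.0    # fixed positive constant

# Original ratio of the two likelihood values.
ratio = L1 / L2                      # 1/3

# Ratio when the entire likelihood function is rescaled by c,
# so that *both* values pick up the same factor.
scaled_ratio = (c * L1) / (c * L2)   # still 1/3

# Ratio from my second computation, where c multiplies
# only the value at theta_1.
mixed_ratio = (c * L1) / L2          # 5/3

print(ratio, scaled_ratio, mixed_ratio)
```

So the ratio is unchanged when $c$ multiplies both values, and changes to $5/3$ when it multiplies only one of them, which is exactly the discrepancy my question is about.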