Bayesian binary hypothesis testing problem with equal priors and uniform cost assignment (i.e., $c_{ij}=1$ for $i \neq j$ and $c_{ij}=0$ otherwise):
Under hypothesis $\mathrm{H}_0$, the observation $Y$ is distributed with $p_0(y)$. Under hypothesis $\mathrm{H}_1$, first a fair coin is flipped and if it is tails, $Y$ is distributed with $p_{11}(y)$ and if it is heads, it is distributed with $p_{12}(y)$.
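To make sure I am reading the model correctly, here is a small sketch of the generative process under $\mathrm{H}_1$; the component distributions $p_{11}$ and $p_{12}$ are my own illustrative choices ($N(-1,1)$ and $N(+1,1)$), not part of the problem:

```python
import random

# Sketch of the observation model under H1:
# a fair coin picks which component density generates Y.
# The two Gaussians below stand in for p11 and p12 (my own choice).
def sample_under_h1():
    if random.random() < 0.5:       # tails -> Y ~ p11
        return random.gauss(-1.0, 1.0)
    else:                           # heads -> Y ~ p12
        return random.gauss(1.0, 1.0)
```

So under $\mathrm{H}_1$ the observation is generated in two stages: first the coin, then a draw from the selected density.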
I think this question represents the case where a simple hypothesis ($\mathrm{H}_0$) is compared against a composite hypothesis ($\mathrm{H}_1$). To solve this question, I cannot just use the likelihood ratio test $$\frac{p_1(y)}{p_0(y)}$$ since $p_1(y)$ is not a single distribution. Furthermore, I know that there is a generalized version of the likelihood ratio test (as long as uniform cost assignment is valid): $$\frac{P(y\mid\Theta \in \Lambda_1)}{P(y\mid\Theta \in \Lambda_0)}$$ where $\Theta$ represents the different distributions that $Y$ can have under each hypothesis.
Here is what I think the solution might be: under $\mathrm{H}_0$, $P(y\mid\Theta \in \Lambda_0)$ reduces to $p_0(y)$. But I am stuck at evaluating $P(y\mid\Theta \in \Lambda_1)$. I think I somehow need to find a relation between $\Lambda_0$ and $\Lambda_1$, but I don't know how to do that. Any help will be appreciated. Thanks in advance!
BTW, if my notation is confusing or any point is unclear, I will do my best to clarify it.