We are given the prior probability $P(G)=15\%$ and seek the posterior probability $P(G|data)$.
By Bayes' Theorem we have
$P(G|data) = \frac{P(data|G)P(G)}{P(data)}$
By the law of total probability we have
$P(G|data) = \frac{P(data|G)P(G)}{P(data|G)P(G) + P(data|Y)P(Y)}$
Observe that the data amounts to 4 correct reports and 1 incorrect report if the car was green, and 4 incorrect reports and 1 correct report otherwise. Hopefully one can see the relevance of the binomial distribution here.
Thus,
$P(data|G) = \binom{5}{4}(2/3)^4(1/3)^1 \tag{**}$
and, by the same reasoning with the roles of correct and incorrect reversed,
$P(data|Y) = \binom{5}{4}(1/3)^4(2/3)^1$
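As a quick numerical sketch (assuming, as above, a prior of $15\%$ for green, witness accuracy $2/3$, and 4 of 5 witnesses reporting green), the posterior can be computed directly:

```python
from math import comb

p_g = 0.15          # prior P(G)
p_y = 1 - p_g       # prior P(Y); {G, Y} partition the sample space
acc = 2 / 3         # probability a witness reports the true colour

# Likelihoods via the binomial distribution: 4 of 5 reports say "green".
p_data_given_g = comb(5, 4) * acc**4 * (1 - acc)**1   # 4 correct, 1 incorrect
p_data_given_y = comb(5, 4) * (1 - acc)**4 * acc**1   # 4 incorrect, 1 correct

# Bayes' theorem with the law of total probability in the denominator.
posterior = p_data_given_g * p_g / (p_data_given_g * p_g + p_data_given_y * p_y)
print(posterior)  # ≈ 0.585
```

Note that the common binomial factor $\binom{5}{4}$ (and the shared denominator $3^5$) cancels in the ratio, so the posterior reduces to $\frac{16 \cdot 0.15}{16 \cdot 0.15 + 2 \cdot 0.85} \approx 0.585$.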
Appendix
Actually $$data := \{\text{W1 reports Y} \cap \text{W2 reports G} \cap \text{W3 reports G} \cap \text{W4 reports G} \cap \text{W5 reports G}\}$$
$$\cup \{\text{W1 reports G} \cap \text{W2 reports Y} \cap \text{W3 reports G} \cap \text{W4 reports G} \cap \text{W5 reports G}\}$$
$$\cup \dots \cup \{\text{W1 reports G} \cap \text{W2 reports G} \cap \text{W3 reports G} \cap \text{W4 reports G} \cap \text{W5 reports Y}\}$$
Thus, if the car is green then
$$data = data_a \tag{*}$$
where
$$data_a := \{\text{W1 is not right} \cap \text{W2 is right} \cap \text{W3 is right} \cap \text{W4 is right} \cap \text{W5 is right}\}$$
$$\cup \{\text{W1 is right} \cap \text{W2 is not right} \cap \text{W3 is right} \cap \text{W4 is right} \cap \text{W5 is right}\}$$
$$\cup \dots \cup \{\text{W1 is right} \cap \text{W2 is right} \cap \text{W3 is right} \cap \text{W4 is right} \cap \text{W5 is not right}\}$$
and $(*)$ expresses the equivalence of the two events on $G$: for every sample point $\omega$ in the sample space $\Omega$ with $\omega \in G$, we have
$$\omega \in data \iff \omega \in data_a$$
where
$G, Y, data, data_a, \{\text{Wi reports G}\}, \{\text{Wi reports Y}\}$ are events, that is, subsets of the sample space $\Omega$.
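The equivalence $(*)$ can be checked by brute force. The sketch below (an illustration, not part of the original argument) enumerates all $2^5$ possible report vectors; under the hypothesis that the car is green, "Wi is right" means "Wi reports G", so $data$ and $data_a$ pick out exactly the same vectors:

```python
from itertools import product

# All 2^5 possible report vectors; each witness reports "G" or "Y".
reports = list(product("GY", repeat=5))

# data: exactly one of the five witnesses reports Y (the other four report G).
data = {r for r in reports if r.count("Y") == 1}

# data_a, evaluated on the event G (car is green): being "right" means
# reporting G, so exactly one witness is not right.
data_a = {r for r in reports if sum(w != "G" for w in r) == 1}

assert data == data_a  # the events coincide on G, as (*) claims
```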
So really, the justification for $(**)$ is that
$P(data|G) = P(data_a|G)$
because
$P(data \cap G) = P(data_a \cap G) \tag{***}$
because of (*).
Observe that if $\omega \in Y$, then $\omega \notin G$, so such sample points belong to neither $data \cap G$ nor $data_a \cap G$; they contribute nothing to either side of $(***)$, which for them reads simply $0=0$.