First, let us define what prior and posterior mean: assume that a variable $x$ follows a distribution $F$ with parameter $y$. For example, $x$ could be the result of a coin toss, with probability $y$ of getting heads. Given a specific value of $y$, say $y=\theta$, we have enough information to construct the PMF, CDF and everything else related to $x$:
$$P(x=1|y=\theta)=\theta,\qquad P(x=0|y=\theta)=1-\theta$$
Note that these probabilities are conditional on the value of $y$.
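To make this concrete, here is a minimal sketch in Python (assuming `scipy` is available; the value $\theta=0.6$ is my arbitrary choice, not part of the question):

```python
# Conditional distribution of x given y = theta, for the coin-toss example.
from scipy.stats import bernoulli

theta = 0.6                # an assumed fixed value of the parameter y
coin = bernoulli(theta)

print(coin.pmf(1))         # P(x=1 | y=theta) = theta      -> 0.6
print(coin.pmf(0))         # P(x=0 | y=theta) = 1 - theta  -> 0.4
print(coin.cdf(0))         # P(x<=0 | y=theta) = 1 - theta -> 0.4
```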
But what if the probability $y$ is itself a random variable? That's the basic idea behind Bayesian methods. Continuing with the coin toss example: before flipping the coin, we make an assumption about the distribution the parameter $y$ "comes from". That is, we assign probabilities to its possible values prior to conducting our experiment. That's why it is called the prior distribution.
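For instance, a common (though by no means required) prior for a probability parameter is a Beta distribution; the sketch below picks Beta(2, 2) purely for illustration:

```python
# A prior over y: Beta(2, 2), a distribution on [0, 1] favoring values near 0.5.
from scipy.stats import beta

prior = beta(2, 2)
print(prior.pdf(0.5))                    # prior density at y = 0.5 -> 1.5
print(prior.cdf(0.7) - prior.cdf(0.3))   # prior probability that 0.3 < y < 0.7
```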
Next, we conduct our experiment and collect the results. We would like to estimate the distribution of $y$ given the results of $x$ we have obtained. This can only be done after conducting the experiment, which is why it's called the posterior. It is the probability of $y=\theta$ given our results for $x$, so it is denoted $P(y=\theta|x)$.
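Continuing the sketch: for Bernoulli data a Beta prior is conjugate, so the posterior is again a Beta with updated parameters (a standard result; the toss results below are made up):

```python
# Posterior update for the coin example under the Beta(2, 2) prior above.
# Beta(a, b) prior + (heads, tails) Bernoulli observations
#   -> Beta(a + heads, b + tails) posterior.
from scipy.stats import beta

a, b = 2, 2                      # assumed prior hyperparameters
tosses = [1, 1, 0, 1, 1, 0, 1]   # made-up experiment: 5 heads, 2 tails
heads = sum(tosses)
tails = len(tosses) - heads

posterior = beta(a + heads, b + tails)
print(posterior.mean())          # posterior mean of y given the tosses -> 7/11
```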
Now, let's look at Bayes' theorem, which connects $P(X|Y)$ with $P(Y|X)$:

$$P(Y|X)=\frac{P(X|Y)\,P(Y)}{P(X)}$$

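Here is a toy numeric application of the theorem, with all numbers made up:

```python
# Bayes' theorem on a two-valued Y: compute P(Y=1 | X=1) from the prior
# P(Y=1) and the likelihoods P(X=1 | Y).
p_y1 = 0.3           # prior P(Y=1)   (made-up value)
p_x1_given_y1 = 0.9  # P(X=1 | Y=1)   (made-up value)
p_x1_given_y0 = 0.5  # P(X=1 | Y=0)   (made-up value)

# Marginal (denominator): P(X=1) = sum over y of P(X=1 | Y=y) P(Y=y)
p_x1 = p_x1_given_y1 * p_y1 + p_x1_given_y0 * (1 - p_y1)

# Posterior: P(Y=1 | X=1)
print(p_x1_given_y1 * p_y1 / p_x1)   # -> 0.27 / 0.62, roughly 0.435
```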
Back to your question: we are given $P(Y)$ and $P(X|Y)$, so we can compute the marginal distribution in the denominator and then the posterior (see my previous solution), which is $\tau_1(X)=P(Y=1|X)=X$.
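Since the original problem statement isn't reproduced here, the following simulation just assumes one setup that yields this posterior, namely $Y\sim\text{Bernoulli}(1/2)$ with densities $f(x|Y=1)=2x$ and $f(x|Y=0)=2(1-x)$ on $[0,1]$, for which Bayes' theorem gives $P(Y=1|X=x)=x$:

```python
# Monte Carlo sanity check that the assumed setup yields P(Y=1 | X) = X.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
y = rng.random(n) < 0.5                           # Y ~ Bernoulli(1/2)
u = rng.random(n)
x = np.where(y, np.sqrt(u), 1 - np.sqrt(1 - u))   # inverse-CDF sampling of X | Y

# Empirical P(Y=1 | X near 0.7) should be close to 0.7
near = np.abs(x - 0.7) < 0.01
print(y[near].mean())
```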
The loss function is $L(r^*)=E\left[\min\{\tau_1(X),1-\tau_1(X)\}\right]$, but we know nothing about what $E[\tau_1(X)]=E[X]$ looks like. Surprisingly enough, it doesn't matter at all.
In the case $\tau_1(X)<1-\tau_1(X)$, we take $L(r^*)=E\left[\tau_1(X)\right]$, but there's more: $\tau_1(X)<1-\tau_1(X)$ implies $2\tau_1(X)<1$, which means $|2\tau_1(X)-1|=1-2\tau_1(X)$. Then:
$$0.5-0.5E[|2\tau_1(X)-1|]=0.5-0.5E[1-2\tau_1(X)]=0.5-0.5+E[\tau_1(X)]=E[\tau_1(X)]$$
...which is exactly $E\left[\min\{\tau_1(X),1-\tau_1(X)\}\right]$. In the complementary case, $1-\tau_1(X)\le\tau_1(X)$ gives $|2\tau_1(X)-1|=2\tau_1(X)-1$, and the same computation yields $E\left[1-\tau_1(X)\right]$, so the identity holds there as well.
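In fact, the identity $\min\{t,\,1-t\}=\tfrac12-\tfrac12\,|2t-1|$ holds pointwise for every $t\in[0,1]$, so the two forms of the loss agree no matter how $\tau_1(X)$ is distributed. A quick numeric check (the Beta(2, 5) distribution is my arbitrary choice):

```python
# Verify E[min(tau, 1 - tau)] == 0.5 - 0.5 * E[|2*tau - 1|] by simulation.
import numpy as np

rng = np.random.default_rng(0)
tau = rng.beta(2, 5, size=1_000_000)   # arbitrary distribution on [0, 1]

lhs = np.minimum(tau, 1 - tau).mean()
rhs = 0.5 - 0.5 * np.abs(2 * tau - 1).mean()
print(lhs, rhs)                        # the two estimates coincide
```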