Let $X_1, \dots, X_n$ be i.i.d. $N(\mu, \sigma^2)$ and $Y_1, \dots, Y_m$ be i.i.d. $N(\eta, \tau^2)$, with the $X$'s independent of the $Y$'s. The variances $\sigma^2$ and $\tau^2$ are known.
I want to show that $\bar{X}-\bar{Y}$ is a minimax estimator of $\Delta=\mu-\eta$ under squared error loss.
I can show that the risk function of $\bar{X}-\bar{Y}$ is constant: $R(\Delta, \bar{X}-\bar{Y})=\frac{\sigma^2}{n}+\frac{\tau^2}{m}$.
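(This follows because $\bar{X}-\bar{Y} \sim N\!\left(\Delta, \frac{\sigma^2}{n}+\frac{\tau^2}{m}\right)$ by independence, so under squared error loss
$$R(\Delta, \bar{X}-\bar{Y}) = E\left[(\bar{X}-\bar{Y}-\Delta)^2\right] = \operatorname{Var}(\bar{X}) + \operatorname{Var}(\bar{Y}) = \frac{\sigma^2}{n}+\frac{\tau^2}{m},$$
which does not depend on $(\mu, \eta)$ and hence equals $\sup_{\mu,\eta} R(\Delta, \bar{X}-\bar{Y})$.)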
If I can find the Bayes estimator $\delta_\Lambda$ under a suitable prior $\Lambda$, I can compute the Bayes risk $r_\Lambda$. If $r_\Lambda \rightarrow R(\Delta, \bar{X}-\bar{Y})$ as the prior variances go to infinity, then $\bar{X}-\bar{Y}$ is a minimax estimator by Theorem 1.12 (quoted below).
But I do not know how to derive the Bayes estimator $\delta_\Lambda$ of $\Delta$ in this two-sample setting.
I know how to get it in the one-sample case $X_1, \dots, X_n$ i.i.d. $N(\theta, \sigma^2)$: taking as prior for $\theta$ the conjugate normal distribution $N(\mu^*, b^2)$, the Bayes estimator is $$\delta_\Lambda=\frac{\frac{n}{\sigma^2}\bar{X}+\frac{\mu^*}{b^2}}{\frac{n}{\sigma^2}+\frac{1}{b^2}}.$$
The corresponding Bayes risk is $r_\Lambda = \frac{1}{\frac{n}{\sigma^2}+\frac{1}{b^2}}$, so $r_\Lambda \to \frac{\sigma^2}{n} = \sup_{\theta} R(\theta, \bar{X})$ as $b\to\infty$.
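(For reference, this Bayes risk comes from the posterior
$$\theta \mid X \sim N\left(\delta_\Lambda,\ \frac{1}{\frac{n}{\sigma^2}+\frac{1}{b^2}}\right).$$
Under squared error loss the Bayes risk is the expected posterior variance, and here the posterior variance is constant in $\bar{X}$, so $r_\Lambda = \frac{1}{\frac{n}{\sigma^2}+\frac{1}{b^2}}$.)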
Could anybody teach me how to get the Bayes estimator and Bayes risk for estimating $\Delta$? Thanks!
(From Theory of Point Estimation, E. L. Lehmann, p. 343:
Theorem 1.12: Suppose that $\{\Lambda_n\}$ is a sequence of prior distributions with Bayes risks $r_{\Lambda_n}$ satisfying $r_{\Lambda_n} \le r = \lim_{n\to\infty} r_{\Lambda_n}$, where $r_{\Lambda_n}=\int R(\theta, \delta_n)\,d\Lambda_n(\theta)$ is the Bayes risk under $\Lambda_n$, and that $\delta$ is an estimator for which $\sup_\theta R(\theta,\delta)=r$. Then (i) $\delta$ is minimax and (ii) the sequence $\{\Lambda_n\}$ is least favorable.)
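Edit: here is my tentative attempt, in case someone can confirm or correct it. Since the $X$- and $Y$-samples are independent, I would guess one can put independent conjugate priors $\mu \sim N(\mu_1^*, b^2)$ and $\eta \sim N(\eta_1^*, b^2)$ (the names $\mu_1^*, \eta_1^*$ are just my notation), so that the posteriors of $\mu$ and $\eta$ stay independent and the Bayes estimator of $\Delta$ is the difference of the two one-sample posterior means,
$$\delta_\Lambda = \frac{\frac{n}{\sigma^2}\bar{X}+\frac{\mu_1^*}{b^2}}{\frac{n}{\sigma^2}+\frac{1}{b^2}} - \frac{\frac{m}{\tau^2}\bar{Y}+\frac{\eta_1^*}{b^2}}{\frac{m}{\tau^2}+\frac{1}{b^2}},$$
with Bayes risk equal to the sum of the two posterior variances,
$$r_\Lambda = \frac{1}{\frac{n}{\sigma^2}+\frac{1}{b^2}} + \frac{1}{\frac{m}{\tau^2}+\frac{1}{b^2}} \xrightarrow{\ b\to\infty\ } \frac{\sigma^2}{n}+\frac{\tau^2}{m},$$
which would match $\sup R(\Delta, \bar{X}-\bar{Y})$ as Theorem 1.12 requires. But I am not sure this reasoning is rigorous.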