
Suppose I've found that the Bayes risk is of the form $$r(\pi) = \int_{-a}^a \theta^2 \pi(\theta)\,d\theta$$

I want to show that the two-point prior $\pi(a)=\pi(-a)=0.5$ maximizes this quantity over priors supported on $[-a,a]$, i.e. that it is a least favorable prior distribution.

This seems intuitively right to me: any distribution spread over a range of values will surely have a smaller value of this integral than one that puts mass only on the extreme points. But I'm not sure how to prove this rigorously.
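As a quick numerical sanity check (not a proof), here is a short Python sketch comparing $\int_{-a}^a \theta^2 \pi(\theta)\,d\theta$ under the two-point prior, a uniform prior, and a few random discrete priors on $[-a,a]$; the choice $a = 2$ and the variable names are arbitrary illustrations, not part of the original problem.

```python
# Sanity check: the two-point prior at +/-a should give the largest value of
# E[theta^2] among priors supported on [-a, a], since theta^2 <= a^2 there.
import numpy as np

rng = np.random.default_rng(0)
a = 2.0  # arbitrary choice for illustration

# Two-point prior pi(a) = pi(-a) = 0.5: E[theta^2] = a^2
two_point = 0.5 * a**2 + 0.5 * (-a) ** 2

# Uniform prior on [-a, a]: E[theta^2] = a^2 / 3
uniform = a**2 / 3

# A few random discrete priors on a grid in [-a, a]
grid = np.linspace(-a, a, 201)
for _ in range(3):
    w = rng.random(grid.size)
    w /= w.sum()  # normalize to a probability vector
    print("random prior:   ", np.sum(w * grid**2))

print("uniform prior:  ", uniform)
print("two-point prior:", two_point, "(= a^2)")
```

Every prior I try stays below $a^2$, which the two-point prior attains, consistent with the claim above.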
