How can I show that for a Binomial$(n, p)$ distribution, the MLE $X/n$ is admissible under squared error loss? The Bayes rule under squared error loss with a Beta$(\alpha, \beta)$ prior is $(X+\alpha)/(n+\alpha+\beta)$. Is there any way to show the two rules to be equivalent?
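As a sanity check that these really are two different rules, here is a quick numerical sketch (the values of $n$, $\alpha$, $\beta$ are illustrative choices of mine) that computes the exact frequentist risk of each estimator by summing over the binomial pmf:

```python
import math

def binom_pmf(n, p, x):
    """P(X = x) for X ~ Binomial(n, p)."""
    return math.comb(n, x) * p**x * (1 - p) ** (n - x)

def risk(n, p, estimator):
    """Frequentist risk under squared error loss: E_p[(estimator(X) - p)^2]."""
    return sum(binom_pmf(n, p, x) * (estimator(x) - p) ** 2 for x in range(n + 1))

n, alpha, beta = 10, 1.0, 1.0  # illustrative values, not from the question
mle = lambda x: x / n
bayes = lambda x: (x + alpha) / (n + alpha + beta)

for p in (0.1, 0.5, 0.9):
    print(f"p={p}: R(p, X/n)={risk(n, p, mle):.5f}, "
          f"R(p, Bayes)={risk(n, p, bayes):.5f}")
```

The two risk functions cross rather than coincide: the MLE's risk is $p(1-p)/n$, while the Beta$(1,1)$ Bayes rule shrinks toward $1/2$, so it has lower risk near $p=1/2$ and higher risk near the endpoints.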
- To which "two rules" are you referring? If you mean the two *estimators* given by $X/n$ and $(X+\alpha)/(n+\alpha+\beta)$, then--given they are obviously different whenever $\alpha\ne 0$ or $\beta \ne 0$--what do you mean by "equivalent"? – whuber Nov 17 '14 at 19:15
- What I mean is that if I can show the risks are the same for the two rules, then I can say that $X/n$ is admissible, since a Bayes rule that is unique up to equivalence is admissible. – kris91 Nov 17 '14 at 19:17
- The only way the Bayes $B(a,b)$ estimator of $p$ is useful is when constructing a sequence of proper Bayes estimators converging to the improper Bayes estimator. But it is not necessary here (see my answer). – Xi'an Nov 17 '14 at 20:54
1 Answer
The connection between admissibility and Bayes is that a Bayes estimator with finite Bayes risk is admissible. The MLE $X/n$ is the (generalized) Bayes estimator for the improper prior $\pi(p)=\dfrac{1}{p(1-p)}$, and since $\mathbb{E}_p[(X-np)^2]=np(1-p)$, the Bayes risk is \begin{align*} r(\pi) &= \frac{1}{n^2}\,\int_0^1 \mathbb{E}_p[(X-np)^2]\,\pi(p)\,\text{d}p\\ &= \frac{1}{n^2}\,\int_0^1 np(1-p)\,\frac{1}{p(1-p)}\,\text{d}p\\ &= \frac{1}{n}\,\int_0^1 \text{d}p = \frac{1}{n}. \end{align*} Since this is finite, $X/n$ is admissible. (See my book for more details!)
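The cancellation in the middle line can be checked numerically: the exact risk of $X/n$, computed by summing over the binomial pmf, equals $p(1-p)/n$, so multiplying by the prior $1/(p(1-p))$ leaves the constant integrand $1/n$. A minimal sketch (function name is my own):

```python
import math

def risk_mle(n, p):
    """Exact risk of X/n under squared error loss: (1/n^2) * E_p[(X - n p)^2]."""
    return sum(math.comb(n, x) * p**x * (1 - p) ** (n - x) * (x / n - p) ** 2
               for x in range(n + 1))

# E_p[(X - np)^2] = np(1-p), so the risk is p(1-p)/n and the
# integrand R(p) * pi(p) in the Bayes risk is identically 1/n.
for n in (5, 20):
    for p in (0.2, 0.5, 0.8):
        assert abs(risk_mle(n, p) - p * (1 - p) / n) < 1e-12
```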

Xi'an