
Let $r$ be the observed number of successes in $n$ Bernoulli trials, each with probability $\pi$ of success. What is the M.V.U.E. (Minimum Variance Unbiased Estimator) of $\pi(1-\pi)$?

The number of successes in $n$ Bernoulli trials follows a Binomial distribution with parameters $(n, \pi)$.
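In particular, $r$ has the probability mass function

$$ P(R = r) = \binom{n}{r}\pi^{r}(1-\pi)^{n-r}, \qquad r = 0, 1, \ldots, n. $$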

I understand that an efficient estimator of a parameter is always a uniformly minimum variance unbiased estimator, and that an unbiased estimator is called efficient if it attains the Cramér–Rao lower bound.
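Concretely, for an unbiased estimator $T$ of a function $\tau(\pi)$, the Cramér–Rao lower bound states

$$ \operatorname{Var}(T) \ \ge\ \dfrac{[\tau'(\pi)]^{2}}{I(\pi)}, $$

where $I(\pi)$ is the Fisher information of the whole sample.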

So is finding the Cramér–Rao lower bound the way to solve this question? If so, how is it done? I am struggling to write down the likelihood function.

EDIT: As suggested by @whuber, I went through a few previously answered questions (e.g. "Maximum Likelihood Estimation for Bernoulli distribution") on finding the maximum likelihood estimator for $n$ Bernoulli trials, and this is what I got:

Let $x_{i}$ be the outcome of the $i^{\text{th}}$ trial ($x_i = 1$ for a success, $0$ for a failure), so that $r = \sum_{i=1}^n x_i$.

$$ \begin{align*} L(\pi) &= \prod_{i=1}^n \pi^{x_i}(1-\pi)^{1-x_i}\\ \ell(\pi) &= \log{\pi}\sum_{i=1}^n x_i + \log{(1-\pi)}\sum_{i=1}^n (1-x_i)\\ \dfrac{\partial\ell(\pi)}{\partial \pi} &= \dfrac{\sum_{i=1}^n x_i}{\pi} - \dfrac{\sum_{i=1}^n (1-x_i)}{1-\pi} \overset{\text{set}}{=}0\\ \sum_{i=1}^n x_i - \pi\sum_{i=1}^n x_i &= \pi\sum_{i=1}^n (1-x_i)\\ \hat{\pi} &= \dfrac{1}{n}\sum_{i=1}^n x_i = \dfrac{r}{n}\\ \dfrac{\partial^2 \ell(\pi)}{\partial \pi^2} &= -\dfrac{\sum_{i=1}^n x_i}{\pi^2} - \dfrac{\sum_{i=1}^n (1-x_i)}{(1-\pi)^2} < 0 \end{align*} $$
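The negative second derivative confirms this critical point is a maximum. Taking the expectation of its negative (using $E\left[\sum_{i=1}^n x_i\right] = n\pi$) gives the Fisher information:

$$ I(\pi) = -E\left[\dfrac{\partial^2 \ell(\pi)}{\partial \pi^2}\right] = \dfrac{n\pi}{\pi^2} + \dfrac{n(1-\pi)}{(1-\pi)^2} = \dfrac{n}{\pi(1-\pi)}, $$

so the Cramér–Rao lower bound for unbiased estimators of $\pi$ itself is $\pi(1-\pi)/n$, which $r/n$ attains since $\operatorname{Var}(r/n) = \pi(1-\pi)/n$.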

So the maximum likelihood estimator of $\pi$ is $\hat{\pi} = \frac{r}{n}$.

Therefore, by the invariance property, the MLE of $\pi(1-\pi)$ is $$\hat{\pi}(1-\hat{\pi}) = \frac{r}{n}\left(1-\frac{r}{n}\right).$$

But back to the question at hand: can we deduce that this is the M.V.U.E.?
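(One check that occurs to me: an M.V.U.E. must in particular be unbiased. With $r \sim \operatorname{Bin}(n, \pi)$, $E[r] = n\pi$ and $E[r^2] = n\pi(1-\pi) + n^2\pi^2$, so

$$ E\left[\frac{r}{n}\left(1-\frac{r}{n}\right)\right] = \pi - \frac{n\pi(1-\pi) + n^{2}\pi^{2}}{n^{2}} = \left(1-\frac{1}{n}\right)\pi(1-\pi), $$

which is not $\pi(1-\pi)$, so the plug-in estimator is biased as it stands and would seem to need a bias correction before it can be the M.V.U.E.

A quick simulation agrees; this is just a sketch, and the values of $n$, $\pi$, and the replication count are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)

n, pi = 10, 0.3      # arbitrary illustrative choices
reps = 200_000       # number of simulated experiments

# r ~ Binomial(n, pi): number of successes in each simulated experiment
r = rng.binomial(n, pi, size=reps)

# plug-in (invariance-property) estimator of pi * (1 - pi)
pi_hat = r / n
plug_in = pi_hat * (1 - pi_hat)

print("target pi(1-pi):           ", pi * (1 - pi))               # 0.21
print("mean of plug-in estimator: ", plug_in.mean())              # approx 0.189
print("predicted (1-1/n)*pi(1-pi):", (1 - 1/n) * pi * (1 - pi))   # 0.189
```)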

– Kalvin
  • Your basic question--what is the Bernoulli likelihood--is answered many times here on CV. Search https://stats.stackexchange.com/search?q=bernoulli+likelihood. – whuber Sep 10 '21 at 18:03
  • @whuber Thank you for pointing that out. – Kalvin Sep 11 '21 at 04:59
  • I have made an edit, let me know if it is correct. – Kalvin Sep 11 '21 at 05:22
  • This has been asked and answered before, e.g. in https://stats.stackexchange.com/q/292255/119261, https://stats.stackexchange.com/q/143962/119261, https://stats.stackexchange.com/q/255250/119261, and https://stats.stackexchange.com/q/410923/119261. – StubbornAtom Sep 11 '21 at 06:22
  • @Xi'an Yes, that's true. I am preparing for an examination and I've needed help in some areas. I hope I am not violating any rules. – Kalvin Sep 11 '21 at 09:24
  • The issue, imho, is more that by repeatedly asking rather similar questions, you hint that the previous Q&A did not help you in the long run, at a deeper level than just solving the exercise. – Xi'an Sep 11 '21 at 10:11
  • @Xi'an I understand what you are saying. I'll keep that in mind in the future and not resort to asking similar questions. – Kalvin Sep 11 '21 at 10:13
  • Your course is apparently ignoring Bayesian statistics, where estimators are downplayed in favor of deriving entire distributions for parameters we don't know. – Frank Harrell Sep 11 '21 at 11:46

0 Answers