We are constructing an estimator out of two independent samples. Sample 1 gives $\hat\mu_1$ with $\operatorname{Var}(\hat\mu_1)=\sigma_1^2$, and sample 2 gives $\hat\mu_2$ with $\operatorname{Var}(\hat\mu_2)=\sigma_2^2$. The combined estimator is
$$\hat\mu_3=a\hat\mu_1+b\hat\mu_2,$$ where $a+b=1$.
I already know that for $\hat\mu_3$ to be unbiased, $a+b$ must equal $1$, and that to minimize the variance, $a$ and $b$ must be:
$$a=\frac{\sigma_2^2}{\sigma_1^2+\sigma_2^2}$$ $$b=\frac{\sigma_1^2}{\sigma_1^2+\sigma_2^2}$$
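For reference, since the two samples are independent, the variance I am trying to minimize (subject to $a+b=1$) is

$$\operatorname{Var}(\hat\mu_3)=a^2\operatorname{Var}(\hat\mu_1)+b^2\operatorname{Var}(\hat\mu_2)=a^2\sigma_1^2+b^2\sigma_2^2,$$

and the weights above come from minimizing this expression.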
Is the resulting variance of $\hat\mu_3$ smaller than $\sigma_1^2$ and/or $\sigma_2^2$?
I am quite sure the values for $a$ and $b$ are correct; I compared them with a solution. But I have no idea how to show that the variance of the new estimator is smaller than (and/or!) the variances of the individual estimators.
I tried to plug in for $a$ and $b$, but nothing obvious came out that would give me a clue how to answer this question.
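For what it's worth, a quick numerical check (with made-up example values $\sigma_1^2=4$, $\sigma_2^2=1$, not from any real data) suggests the combined variance does come out smaller than both individual variances, but I don't see how to show this in general:

```python
# Arbitrary example variances (assumed values for illustration only)
s1sq = 4.0  # Var(mu_hat_1) = sigma_1^2
s2sq = 1.0  # Var(mu_hat_2) = sigma_2^2

# Variance-minimizing weights with a + b = 1
a = s2sq / (s1sq + s2sq)
b = s1sq / (s1sq + s2sq)

# By independence, Var(mu_hat_3) = a^2 * sigma_1^2 + b^2 * sigma_2^2
var3 = a**2 * s1sq + b**2 * s2sq

print(a, b, var3)  # prints 0.2 0.8 0.8 -- smaller than both 4.0 and 1.0
```

Trying other positive values for `s1sq` and `s2sq` gives the same pattern, but that is of course not a proof.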