Suppose we have densities $f_1, f_2$ of the random variables $W_1$ and $W_2$, where $W_i$ has known mean $\mu_i$ and known variance $\sigma_i^2$, with $\mu_1 \neq \mu_2$. Consider the mixture of the two densities $$ f(x;\theta)=\theta f_1(x) + (1-\theta)f_2(x)$$ with $0< \theta < 1$ the unknown mixing proportion. A random sample $X_1, \dots, X_n$ from $X \sim f(x;\theta)$ is available.
Since $E[X] = \theta\mu_1 + (1-\theta)\mu_2$, equating $E[X]$ with $\bar{X}_n$ and solving for $\theta$ gives the method of moments estimator $$\hat{\theta}_n = \frac{\bar{X}_n-\mu_2}{\mu_1-\mu_2}.$$ I was also able to prove that this estimator is unbiased.
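As a quick numerical check of unbiasedness (taking both components to be normal purely for simulation — the problem only fixes their means and variances, so this is an illustrative assumption):

```python
import numpy as np

# Illustrative simulation: normal components are an assumption here; the
# problem only specifies the means and variances of W_1 and W_2.
rng = np.random.default_rng(0)
theta, mu1, mu2, s1, s2 = 0.3, 2.0, 5.0, 1.0, 1.5
n, reps = 200, 5000

# Each observation comes from f_1 with probability theta, else from f_2.
labels = rng.random((reps, n)) < theta
X = np.where(labels,
             rng.normal(mu1, s1, size=(reps, n)),
             rng.normal(mu2, s2, size=(reps, n)))

# Method of moments estimator, one value per replication.
theta_hat = (X.mean(axis=1) - mu2) / (mu1 - mu2)
print(theta_hat.mean())  # close to theta = 0.3, consistent with unbiasedness
```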
I'm now stuck on how to find the variance, the MSE, and the asymptotic normality result for this estimator.
$\textbf{EDIT : }$ I think I was able to calculate the variance and the MSE (which equals the variance since $\hat{\theta}_n$ is unbiased). Since $\hat{\theta}_n$ is a linear function of $\bar{X}_n$, $$\operatorname{Var}(\hat{\theta}_n) = \frac{\operatorname{Var}(X)}{n(\mu_1 - \mu_2)^2},$$ where, using $E[X^2] = \theta(\sigma^2_1 + \mu_1^2) + (1-\theta)(\sigma^2_2 + \mu_2^2)$, $$\operatorname{Var}(X) = \theta \sigma^2_1 + (1-\theta) \sigma^2_2 + \theta(1-\theta)(\mu_1 - \mu_2)^2.$$ But I'm not sure about this calculation.
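One way to sanity-check the variance is Monte Carlo: since $\hat{\theta}_n$ is linear in $\bar{X}_n$, its variance should equal $\operatorname{Var}(X)/\bigl(n(\mu_1-\mu_2)^2\bigr)$ with the mixture-variance identity $\operatorname{Var}(X) = \theta\sigma_1^2 + (1-\theta)\sigma_2^2 + \theta(1-\theta)(\mu_1-\mu_2)^2$. A sketch, again assuming normal components purely for simulation:

```python
import numpy as np

# Monte Carlo check of the closed-form variance of the MoM estimator.
# Normal components are an illustrative assumption.
rng = np.random.default_rng(1)
theta, mu1, mu2, s1, s2 = 0.4, 0.0, 3.0, 1.0, 2.0
n, reps = 100, 20000

labels = rng.random((reps, n)) < theta
X = np.where(labels,
             rng.normal(mu1, s1, size=(reps, n)),
             rng.normal(mu2, s2, size=(reps, n)))
theta_hat = (X.mean(axis=1) - mu2) / (mu1 - mu2)

emp_var = theta_hat.var(ddof=1)
# Var(X) = theta*s1^2 + (1-theta)*s2^2 + theta*(1-theta)*(mu1-mu2)^2
theo_var = (theta * s1**2 + (1 - theta) * s2**2
            + theta * (1 - theta) * (mu1 - mu2)**2) / (n * (mu1 - mu2)**2)
print(emp_var, theo_var)  # the two should agree closely
```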
I also tried to compute the Fisher information. The score is $$\frac{\partial}{\partial \theta}\log f(x;\theta) = \frac{f_1(x)-f_2(x)}{f(x;\theta)},$$ so $$I(\theta) = \int \frac{\bigl(f_1(x)-f_2(x)\bigr)^2}{f(x;\theta)}\,dx,$$ which does not simplify further without knowing $f_1$ and $f_2$. Fortunately, the Fisher information is not needed for the method of moments estimator: the CLT applied to $\bar{X}_n$ directly gives the asymptotic normality result $$\sqrt{n}\,\bigl(\hat{\theta}_n - \theta\bigr) \xrightarrow{d} N\!\left(0,\; \frac{\theta \sigma^2_1 + (1-\theta) \sigma^2_2 + \theta(1-\theta)(\mu_1-\mu_2)^2}{(\mu_1 - \mu_2)^2}\right).$$
I also have a second question: consider $\sigma = \sigma_1 = \sigma_2$, denote $\Delta = \mu_1 - \mu_2$, and assume $\Delta > 0$. How could one find an approximation for the probability that the method of moments estimator $\hat{\theta}_n$ differs from $\theta$ by more than a given amount $b > 0$?
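One natural route (a sketch, not necessarily the intended answer): by the CLT, $\hat{\theta}_n$ is approximately $N\!\bigl(\theta,\; (\sigma^2 + \theta(1-\theta)\Delta^2)/(n\Delta^2)\bigr)$ in the equal-variance case, so $$P\bigl(|\hat{\theta}_n - \theta| > b\bigr) \approx 2\,\Phi\!\left(-\,\frac{b\sqrt{n}\,\Delta}{\sqrt{\sigma^2 + \theta(1-\theta)\Delta^2}}\right).$$ In Python (parameter values are illustrative):

```python
import math

def tail_prob(theta, sigma, delta, n, b):
    """Normal approximation to P(|theta_hat - theta| > b), equal-variance case.

    By the CLT, theta_hat is approximately N(theta, var_x / (n * delta^2)),
    with var_x = sigma^2 + theta*(1-theta)*delta^2 the mixture variance.
    """
    var_x = sigma**2 + theta * (1 - theta) * delta**2
    se = math.sqrt(var_x / n) / delta          # standard error of theta_hat
    z = b / se
    return math.erfc(z / math.sqrt(2))         # erfc(z/sqrt(2)) = 2 * Phi(-z)

print(tail_prob(theta=0.3, sigma=1.0, delta=3.0, n=200, b=0.1))
```

As expected, the approximation shrinks as $n$ or $b$ grows, since the standard error of $\hat{\theta}_n$ decays like $1/\sqrt{n}$.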