
Let $\{X_i : i = 1, 2, \dots, n\}$ be independent random variables with finite second moments. They are not necessarily identically distributed: they share the same mean, $\mu = E(X_i)$ for all $i$, but may have different variances $\sigma^2_i > 0$, which are assumed known. Let $p$ be the estimator that solves $\min_m \sum_{i=1}^n (X_i - m)^2/\sigma^2_i$, i.e., each squared deviation is weighted by the inverse of its variance. Show that $p = \sum_i w_i X_i$ for weights $w_i > 0$ such that $\sum_i w_i = 1$.

I know we must find the weights, and I get them as follows: $w_i = \dfrac{1/\sigma^2_i}{\sum_{j=1}^n 1/\sigma^2_j}$.
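A quick numerical sanity check of these weights (a sketch, assuming NumPy; the data values and variances below are hypothetical, chosen only for illustration): compute the proposed weighted mean and compare it against a brute-force grid minimization of the objective.

```python
import numpy as np

# Hypothetical data: five independent observations with known, unequal variances.
x = np.array([2.1, 1.8, 2.5, 1.9, 2.2])
sigma2 = np.array([1.0, 0.5, 2.0, 0.25, 1.5])

# Proposed weights: w_i proportional to 1/sigma_i^2, normalized to sum to 1.
w = (1.0 / sigma2) / np.sum(1.0 / sigma2)
p = np.sum(w * x)  # the candidate estimator

# Brute-force check: evaluate sum_i (x_i - m)^2 / sigma_i^2 on a fine grid of m
# and locate the grid point that minimizes it.
grid = np.linspace(x.min(), x.max(), 100001)
obj = ((x[:, None] - grid[None, :]) ** 2 / sigma2[:, None]).sum(axis=0)
m_star = grid[np.argmin(obj)]

print(w.sum())      # should be 1 (up to floating-point error)
print(p, m_star)    # the two values should agree to grid resolution
```

With these made-up numbers the weighted mean and the grid minimizer coincide to within the grid spacing, which is what the claim predicts.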

Is this right?

Thanks!

jbowman
  • Please see https://stats.stackexchange.com/questions/243922/how-to-estimate-population-variance-from-multiple-samples/246537#246537. – whuber Nov 07 '18 at 20:00
  • I did see. While it is similar, the other question relates to estimating the variance. I just want to know whether my answer is right. – Aishwarya Deore Nov 08 '18 at 14:54
  • Also, how would you prove that these weights are optimal? I know we have to set up the Lagrangian and differentiate with respect to each $w_i$, but I am stuck at the simplification part. – Aishwarya Deore Nov 09 '18 at 15:32
  • You can use elementary inequalities to demonstrate optimality, or even Euclidean geometry. I give the geometric argument at https://stats.stackexchange.com/a/9073/919. – whuber Nov 09 '18 at 15:46
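Regarding the optimality question raised in the comments, a sketch of a direct argument (not an official answer; note that the minimization over $m$ is unconstrained, so no Lagrangian is actually needed): setting the derivative of the objective to zero gives

$$\frac{d}{dm}\sum_{i=1}^n \frac{(X_i - m)^2}{\sigma^2_i} = -2\sum_{i=1}^n \frac{X_i - m}{\sigma^2_i} = 0 \quad\Longrightarrow\quad m\sum_{i=1}^n \frac{1}{\sigma^2_i} = \sum_{i=1}^n \frac{X_i}{\sigma^2_i},$$

so $p = \sum_i w_i X_i$ with $w_i = (1/\sigma^2_i)\big/\sum_j (1/\sigma^2_j)$, which matches the weights proposed in the question. Since the second derivative is $2\sum_i 1/\sigma^2_i > 0$, this stationary point is the global minimum.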

0 Answers