
I am working on some sample questions, and I came across one I have no clue how to answer.

How do the standard errors of $\widehat\theta_1$, $\widehat\theta_2$, and $\widehat\theta_1 + \widehat\theta_2$ relate to each other?

I know that the standard error is, roughly, the standard deviation of an estimator... I can't find anything in my book that relates to this question, or at least I can't see the connection...

So I know that the estimator $\widehat\theta_i$ has a distribution, and the standard deviation of this distribution is called the standard error. So $se = \sqrt{var}$ of the distribution of $\widehat\theta_i$.

So I would answer the question like this:

$$\sqrt{\operatorname{Var}(\widehat\theta_1 + \widehat\theta_2)}=\sqrt{\operatorname{Var}(\widehat\theta_1)+\operatorname{Var}(\widehat\theta_2)+ 2\operatorname{Cov}(\widehat\theta_1,\widehat\theta_2)}$$

I'm not sure if this is correct, and it does not tell me how it relates to the standard errors of $\widehat\theta_1$ and $\widehat\theta_2$.
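
As a sanity check, here is a small simulation sketch (the bivariate normal parameters below are just arbitrary choices to produce correlated estimates); the two sides of the formula seem to agree numerically:

```python
import numpy as np

rng = np.random.default_rng(42)

# Arbitrary choice: draw the pair (theta1_hat, theta2_hat) from a
# bivariate normal with nonzero covariance, to mimic correlated estimators.
mean = [0.0, 0.0]
cov = [[2.0, 0.8],
       [0.8, 1.5]]
theta = rng.multivariate_normal(mean, cov, size=1_000_000)
t1, t2 = theta[:, 0], theta[:, 1]

# Left-hand side: SE of the sum, estimated directly.
lhs = np.std(t1 + t2)

# Right-hand side: sqrt(var + var + 2*cov).
rhs = np.sqrt(np.var(t1) + np.var(t2) + 2 * np.cov(t1, t2)[0, 1])

print(lhs, rhs)  # both close to sqrt(2.0 + 1.5 + 2*0.8) = sqrt(5.1) ≈ 2.258
```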

Lillys
  • Does your book (a) explain that the standard deviation is the square root of the variance and (b) provide some simple rules for computing variances of sums of random variables? (If it doesn't, then this question would be inappropriate for your book anyway.) – whuber Dec 27 '18 at 18:32
  • Please add the `[self-study]` tag & read its [wiki](https://stats.stackexchange.com/tags/self-study/info). Then tell us what you understand thus far, what you've tried & where you're stuck. We'll provide hints to help you get unstuck. – gung - Reinstate Monica Dec 27 '18 at 18:35
  • @whuber The sum of two normally distributed RVs is also normally distributed, and var(X+Y) = var(X) + var(Y) for independent X and Y, but how does this relate to the standard error? Sorry if my questions sound stupid – Lillys Dec 27 '18 at 18:41
  • @gung Not that much. I know the definition of the standard error as the standard deviation of an estimator. I read something about the standard error of the mean, but that is not the sum of two estimators. I know the definition of a consistent estimator: as n goes to infinity, the estimator converges to the parameter. (Sorry for my English) – Lillys Dec 27 '18 at 18:45
  • Please read our threads about covariances and variances. Some of the first hits on a search are https://stats.stackexchange.com/questions/149656, https://stats.stackexchange.com/questions/261872, and https://stats.stackexchange.com/questions/189408. This question is only about basic properties of variances--it is not specific to Normally distributed variables, estimators, limits, or anything that advanced. If that's not clear, then take it as a reminder to review the definition and concept of the standard error. – whuber Dec 27 '18 at 18:45
  • @whuber Many thanks, I will read those threads. Could you give me a link on how those articles correspond to the standard error? I think I'm missing something here... – Lillys Dec 27 '18 at 18:48
  • By definition, the standard error is the square root of the variance of the estimator. – whuber Dec 27 '18 at 18:50
  • 1
    Can we always use the var(a+b) = var(a) + var(b) + 2 cov(a,b) formula when those three terms on the right hand side are estimates? – Sextus Empiricus Dec 27 '18 at 18:59
  • @MartijnWeterings I would guess yes, because the distribution of an estimator is "just" a distribution like any other...? – Lillys Dec 27 '18 at 19:04
  • 2
    I don't think this question is "stupid". Please add the `[self-study]` tag & add your understanding from your comments to the body of the question. – gung - Reinstate Monica Dec 27 '18 at 19:36
  • @gung Thanks, I added the self-study tag and what I know so far to the question – Lillys Dec 27 '18 at 20:56
  • If you were to take your final formula (which looks good, btw) and replace each instance of "$\operatorname{Var}(\cdot)$" by "$\operatorname{se}(\cdot)^2$" with $\cdot$ replaced by $\hat\theta_1,$ $\hat\theta_2,$ and $\hat\theta_1+\hat\theta_2,$ you would have the relationship you seek. In general, you cannot get rid of the covariance term. – whuber Dec 27 '18 at 21:04
  • Answering my own question. The formula should be correct (in the sense that at least it is unbiased): $$E( \hat\sigma_{11}^2 + \hat\sigma_{22}^2+ 2 \hat\sigma_{12}^2) = E(\hat\sigma_{11}^2) + E(\hat\sigma_{22}^2) + E(2 \hat\sigma_{12}^2) = \sigma_{11}^2 + \sigma_{22}^2 + 2\sigma_{12}^2 = \operatorname{Var}(a+b)$$ (see the simulation sketch after these comments) – Sextus Empiricus Dec 27 '18 at 22:42
  • I don't see the `[self-study]` tag on your question. Not to be a pest, here, but please add it & read its [wiki](https://stats.stackexchange.com/tags/self-study/info); this is our policy. When you click 'edit', scroll down below the text block & you'll see a 1-line field where you can enter tags. Right now, you only have `[standard-error]`. – gung - Reinstate Monica Dec 28 '18 at 02:59
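
A minimal simulation sketch of the unbiasedness check in Sextus Empiricus's comment above (all distributional choices are arbitrary): plugging the unbiased sample variances and covariance into the formula and averaging over many replications recovers $\operatorname{Var}(a+b)$.

```python
import numpy as np

rng = np.random.default_rng(7)

# Arbitrary setup: (a, b) bivariate normal with true Var(a+b) = 2.0 + 1.5 + 2*0.8 = 5.1.
mean = [0.0, 0.0]
cov = [[2.0, 0.8],
       [0.8, 1.5]]
n, reps = 30, 50_000

plug_in = np.empty(reps)
for r in range(reps):
    a, b = rng.multivariate_normal(mean, cov, size=n).T
    s = np.cov(a, b)  # unbiased sample variances/covariance (ddof=1)
    plug_in[r] = s[0, 0] + s[1, 1] + 2 * s[0, 1]

print(plug_in.mean())  # close to 5.1, consistent with unbiasedness
```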

1 Answer


Some hints: Here are some special cases for you to consider. Suppose you have a random sample $X_1, \dots, X_8$ (so that $n=8),$ from a population with unknown mean $\mu$ (to be estimated) and known standard deviation $\sigma > 0.$

Two biased, independent estimators.

Let $\hat \mu_1 = (X_1 + X_2 + X_3 + X_4)/n,$ so that $E(\hat \mu_1) = \mu/2,$ variance $V(\hat \mu_1) = \frac{1}{64}(4\sigma^2) = \sigma^2/16,$ and $SE(\hat \mu_1) = \sigma/4.$

Notice that $E(\hat \mu_1) = \mu/2 \ne \mu,$ so that $\hat \mu_1$ is a seriously biased estimator of $\mu.$ But your Question says "estimator" not "good estimator."

Similarly, let $\hat \mu_2 = (X_5 + X_6 + X_7 + X_8)/n,$ so that $E(\hat \mu_2) = \mu/2,$ variance $V(\hat \mu_2) = \frac{1}{64}(4\sigma^2) = \sigma^2/16,$ and $SE(\hat \mu_2) = \sigma/4.$

Notice that the estimators $\hat \mu_1$ and $\hat \mu_2$ are independent because they use different elements of the random sample of size $n = 8.$ Then $$V(\hat \mu_1 + \hat \mu_2) = V(\hat \mu_1) + V(\hat \mu_2) = \sigma^2/8$$ and $SE(\hat \mu_1 + \hat \mu_2) = \sigma/\sqrt{8} > \sigma/4.$ Thus the SE of the sum is larger than the SEs of $\hat \mu_1$ and $\hat \mu_2.$

Nevertheless, $\bar X = \hat \mu_1 + \hat \mu_2$ is considered to be a better estimator of $\mu$ than either $\hat \mu_1$ or $\hat \mu_2,$ partly because it is unbiased, $E(\bar X) = \mu.$
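
A brief simulation sketch to check these values (my choices $\mu = 10$ and $\sigma = 4$ below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2018)

# Arbitrary choices: mu = 10, sigma = 4, so sigma/4 = 1 and sigma/sqrt(8) ≈ 1.414.
mu, sigma, n, reps = 10.0, 4.0, 8, 100_000

x = rng.normal(mu, sigma, size=(reps, n))
mu1_hat = x[:, :4].sum(axis=1) / n   # (X1 + X2 + X3 + X4)/8
mu2_hat = x[:, 4:].sum(axis=1) / n   # (X5 + X6 + X7 + X8)/8

print(mu1_hat.mean())             # ≈ mu/2 = 5   (biased)
print(mu1_hat.std())              # ≈ sigma/4 = 1
print((mu1_hat + mu2_hat).std())  # ≈ sigma/sqrt(8) ≈ 1.414
```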

Two estimators that are not independent.

Based on the same data, let $\hat \mu_3 = \bar X$ and $\hat \mu_4 = -\bar X.$ These estimators are obviously not independent. Also, for many distributions, $\hat \mu_4$ is an undesirable estimator, but an estimator nevertheless. I will leave the details of finding $SE(\hat \mu_3),\; SE(\hat \mu_4),$ and $SE(\hat \mu_3 + \hat \mu_4)$ to you. But you will find that the third SE is smallest.
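
If you want to check your hand computations by simulation, here is a minimal sketch (again with the arbitrary choices $\mu = 10$ and $\sigma = 4$):

```python
import numpy as np

rng = np.random.default_rng(2019)

mu, sigma, n, reps = 10.0, 4.0, 8, 100_000

x = rng.normal(mu, sigma, size=(reps, n))
mu3_hat = x.mean(axis=1)   # X-bar
mu4_hat = -mu3_hat         # -X-bar, perfectly dependent on mu3_hat

print(mu3_hat.std())              # SE(mu3_hat)
print(mu4_hat.std())              # SE(mu4_hat)
print((mu3_hat + mu4_hat).std())  # SE(mu3_hat + mu4_hat): compare with the two above
```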

BruceET