
Suppose $X_1,X_2,\ldots,X_n$ is a random sample from $N(\theta,1)$, where $\theta>2$. How can I show that $\bar{X}$ is an inadmissible estimator compared to $\max(\bar{X},2)$ under the squared error loss function?

user603
marzieh
    One approach is to note that for the second estimator the losses that would be attached to estimates less than $2$ are limited by the loss at $\bar X = 2$. You can then reason that for any value of $\theta \gt 2$ the loss function for $\bar X$ will strictly dominate that for $\max(\bar X, 2)$, whence the expectation will be larger, leading to a dominated risk. I recommend plotting some of these loss functions and the ensuing risk functions to help you visualize this argument. – whuber Apr 01 '15 at 20:29
  • Related: https://stats.stackexchange.com/questions/347403/showing-mse-of-barx-mathbf1-barx0-is-less-than-that-of-bar-x-when?noredirect=1&lq=1, https://stats.stackexchange.com/questions/58626/how-to-show-that-e-hat-theta-theta2var-bar-x-dfrac1n?noredirect=1&lq=1. – StubbornAtom Oct 30 '19 at 20:18

1 Answer


$%preamble \newcommand\Var{\mathrm{Var}} \newcommand\E{\mathrm{E}} \newcommand\Bias{\mathrm{Bias}} \newcommand\htheta{\hat\theta} $ Let $X$ be an (integrable) random variable and $c$ a constant; then flesh out the derivation

$$\E[(X-c)^2] = \E[((X-\E[X])-(c-\E[X]))^2] = \Var(X)+(\E[X]-c)^2$$
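One way to flesh it out: expand the square and note that the cross term vanishes because $\E[X-\E[X]] = 0$,

$$\E[((X-\E[X])-(c-\E[X]))^2] = \E[(X-\E[X])^2] - 2(c-\E[X])\,\E[X-\E[X]] + (c-\E[X])^2 = \Var(X) + (\E[X]-c)^2.$$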

Then, in particular, reason and memorize that if $\hat\theta$ is an estimator for $\theta$ then

$$\E[(\htheta-\theta)^2] = \Var[\htheta] + \Bias(\htheta,\theta)^2$$

In the case of $\htheta_1 = \bar{X}$, determine $\Var[{\htheta_1}]$ as a function of $n$ and verify $\Bias(\htheta_1,\theta)^2 = 0$.
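For checking your work: $\Var[\htheta_1] = \Var\!\left[\frac{1}{n}\sum_{i=1}^n X_i\right] = \frac{1}{n}$ and $\E[\htheta_1] = \theta$, so $\Bias(\htheta_1,\theta) = 0$ and $\E[(\htheta_1-\theta)^2] = \frac{1}{n}$.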

It should be somewhat intuitive that the estimator $\htheta_2 = \max(2,\htheta_1)$ is biased but has lower variance; it is not immediately clear, however, that the total MSE is lower.
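If you want to see the effect numerically before proving it, here is a minimal Monte Carlo sketch (the sample size $n=5$, the grid of $\theta$ values, and the replication count are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 200_000  # illustrative sample size and number of replications

for theta in [2.1, 2.5, 3.0]:
    # Sample means of n i.i.d. N(theta, 1) draws, replicated reps times
    xbar = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)
    est2 = np.maximum(xbar, 2.0)             # the competing estimator max(xbar, 2)
    mse1 = np.mean((xbar - theta) ** 2)      # estimated risk of xbar
    mse2 = np.mean((est2 - theta) ** 2)      # estimated risk of max(xbar, 2)
    print(f"theta={theta}: MSE(xbar)={mse1:.4f}  MSE(max(xbar,2))={mse2:.4f}")
```

You should see the second column come out smaller for every $\theta > 2$, with the gap largest when $\theta$ is close to $2$.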


Recall the laws of total expectation and total variance: if $X$ and $Y$ are random variables (with finite expectation and variance), then

$$\E[X] = \E[\E[X|Y]] $$

$$\Var[X] = \Var[\E[X|Y]] + \E[\Var[X|Y]]$$


Let $I$ be a subset of the probability space (an event). Then, using the above rules and an indicator random variable, see that

$$\E[X] = \E[X|I] P[I] + \E[X|I^c] P[I^c]$$

$$\Var[X] = (\Var[X|I] P[I] + \Var[X|I^c] P[I^c]) + (\E[X|I]-\E[X|I^c])^2 P[I]P[I^c]$$
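To see where the second formula comes from, condition on the indicator $\mathbf{1}_I$: the random variable $\E[X\mid\mathbf{1}_I]$ takes the value $\E[X|I]$ with probability $P[I]$ and $\E[X|I^c]$ with probability $P[I^c]$, so

$$\Var[\E[X\mid\mathbf{1}_I]] = (\E[X|I]-\E[X|I^c])^2\, P[I]\,P[I^c], \qquad \E[\Var[X\mid\mathbf{1}_I]] = \Var[X|I]\,P[I] + \Var[X|I^c]\,P[I^c].$$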


In particular, let $I$ be the event $\{\htheta_1 > 2\}$, and use the above for $\htheta_1$ and $\htheta_2$ to determine expressions for their variance and squared bias in terms of quantities conditioned on $I$ and $I^c$. Since both estimators coincide conditional on $I$, you should see the opportunity to make the necessary inequality (one way to start is sketched below).
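A sketch of the comparison on $I^c$, along the lines of whuber's comment above: on $I^c$ we have $\htheta_1 = \bar X \le 2 < \theta$ while $\htheta_2 = 2$, so pointwise

$$(\htheta_2 - \theta)^2 = (2-\theta)^2 \le (\bar X - \theta)^2 = (\htheta_1 - \theta)^2,$$

with strict inequality whenever $\bar X < 2$, an event of positive probability. Taking expectations conditional on $I^c$, and combining with the equality of the two estimators on $I$, gives $\E[(\htheta_2-\theta)^2] < \E[(\htheta_1-\theta)^2]$ for every $\theta > 2$, which is exactly the domination that makes $\bar X$ inadmissible.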

A. Webb