
For now, assume square loss. Let's estimate some parameter $\theta$, such as $\theta = \mu$ in $N(\mu, 1)$.

Is there ever a case where there is no such $c$ to make $\hat{\theta} = c$ an admissible estimator of $\theta$? (Bonus question: What about if we relax the assumption of square loss?)

I lean toward no, at least for square loss. If we decompose the mean squared error, we get the squared bias plus the variance of $\hat{\theta}$:

$$ MSE(\hat{\theta}) = \big(\mathbb{E}[\hat{\theta}] - \theta\big)^2 + var(\hat{\theta}) $$

For the constant estimator $\hat{\theta} = c$, the variance is zero, and the bias is zero when $\theta = c$ (perhaps a ridiculous notion in applied statistics, but completely legitimate in mathematical statistics). Thus, $MSE(\hat{\theta}) = 0$ when $\theta = c$.

I struggle to see how any other estimator could match $\hat{\theta} = c$ when $c = \theta$.
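
As a quick numerical illustration, here is a minimal simulation sketch comparing the constant estimator against the sample mean when the constant happens to hit the truth; the values $\theta = c = 2$, $n = 25$, and the seed are arbitrary choices for illustration, not part of the problem:

```python
import numpy as np

# Minimal sketch: constant estimator vs. sample mean for N(theta, 1).
# theta, c, n, and the seed are arbitrary illustrative choices.
rng = np.random.default_rng(0)
theta = 2.0        # true mean
c = 2.0            # constant estimate; here it happens to equal theta
n, reps = 25, 100_000

samples = rng.normal(theta, 1.0, size=(reps, n))
mse_mean = np.mean((samples.mean(axis=1) - theta) ** 2)  # should be ~ 1/n
mse_const = (c - theta) ** 2                             # exactly 0 when c == theta

print(f"MSE of sample mean: {mse_mean:.5f} (theory: {1/n:.5f})")
print(f"MSE of constant c:  {mse_const:.5f}")
```

The sample mean's MSE comes out near the theoretical $1/n$, while the constant estimator's is exactly zero at $\theta = c$, which is the intuition behind the question.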

Dave
  • What definition of "admissible" are you considering? The one I know does not allow a single equality like the one you exhibit to prevent inadmissibility. – whuber Jun 08 '21 at 20:53
  • @whuber I would say that the risk function (so $MSE$ for square loss) has to be dominated ($\le$ everywhere, $<$ somewhere) for the estimator to be inadmissible. – Dave Jun 08 '21 at 20:57
  • That's my understanding. Your constant estimator is not necessarily admissible, then. – whuber Jun 08 '21 at 21:00
  • @whuber What would be an example? – Dave Jun 08 '21 at 21:05
  • Take $X\sim\mathcal B(p)$ with $p\in\{0,1\}$ and $\delta(x)=x$. Then $\delta$ dominates both constant estimators. – Xi'an Jun 09 '21 at 07:36
  • @Xi’an Why is the $p$ of the Bernoulli restricted to $0$ and $1$? – Dave Jun 09 '21 at 10:13
  • Okay, so we assume that it is a heads-heads or tails-tails coin, just not which. If we guess whichever face comes up after the flip, that is a better way of guessing than always saying it is heads or tails, regardless of the outcome of the flip. – Dave Jun 09 '21 at 10:40
  • Another (artificial) counterexample is observing $X\sim U(\theta-1/4,\theta+1/4)$ when $\theta\in\mathbb N$. In this case, $\delta(x)=\lfloor x\rceil$ (rounding to the nearest integer) has a risk that is uniformly equal to zero. – Xi'an Jun 10 '21 at 07:08

1 Answer


EXAMPLE OF INADMISSIBLE CONSTANT ESTIMATOR

Assume a heads-heads or tails-tails coin; we just do not know which. If we guess whichever face comes up after the flip, that is a better way of guessing than always saying it is heads (or tails) regardless of the outcome of the flip.

Let $X\sim\text{Bernoulli}(p)$, with $p\in\{0, 1\}$. That is, we flip a coin that we know is heads-heads or tails-tails; we just do not know which. Let's estimate $p$. (That is, let's determine if the coin is heads-heads or tails-tails.)

My candidate constant estimators are $\delta_0(X) = 0$ and $\delta_1(X) = 1$. These have $MSE = 0$ at $p = 0$ and $p = 1$, respectively, but each has $MSE = 1$ at the other value of $p$.

However, $\delta(X) = X$ is another estimator. That is, look at the coin. If the coin comes up heads ($1$), the coin is heads-heads. If the coin comes up tails ($0$), the coin is tails-tails.

$$bias(\delta) = 0 \quad \forall\, p\in\{0, 1\}$$

$$var(\delta) = p(1-p) = 0 \quad \forall\, p\in\{0, 1\}$$

$$MSE(\delta) = \big(bias(\delta)\big)^2 + var(\delta) = 0 \quad \forall\, p\in\{0, 1\}$$

Thus, $\delta$ beats $\delta_0$ when $p=1$ and beats $\delta_1$ when $p=0$, while $\delta$ ties $\delta_0$ when $p=0$ and ties $\delta_1$ when $p=1$.

Since we restrict $p$ to $p\in\{0, 1\}$, $\delta_c(X) = c$ is biased for every $c\notin\{0,1\}$, so those estimators lose to $\delta$ despite their zero variance.

$\delta$ achieves the interesting feat of being unbiased and having zero variance for all $p\in\{0,1\}$. The constant estimators all have bias for some $p\in\{0,1\}$. Therefore, there is no $c$ such that $\delta_c(X) = c$ is admissible.
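
Since $p\in\{0,1\}$ forces $X = p$ with probability one, the whole risk table can be verified by direct enumeration. Here is a minimal Python sketch; the estimator labels are my own:

```python
# Exact risk table over p in {0, 1}: since p is 0 or 1, X equals p with
# probability 1, so each estimator's MSE is just (estimator(p) - p)^2.
estimators = {
    "delta_0 (always 0)": lambda x: 0,
    "delta_1 (always 1)": lambda x: 1,
    "delta   (X itself)": lambda x: x,
}
for name, est in estimators.items():
    risks = {p: (est(p) - p) ** 2 for p in (0, 1)}
    print(name, risks)
# delta_0 (always 0) {0: 0, 1: 1}
# delta_1 (always 1) {0: 1, 1: 0}
# delta   (X itself) {0: 0, 1: 0}
```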

I liked what Xi'an posted in the comment and wanted to expand on it in a self-answer.
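
Xi'an's second comment, the $U(\theta-1/4,\theta+1/4)$ example with $\theta\in\mathbb N$, can be checked the same way. A minimal sketch, where the particular $\theta$ values and seed are arbitrary illustrative choices:

```python
import numpy as np

# Numerical check of the U(theta - 1/4, theta + 1/4) example: every draw lies
# within 1/4 of the integer theta, so rounding to the nearest integer recovers
# theta exactly and the risk is uniformly zero.
rng = np.random.default_rng(0)
for theta in (1, 2, 7):
    x = rng.uniform(theta - 0.25, theta + 0.25, size=100_000)
    delta = np.rint(x)                           # nearest-integer estimator
    print(theta, np.mean((delta - theta) ** 2))  # prints 0.0 each time
```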

Dave