Questions tagged [admissibility]

Admissible estimator: an estimator for which there is no other estimator whose risk is $\leq$ its risk for all possible true values of the target parameter and strictly smaller for at least one.

In statistical decision theory, an admissible decision rule is a rule for making a decision such that there is no other rule that is always "better" than it (or at least sometimes better and never worse). For an admissible estimator, there is no other estimator whose risk (i.e. expected loss over the sampling distribution of the data) is equal or smaller for every possible true value of the target parameter and strictly smaller for at least one.

Source: Wikipedia.
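The risk comparison in this definition can be made concrete with a small Monte Carlo sketch. Everything here is illustrative rather than taken from the questions below: the $N(\theta,1)$ model, the shrinkage factor 0.9, the sample size, and the function names are all assumptions chosen for the demo. It estimates the risk of two estimators of a normal mean and shows that neither dominates the other.

```python
import numpy as np

rng = np.random.default_rng(0)

def risk(estimator, theta, n=10, reps=20_000):
    """Monte Carlo estimate of R(theta, delta) = E[(delta(X) - theta)^2]
    for X_1, ..., X_n iid N(theta, 1) under squared error loss."""
    X = rng.normal(theta, 1.0, size=(reps, n))
    return float(np.mean((estimator(X) - theta) ** 2))

sample_mean = lambda X: X.mean(axis=1)
shrunk = lambda X: 0.9 * X.mean(axis=1)   # shrink toward 0 (illustrative)

# Neither estimator dominates the other: shrinking wins near theta = 0
# but loses for theta far from 0, so this comparison alone decides
# nothing about admissibility.
for theta in (0.0, 1.0, 3.0):
    print(theta, risk(sample_mean, theta), risk(shrunk, theta))
```

Dominance requires one estimator to be at least as good at *every* parameter value; a pointwise comparison like this can only ever rule dominance out, not establish admissibility.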

19 questions
18
votes
0 answers

Empirical Bayes (In)Admissibility

Most of the time, sticking to a pure Bayesian approach to statistics with proper priors leads to admissible estimators. Nevertheless, there is a good reason to use Empirical Bayes in many cases, and the frequentists are enjoying better accuracy…
Cagdas Ozgenc
  • 3,716
  • 2
  • 29
  • 55
13
votes
1 answer

Is $\frac1{n+1}\sum_{i=1}^n(X_i-\overline X)^2$ an admissible estimator for $\sigma^2$?

Consider a sample $X_1,X_2,\ldots,X_n$ from a univariate $N(\mu,\sigma^2)$ distribution where $\mu,\sigma^2$ are both unknown. Then it is known that under squared error loss, the sample variance $s^2=\frac1{n-1}\sum\limits_{i=1}^n (X_i-\overline…
StubbornAtom
  • 8,662
  • 1
  • 21
  • 67
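The family in this question can be explored numerically. A minimal sketch, with illustrative choices of $n$, $\sigma^2$, seed, and replication count (none of which come from the question), comparing the Monte Carlo MSE of $\frac1c\sum_{i=1}^n(X_i-\overline X)^2$ for the three common divisors:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps, sigma2 = 10, 200_000, 2.0

# Sum of squared deviations about the sample mean; its distribution
# is sigma^2 times chi-squared with n-1 degrees of freedom.
X = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
ss = ((X - X.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

# Monte Carlo MSE of (1/c) * ss for divisors n-1, n, n+1.
mse = {c: float(np.mean((ss / c - sigma2) ** 2)) for c in (n - 1, n, n + 1)}
print(mse)
```

Within this family the divisor $n+1$ minimizes the MSE for every $\sigma^2$ (the MSE scales as $\sigma^4$), so $s^2$ with divisor $n-1$ is dominated; whether the $n+1$ version is itself admissible is exactly what the question asks, and this sketch does not settle that.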
8
votes
1 answer

Model with admissible estimator(s) that are not the Bayes estimator for any choice of prior?

Every Bayes estimator is admissible, to the best of my knowledge. (Related questions - 1,2.) I recall my professor mentioning once during a lecture that, at least as rough intuition, the converse is true as well, that is, every admissible estimator…
Chill2Macht
  • 5,639
  • 4
  • 25
  • 51
8
votes
0 answers

Why is Wald's decision theory not universally recognized as the foundation of statistics?

This is somewhat ill-defined, but: Why is Wald's decision theory not universally recognized as the foundation of statistics? I gather (or maybe I infer) that it was formulated to put frequentist and Bayesian methods (or any other kind of methods)…
Adam L. Taylor
  • 531
  • 2
  • 4
5
votes
1 answer

Admissible Empirical Bayes Examples

I would like to hear about a few simple empirical Bayes estimators that are admissible in a high-dimensional (i.e. at least three-dimensional) parameter space. What are some textbook lollipop examples for beginners to study, with easy derivations?
5
votes
0 answers

Is an inadmissible estimator necessarily dominated by some admissible estimator?

Basic example: $X$ has a $p$-variate standard normal distribution; the sample mean is not admissible if $p>2$ and is dominated by the Stein shrinkage estimator. However, the Stein shrinkage estimator is also not admissible, and is dominated by…
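The dominance chain in this excerpt can be checked numerically. A minimal sketch under one illustrative choice of $p$, $\theta$, seed, and replication count (assumptions, not part of the question), comparing the MLE, the James-Stein estimator, and its positive-part version from a single $N_p(\theta, I)$ observation:

```python
import numpy as np

rng = np.random.default_rng(0)
p, reps = 5, 100_000
theta = np.full(p, 1.0)                       # true mean, ||theta||^2 = 5

X = rng.normal(theta, 1.0, size=(reps, p))    # one N_p(theta, I) draw per rep
norm2 = (X ** 2).sum(axis=1, keepdims=True)

estimators = {
    "mle": X,                                           # the raw observation
    "js": (1 - (p - 2) / norm2) * X,                    # James-Stein
    "js+": np.maximum(0.0, 1 - (p - 2) / norm2) * X,    # positive-part JS
}
# Monte Carlo risk under summed squared error.
risk = {k: float(np.mean(((v - theta) ** 2).sum(axis=1)))
        for k, v in estimators.items()}
print(risk)
```

At this single parameter value both shrinkage estimators beat the MLE (whose risk is exactly $p$), and the positive-part version does at least as well as plain James-Stein, consistent with the chain described in the question; a one-point check of course does not prove dominance over the whole parameter space.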
5
votes
1 answer

Admissibility of Bayes estimators

I have the following questions. They are not homework problems, but they are things that the professor said that I should wonder about. I suspect that I will have to deal with this on an exam in the future. So my questions are: Is a limiting Bayes…
shmiggens
  • 215
  • 1
  • 5
5
votes
1 answer

Showing that $\bar{X}$ is inadmissible by comparing it with $\max(\bar{X},2)$ under squared error loss

Suppose $X_1,X_2,\ldots,X_n$ is a random sample from $N(\theta,1)$ with $\theta>2$. How can I show that $\bar{X}$ is an inadmissible estimator compared to $\max(\bar{X},2)$ under squared error loss?
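The inequality behind this question is easy to see numerically. A minimal sketch with illustrative $n$, seed, and replication count (assumptions, not from the question): since $\theta > 2$, whenever $\bar X < 2$ the truncated estimator $\max(\bar X, 2)$ is strictly closer to $\theta$, so its per-sample squared error is never larger.

```python
import numpy as np

def risks(theta, n=5, reps=200_000, seed=0):
    """Monte Carlo risks of Xbar and max(Xbar, 2) under squared error,
    for X_1, ..., X_n iid N(theta, 1); Xbar ~ N(theta, 1/n)."""
    rng = np.random.default_rng(seed)
    xbar = rng.normal(theta, 1.0 / np.sqrt(n), size=reps)
    r_plain = float(np.mean((xbar - theta) ** 2))
    r_trunc = float(np.mean((np.maximum(xbar, 2.0) - theta) ** 2))
    return r_plain, r_trunc

# The truncated risk is never larger, and is strictly smaller wherever
# P(Xbar < 2) is non-negligible (theta near 2); for large theta the two
# risks coincide to Monte Carlo accuracy.
for theta in (2.1, 2.5, 4.0):
    print(theta, risks(theta))
```

The pointwise comparison (sample by sample, not just on average) is what makes the dominance argument work on the restricted space $\theta > 2$.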
4
votes
1 answer

How to choose loss function (in unbounded parameter space)?

How does one choose a loss function for a given problem? (I've looked through stackexchange, and I haven't been able to find a thread that discusses this.) Let's say I observe some data $x \in \mathbb{R}^n$, and I'm interested in estimating some…
4
votes
2 answers

Admissibility under the loss function

Suppose $X_1, \ldots, X_n$ is a random sample from an exponential distribution with mean $\theta$. Determine $a$ and $b$ such that $a\sum_{i=1}^n X_i + b$ is admissible under the loss function $L(\theta,\delta)=\frac{(\delta - \theta)^2}{\theta^2}$. All my…
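The risk in this question can be explored numerically before attempting the admissibility argument. A minimal sketch with illustrative $n$, seed, and coefficient values (the particular $a$'s compared are just examples, not a claimed answer): with $b = 0$, the scaled loss makes the risk of $aT$, $T=\sum_i X_i$, free of $\theta$, namely $R = a^2 n + (an-1)^2$.

```python
import numpy as np

def scaled_risk(a, b, theta, n=8, reps=200_000, seed=0):
    """Monte Carlo risk of a*T + b, T = sum of n iid Exponential(mean theta),
    under the scaled loss L = (delta - theta)^2 / theta^2."""
    rng = np.random.default_rng(seed)
    T = rng.gamma(n, theta, size=reps)   # sum of n exponentials is Gamma(n, theta)
    return float(np.mean(((a * T + b - theta) / theta) ** 2))

# With b = 0 the risk is constant in theta: a^2*n + (a*n - 1)^2,
# which is minimized over a at a = 1/(n+1), giving risk 1/(n+1).
for theta in (0.5, 1.0, 5.0):
    print(theta, scaled_risk(1 / 9, 0.0, theta), scaled_risk(1 / 8, 0.0, theta))
```

A constant risk function is a useful starting point for admissibility arguments (e.g. via constant-risk Bayes or limiting-Bayes reasoning), but the sketch itself only computes risks; it does not prove admissibility.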
4
votes
1 answer

On the proof of admissibility of constant estimators under squared loss

The question concerns the discussion in Wasserman, All of Statistics, Section 13.6. He defines: An estimator $\hat{\theta}$ is inadmissible if there exists another rule $\hat{\theta}'$ such that $$R(\theta, \hat{\theta}') \leq R(\theta,…
Christoph Hanck
  • 25,948
  • 3
  • 57
  • 106
3
votes
2 answers

Admissible Bayes Rule

In the following wikipedia entry https://en.wikipedia.org/wiki/Admissible_decision_rule it is written that "Bayes rules with respect to proper priors are virtually always admissible" What do they mean by "virtually always"? I know that it needs to…
Cagdas Ozgenc
  • 3,716
  • 2
  • 29
  • 55
3
votes
1 answer

Why is the MLE/OLS estimator so common in regression despite inadmissibility?

Why is regression so commonly used if the OLS estimator for the vector of regression coefficients is inadmissible under the squared error loss function? Is it because of its historical popularity or the ease of computation or some other practical…
frelk
  • 1,117
  • 1
  • 8
  • 19
3
votes
2 answers

Admissible Estimator for Linear Regression

Is there an admissible estimator for a linear regression model with many parameters without restricting the parameter space? Admissibility will be with respect to Mean Square Error on the regression parameter vector. It seems that James-Stein…
2
votes
1 answer

Is a constant ever inadmissible?

For now, assume square loss. Let's estimate some parameter $\theta$, such as $\theta = \mu$ in $N(\mu, 1)$. Is there ever a case where there is no such $c$ to make $\hat{\theta} = c$ an admissible estimator of $\theta$? (Bonus question: What about…
Dave
  • 28,473
  • 4
  • 52
  • 104