
For simultaneous estimation of several parameters, the combination of the LSEs for each parameter is usually not admissible under squared error loss for the parameter vector.

For example, when estimating the mean vector $\theta$ of a multivariate Gaussian distribution $N({\boldsymbol \theta}, \sigma^2 I)$ with known $\sigma^2$, the combination of the LSEs for each dimension of $\theta$ is $X$, and the James–Stein estimator dominates $X$ under squared error loss when the dimension of $\theta$ is greater than $2$.
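
To make the dominance claim concrete, here is a minimal simulation sketch (not a proof; the dimension $p=10$, $\sigma=1$, $\theta=0$, and the single-observation setup are my own illustrative choices) comparing the empirical risk of $X$ with that of the James–Stein estimator $\left(1-\frac{(p-2)\sigma^2}{\|X\|^2}\right)X$:

```python
# Simulation sketch: empirical squared-error risk of the plain estimator X
# versus the James-Stein estimator for X ~ N(theta, sigma^2 I) in p dimensions.
# The parameter values below are illustrative assumptions, not from the post.

import numpy as np

rng = np.random.default_rng(0)

p = 10              # dimension (James-Stein requires p >= 3)
sigma = 1.0         # known noise standard deviation
theta = np.zeros(p)  # true mean; shrinkage gains are largest near the origin
n_rep = 100_000     # number of Monte Carlo replications

# One observation X per replication
X = theta + sigma * rng.standard_normal((n_rep, p))

# James-Stein estimator: shrink X toward the origin by 1 - (p-2)*sigma^2/||X||^2
norm_sq = np.sum(X**2, axis=1)
js = (1.0 - (p - 2) * sigma**2 / norm_sq)[:, None] * X

# Empirical risks (mean squared error of the whole vector)
risk_X = np.mean(np.sum((X - theta)**2, axis=1))
risk_js = np.mean(np.sum((js - theta)**2, axis=1))

print(f"empirical risk of X:           {risk_X:.3f}")   # close to p = 10
print(f"empirical risk of James-Stein: {risk_js:.3f}")  # strictly smaller for p > 2
```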

Is the James–Stein estimator admissible? If not,

  • what is the LSE in the multivariate case?

  • what is an admissible estimator under squared error loss for the mean of a multivariate Gaussian distribution?

Thanks and regards!

Tim
  • The Wikipedia article you reference states "It follows that the basic James–Stein estimator is itself inadmissible." Your second question is a duplicate. – whuber Jun 19 '13 at 14:45
  • Thanks, @whuber. The post you linked to is univariate, while my case here is multivariate. The LSE is admissible in the univariate normal case, but the combination of the LSEs for each dimension isn't the LSE in the multivariate case. My question is: what is the LSE in the multivariate case? What is an admissible estimator in the multivariate case? So they are different from the univariate case in your linked post. – Tim Jun 19 '13 at 14:56
  • I have edited my post to remove some ambiguity. – Tim Jun 19 '13 at 15:04
  • The constant is admissible in the multivariate case, by the same arguments given in the duplicate. – whuber Jun 19 '13 at 16:45
  • What is the LSE in the multivariate case, then? – Tim Jun 19 '13 at 16:49
  • It is given early in the Wikipedia article you link to. – whuber Jun 19 '13 at 16:50
  • $X$ is just the combination of the LSEs for each dimension. It is not the LSE under squared error loss in the multivariate case, is it? – Tim Jun 19 '13 at 16:52

0 Answers