
Take the random sample $X_1, \dots, X_n$ with mean $\mu$ and variance $\sigma^2 < \infty$. Now assume the $X_i$ are Poisson random variables with parameter $\lambda$. I am told that the Lehmann-Scheffé theorem directly implies the identity $E[S^2 \mid \bar{X}] = \bar{X}$, where $S^2 = \dfrac{1}{n - 1} \sum_{i = 1}^n \left( X_i - \bar{X} \right)^2$ and $\bar{X} = \dfrac{1}{n} \sum_{i = 1}^n X_i$. And when I say 'directly', I mean in the sense that no derivation is necessary (I already know that it can be derived, so that's not the part I'm interested in). I would like to see how this is so.

I am currently studying the textbook All of Statistics: A Concise Course in Statistical Inference by Larry Wasserman. From what I can tell, there is (surprisingly!) no mention of Lehmann-Scheffé in this textbook (I also checked the index).

Furthermore, the Wikipedia page for the Lehmann-Scheffé theorem isn't clear on this either.

So how exactly does the Lehmann-Scheffé theorem directly imply the identity $E[S^2 \mid \bar{X}] = \bar{X}$?

The Pointer
    This is related to your other question, https://stats.stackexchange.com/questions/519714/trying-to-make-sense-of-claims-regarding-rao-blackwell-and-lehmann-scheff%c3%a9-for-s?rq=1, but there at least you specify that the $X_i$ are Poisson. – kjetil b halvorsen Apr 21 '21 at 11:49

3 Answers


Here is the Wikipedia version of the

Lehmann-Scheffé Theorem. Let $\vec{X} = (X_1, X_2, \dots, X_n)$ be a random sample from a distribution that has p.d.f. (or p.m.f. in the discrete case) $f(x;\theta)$, where $\theta \in \Omega$ is a parameter in the parameter space. Suppose $Y = u(\vec{X})$ is a sufficient statistic for $\theta$, and let $$\{ f_Y(y;\theta): \theta \in \Omega\}$$ be a complete family. If $\varphi$ is such that $$\operatorname{E}[\varphi(Y)] = \theta \quad \text{for all } \theta \in \Omega,$$ then $\varphi(Y)$ is the unique MVUE of $\theta$.
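
In the Poisson model of the question, the hypotheses of the theorem hold by two standard facts, sketched here for completeness. Sufficiency of $\bar X$ follows from the factorization $$\prod_{i=1}^n \frac{\lambda^{x_i} e^{-\lambda}}{x_i!} = \lambda^{\sum_i x_i}\, e^{-n\lambda} \prod_{i=1}^n \frac{1}{x_i!},$$ which depends on $\lambda$ only through $\sum_i x_i$ (equivalently, through $\bar x$). Completeness holds because $T = \sum_i X_i \sim \text{Poisson}(n\lambda)$, so $\operatorname{E}_\lambda[g(T)] = e^{-n\lambda} \sum_{t \ge 0} g(t) \frac{(n\lambda)^t}{t!} = 0$ for all $\lambda > 0$ is a power series in $n\lambda$ that vanishes identically, forcing $g(t) = 0$ for every $t$, i.e. $g(T) = 0$ almost surely.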

The elements of relevance to consider when applying this theorem to the setting of the question are

  1. $S^2(\vec{X})$ is an unbiased estimator of $\operatorname{var}(X_i)=\lambda$.
  2. $\bar X(\vec{X})$ is an unbiased estimator of $\operatorname{E}[X_i]=\lambda$.
  3. $\bar X(\vec{X})$ is a sufficient and complete statistic.
  4. $\operatorname{E}[S^2(\vec{X})|\bar X(\vec{X})]$ is both an unbiased estimator of $\operatorname{E}[X_i]=\lambda$ and a function of $\bar X(\vec{X})$.

By the theorem, $\operatorname{E}[S^2(\vec{X})|\bar X(\vec{X})]$ and $\bar X(\vec{X})$ are each the unique MVUE of $\lambda$, hence equal almost surely: the conclusion follows with no further computation.
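
As a numerical sanity check, here is a minimal Monte Carlo sketch (assuming Python with numpy; the sample size, rate, and replication count are arbitrary choices). Conditioning on $\bar X$ is the same as conditioning on $T = \sum_i X_i$, so averaging $S^2$ within each observed value of $T$ should reproduce $\bar X = T/n$:

```python
# Monte Carlo check of E[S^2 | X_bar] = X_bar for a Poisson sample.
# Conditioning on X_bar is equivalent to conditioning on T = sum_i X_i.
import numpy as np

rng = np.random.default_rng(0)
n, lam, reps = 5, 3.0, 200_000        # sample size, Poisson rate, replications (arbitrary)

x = rng.poisson(lam, size=(reps, n))  # `reps` independent samples of size n
s2 = x.var(axis=1, ddof=1)            # sample variance S^2 (ddof=1 makes it unbiased)
t = x.sum(axis=1)                     # sufficient statistic T = sum of the X_i

for tval in range(10, 21):            # values of T near its mean n * lam = 15
    mask = t == tval
    print(f"T = {tval:2d}  X_bar = {tval / n:.2f}  "
          f"MC estimate of E[S^2 | T] = {s2[mask].mean():.2f}")
```

Each printed estimate of $E[S^2 \mid T]$ should agree with the corresponding $\bar X = T/n$ up to Monte Carlo error.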

Xi'an

According to the LS theorem, the statistics $E(S^2|\bar{X})$ and $E(\bar{X}|\bar{X})$ are both the unique UMVUE of $\lambda$, so they must be equal.

It is trivial that the latter is $E(\bar{X}|\bar{X}) = \bar{X}$, and so also $$E(S^2|\bar{X}) = E(\bar{X}|\bar{X}) = \bar{X}.$$

This works generally whenever the complete sufficient statistic on which you condition is itself an unbiased estimator of the target parameter (which need not be the case when you apply the LS theorem).
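
Spelling this out as a short sketch: if $T$ is complete sufficient with $\operatorname{E}_\theta[T] = \theta$ and $U$ is any unbiased estimator of $\theta$, then $$\operatorname{E}_\theta\!\left[\operatorname{E}(U|T) - T\right] = \theta - \theta = 0 \quad \text{for all } \theta,$$ and since $\operatorname{E}(U|T) - T$ is a function of the complete statistic $T$, it must equal $0$ almost surely, i.e. $\operatorname{E}(U|T) = T$. The identity in the question is the case $U = S^2$, $T = \bar{X}$, $\theta = \lambda$.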

Sextus Empiricus

Since the UMVUE is unique, $E[S^2 \mid \bar{X}]$ is the UMVUE of $\lambda$ by the Lehmann-Scheffé theorem, and $\bar{X}$ is the natural UMVUE of the same parameter, so by uniqueness they must be equal.