Is the inverse of the sample variance integrable? That is, does it hold that $$ E\bigg[\bigg(\frac{1}{n}\sum_{i=1}^n X_i^2 - \overline{X}_n^2\bigg)^{-1}\ \bigg] < \infty\,? $$
-
It depends on the underlying distribution. For example, when the distribution is not continuous, the answer is no (because there will be a positive chance that the sample variance is zero), but when the $X_i$ are *iid* Normal and $n\gt 3,$ the expectation will be finite. What assumptions, then, are you making about the distribution and the sample size? – whuber Mar 17 '21 at 20:20
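whuber's claim can be checked numerically: for iid standard normal data, $nS^2 \sim \chi^2_{n-1}$ (with $S^2$ the biased sample variance written in the question), so $E[1/S^2] = n/(n-3)$, which is finite exactly when $n > 3$. A minimal Monte Carlo sketch, with $n = 7$ and the trial count chosen arbitrarily:

```python
import numpy as np

# Monte Carlo check: for iid N(0,1) data, n * S^2 ~ chi^2_{n-1}, where
# S^2 = (1/n) sum X_i^2 - Xbar^2 is the biased sample variance, so
# E[1/S^2] = n / (n - 3), finite exactly when n > 3.
rng = np.random.default_rng(0)
n, trials = 7, 200_000

x = rng.standard_normal((trials, n))
s2 = x.var(axis=1)           # biased sample variance (divides by n)
mc_mean = (1.0 / s2).mean()

print(mc_mean)               # Monte Carlo estimate of E[1/S^2]
print(n / (n - 3))           # exact value, 1.75
```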
-
The sample size is at least $n=50$ and could be much larger, e.g. $n=10{,}000$ or more. The $X_i$ are iid observations of real data, so they are not normal. The distribution is continuous. Is it still possible that the inverse of the sample variance is integrable even though the $X_i$ are not normal? – ManUtdBloke Mar 17 '21 at 21:18
-
Assuming the probability density of $X_i$ is compactly supported would also be ok if that helps. – ManUtdBloke Mar 17 '21 at 21:52
-
It is not only possible, it is likely. You can hope that the sampling distribution of the variance is approximately chi-squared and, even if it is not, you just need really tiny variances to be unlikely. The compact support isn't relevant: after all, as the support grows, the chance of observing a *large* variance ought to increase, which would *reduce* the expectation of its reciprocal. See https://stats.stackexchange.com/a/299765/919 for some of the intuition behind these statements. – whuber Mar 17 '21 at 21:58
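This intuition can be illustrated with a hypothetical non-normal example (uniform data; the choices of $n$ and trial count below are arbitrary): near-zero sample variances are so rare at moderate $n$ that $1/S^2$ behaves as if integrable.

```python
import numpy as np

# Sketch with non-normal (uniform) data: near-zero sample variances are
# vanishingly rare at n = 50, so the reciprocal 1/S^2 has a stable mean.
rng = np.random.default_rng(1)
n, trials = 50, 100_000

x = rng.uniform(size=(trials, n))
s2 = x.var(axis=1)            # biased sample variance

smallest = s2.min()           # far from zero across all replications
recip_mean = (1.0 / s2).mean()

print(smallest)
print(recip_mean)             # near 1/Var(X) = 12, plus a small Jensen correction
```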
-
Very good, thanks. Regarding the linked post, in the context of the sampling distribution of the variance being approximately chi-squared: the chi-squared distribution has non-zero density on the interval $(0,\varepsilon)$ for arbitrarily small $\varepsilon$ when the degrees of freedom $k>1$. So there is some probability mass concentrated near zero, although it is small and gets smaller as $k$ increases. The linked post makes it seem that the expectation of the inverse of the chi-squared distribution would be infinite due to this small mass near zero; is this the case? – ManUtdBloke Mar 17 '21 at 22:27
-
No, it is not. The chi-squared density becomes vanishingly small near zero for all parameters greater than $1.$ The reciprocal of a chi-squared variable has an Inverse Gamma distribution--you can [check it has finite mean.](https://en.wikipedia.org/wiki/Inverse-gamma_distribution) – whuber Mar 17 '21 at 22:55
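The finite-mean check can be sketched with SciPy: if $Y \sim \chi^2_k$, then $1/Y$ is Inverse-Gamma with shape $k/2$ and scale $1/2$, whose mean is $1/(k-2)$ for $k > 2$. The choice $k = 10$ below is arbitrary:

```python
from scipy import stats

k = 10  # degrees of freedom; any k > 2 gives a finite mean

# Mean of 1/Y via the Inverse-Gamma(k/2, scale=1/2) distribution ...
invgamma_mean = stats.invgamma(k / 2, scale=0.5).mean()

# ... cross-checked by integrating 1/y against the chi-squared density.
numeric_mean = stats.chi2(k).expect(lambda y: 1.0 / y)

print(invgamma_mean)   # 1/(k-2) = 0.125
print(numeric_mean)    # agrees to numerical precision
```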
-
@whuber Not entirely: for $k=1$ and $k=2$ there is a steep decline of the density, but for $k>2$ it will vanish. – cherub Mar 18 '21 at 17:02
-
@cherub I cannot make sense of that assertion. If by "density" you mean the density of the $\Gamma(k)$ distribution and by "steep decline" you mean as the argument increases away from $0,$ that is true; but the density *never* vanishes anywhere on the positive real numbers for any $k.$ – whuber Mar 18 '21 at 17:30
-
@whuber Sorry, that was badly formulated. The probability density for $k>2$ goes to zero as $x\rightarrow 0$, whereas for $k=1,2$ it stays larger than zero as $x\rightarrow 0$; the functional form as $x\rightarrow \infty$ is irrelevant. The only point I wanted to add to your comment was that the degrees of freedom must be not only greater than $1$, but also greater than $2$. – cherub Mar 18 '21 at 18:49
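The three regimes described here can be checked numerically by evaluating the chi-squared density at a point close to zero (the evaluation point $10^{-8}$ is an arbitrary choice):

```python
from scipy import stats

# Chi-squared density near zero: diverges for k = 1, tends to the constant
# 1/2 for k = 2, and tends to 0 only once k > 2.
densities = {k: stats.chi2(k).pdf(1e-8) for k in (1, 2, 3)}
print(densities)
```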
-
@Cherub, Thank you--now I get it. When I wrote "parameters greater than $1$" I was thinking in terms of the *Gamma* density, whose parameter is one-half the chi-square DF value. I apologize for the miscommunication. – whuber Mar 18 '21 at 18:51