
I was reading the section on k-statistics on Wolfram MathWorld. I knew that for the sample variance

$k_2 = \frac{1}{n-1}\sum_{i=1}^n (x_i - \overline{x})^2$

it holds that its variance equals

$var(k_2) = \frac{\kappa_4}{n} + \frac{2 \kappa_2^2}{n-1} = \frac{\mu_4}{n} - \frac{\sigma^4(n-3)}{n(n-1)},$

where $\kappa_i$ denotes the $i$-th cumulant, $\mu_4$ the 4-th central moment and $\sigma^2$ the variance.

Now, apparently there exists an unbiased estimator for $var(k_2)$ given by

$\hat{var}(k_2) = \frac{2n k_2^2 + (n-1)k_4}{n(n+1)},$

where

$k_4 = \frac{n^2}{(n-1)(n-2)(n-3)}\left( (n+1) m_4 - 3(n-1) m_2^2 \right)$.

Here $m_p = \frac{1}{n} \sum_{i=1}^n (x_i - \overline{x})^p$.
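The formulas above are easy to check numerically. Here is a minimal sketch (function names are mine, not from any reference); the Monte Carlo check uses standard normal data, for which $\kappa_4 = 0$ and $\kappa_2 = 1$, so the target value is $var(k_2) = 2/(n-1)$:

```python
import numpy as np

def k_stats(x):
    """k2 (unbiased sample variance) and k4, computed along the last axis."""
    x = np.asarray(x, dtype=float)
    n = x.shape[-1]
    xc = x - x.mean(axis=-1, keepdims=True)
    m2 = (xc**2).mean(axis=-1)  # second central sample moment
    m4 = (xc**4).mean(axis=-1)  # fourth central sample moment
    k2 = n / (n - 1) * m2
    k4 = n**2 * ((n + 1) * m4 - 3 * (n - 1) * m2**2) / ((n - 1) * (n - 2) * (n - 3))
    return k2, k4

def var_k2_hat(x):
    """The claimed unbiased estimator of var(k2)."""
    n = np.asarray(x).shape[-1]
    k2, k4 = k_stats(x)
    return (2 * n * k2**2 + (n - 1) * k4) / (n * (n + 1))

# Monte Carlo check of unbiasedness: for N(0,1) data with n = 10,
# var(k2) = 2/(n-1) = 2/9, so the average of the estimator over many
# samples should land close to 0.2222.
rng = np.random.default_rng(0)
X = rng.standard_normal((200_000, 10))
print(var_k2_hat(X).mean())  # close to 2/9
```

This only verifies the claim empirically, of course; it does not replace the algebraic derivation asked for above.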

The reference given is Kenney and Keeping (1951, p. 189). However, I cannot find a copy anywhere, nor a derivation of this equation.

Can anyone help me with this derivation or point me towards a reference?

Also, I was wondering if a similar equation would provide an unbiased estimator for the variance of the sample covariance.

Akkariz
  • Just compute expectations. It's purely algebra; there's no statistical or probabilistic magic going on. – whuber May 31 '16 at 14:41
  • What about the existence of an unbiased estimator for the variance of the sample covariance? I can't find anything about this. – Akkariz May 31 '16 at 14:48
  • It's the same algebra, just a little more finicky--especially once you realize the sample covariance is [just a variance in disguise](http://stats.stackexchange.com/a/142472/919). [Kendall & Stuart volume I](http://www.amazon.com/Kendalls-Advanced-Theory-Statistics-Distribution/dp/0470665300) contains a comprehensive account: see Chapters 12 (univariate case) and 13 (multivariate case) on ["k-statistics."](http://mathworld.wolfram.com/k-Statistic.html) – whuber May 31 '16 at 14:53
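The "variance in disguise" point in the last comment is the polarization identity: with the $1/(n-1)$ convention, the sample covariance satisfies $q_{xy} = \tfrac{1}{4}\bigl(s^2_{x+y} - s^2_{x-y}\bigr)$ exactly, so results about the variance of $k_2$ transfer to the covariance case. A quick numerical illustration (variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(50)
y = 0.5 * x + rng.standard_normal(50)

# Sample covariance with the 1/(n-1) convention.
cov_xy = np.cov(x, y)[0, 1]

# Polarization identity: cov(x, y) = (var(x + y) - var(x - y)) / 4, exactly.
cov_polar = (np.var(x + y, ddof=1) - np.var(x - y, ddof=1)) / 4

print(cov_xy, cov_polar)  # agree up to floating-point error
```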
