It is not easy to find estimators $T_n$ such that $\mbox{Var}[T_n] \sim O(n^{-B})$ with $B = 2$; in most cases, $B = 1$. Here $n$ is the sample size. According to this paper on U-statistics, such estimators do exist. Is there a simple example?
-
Can you show where in the lecture it is stated that $Var[T_n] \sim O(n^{-2})$? I can only find examples where they state that the variance of a given estimator is $O(n^{-1} + n^{-2})$, which can be simplified down to $O(n^{-1})$. – Cliff AB May 18 '19 at 18:34
-
The last sentence just above Example 3.11 (see bottom of page 3) suggests that this could be the case. – Vincent Granville May 18 '19 at 18:43
-
Yes, you just have to find a degenerate kernel of the correct order (corollary 3.2 part iii in the notes you linked). – aleshing May 18 '19 at 20:48
1 Answer
Consider $\small G_n = \Big(\frac{2}{n(n-1)}\cdot \sum_{i=1}^n\sum_{j=i+1}^n |X_i-X_j|^p\Big)^{1/p}$. As $p$ tends to infinity, $G_n$ tends to the sample range. If the observations are independently and uniformly distributed on $[0, 1]$, then the range has a $\mbox{Beta}(n-1, 2)$ distribution, so its variance is $\frac{2(n-1)}{(n+1)^2 (n+2)} \sim O(n^{-2})$, i.e. $B = 2$.
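This is easy to check numerically. Below is a minimal Monte Carlo sketch using only the standard library (the function names are mine, not from the cited notes): it compares the simulated variance of the uniform range against the exact $\mbox{Beta}(n-1,2)$ variance.

```python
import random

def beta_range_variance(n):
    # Exact variance of the range of n i.i.d. Uniform(0,1) variables,
    # i.e. the variance of a Beta(n-1, 2) distribution.
    return 2 * (n - 1) / ((n + 1) ** 2 * (n + 2))

def mc_range_variance(n, trials=100_000, seed=0):
    # Monte Carlo estimate of Var(max - min) over `trials` samples
    # of n i.i.d. Uniform(0,1) draws.
    rng = random.Random(seed)
    ranges = []
    for _ in range(trials):
        xs = [rng.random() for _ in range(n)]
        ranges.append(max(xs) - min(xs))
    mean = sum(ranges) / trials
    return sum((r - mean) ** 2 for r in ranges) / (trials - 1)
```

Doubling $n$ should cut the variance by roughly a factor of four, which is the signature of $B = 2$.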

Vincent Granville
-
The sample range asymptotically depends on the behavior of the distribution at its extremes. This is a special case of estimating a percentile of a distribution. Estimators based on order statistics, like your $G_n,$ exhibit myriad forms of asymptotic variance: they may converge with polynomial order; they may converge faster than any polynomial; or they may exhibit no definite rate of convergence. If this interests you, then visit https://stats.stackexchange.com/questions/406903 for a sketch of the analysis. – whuber May 20 '19 at 16:21
-
Very true. If the random variables have an exponential rather than a uniform distribution (the other extreme case), then the variance of the range is $\sum_{k=1}^{n-1}\frac{1}{k^2}$, which converges to $\frac{\pi^2}{6}$. The proof can be found at https://dsc.news/2JQnPc4 – Vincent Granville May 20 '19 at 16:52
-
That's not a surprise, given that the range of an exponential distribution is infinite! It therefore helps to restrict the scope of one's interest a little. But the example is still a nice one in that it illustrates how the very slow decrease of the exponential probability towards its "100th percentile" translates to $O(1)$ asymptotic variance. I think this example may help the intuition a little. – whuber May 20 '19 at 17:22
-
Yes, the expectation of the range of $n$ i.i.d. exponential variables is $\frac{1}{\lambda}\sum_{k=1}^{n-1}\frac{1}{k}$, so it grows to infinity at the same speed as $\log n$. Note that in my previous comment (variance of the range) I omitted the factor $\frac{1}{\lambda^2}$, but there is no way to edit and make the correction. Just take $\lambda = 1$ and the result is correct. – Vincent Granville May 20 '19 at 17:28
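The exponential-range formulas in the last two comments can also be checked numerically. A minimal sketch with $\lambda = 1$ by default (function names are mine): it compares the exact mean $\frac{1}{\lambda}\sum_{k=1}^{n-1}\frac{1}{k}$ and variance $\frac{1}{\lambda^2}\sum_{k=1}^{n-1}\frac{1}{k^2}$ against Monte Carlo estimates.

```python
import random

def exp_range_theory(n, lam=1.0):
    # Exact mean and variance of the range of n i.i.d. Exp(lam) variables:
    # mean = (1/lam) * sum_{k=1}^{n-1} 1/k, var = (1/lam^2) * sum_{k=1}^{n-1} 1/k^2.
    mean = sum(1.0 / k for k in range(1, n)) / lam
    var = sum(1.0 / k ** 2 for k in range(1, n)) / lam ** 2
    return mean, var

def exp_range_mc(n, lam=1.0, trials=100_000, seed=1):
    # Monte Carlo estimate of the mean and variance of the exponential range.
    rng = random.Random(seed)
    ranges = []
    for _ in range(trials):
        xs = [rng.expovariate(lam) for _ in range(n)]
        ranges.append(max(xs) - min(xs))
    m = sum(ranges) / trials
    v = sum((r - m) ** 2 for r in ranges) / (trials - 1)
    return m, v
```

Unlike the uniform case, the variance here does not shrink with $n$ at all: it increases toward the constant $\pi^2/6$, matching whuber's $O(1)$ remark above.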