
Suppose I have the $n$ order statistics $X_{1}\leqslant X_{2}\leqslant\dots\leqslant X_{n}$ from some unknown continuous distribution function $F(x)$, and two linear combinations of these order statistics:

  1. $S_{n}=n^{-1}\sum_{i=1}^{n}J_{1}(\frac{i}{n})X_{i}$, in the same manner as in Stigler (1974);
  2. $T_{n}=n^{-1}\sum_{i=1}^{n}J_{2}(\frac{i}{n})X_{i}$, in the same manner as in Stigler (1974), but with a different functional form for the weight function $J_{2}$.
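
For concreteness, here is a small numerical sketch of how such L-statistics can be computed. The particular weight functions $J_1$ and $J_2$ below are my own illustrative choices, not ones from Stigler's paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative smooth weight functions (assumptions for this sketch only):
def J1(u):
    return 4.0 * u - 2.0          # Gini-type weights

def J2(u):
    return np.ones_like(u)        # constant weights: T_n reduces to the sample mean

def L_statistic(x, J):
    """Compute n^{-1} * sum_i J(i/n) * X_(i) from an unsorted sample x."""
    x = np.sort(x)                # order statistics X_(1) <= ... <= X_(n)
    n = len(x)
    i = np.arange(1, n + 1)
    return np.mean(J(i / n) * x)

x = rng.normal(size=1000)         # a sample from some continuous F (standard normal here)
S_n = L_statistic(x, J1)
T_n = L_statistic(x, J2)
print(S_n, T_n)
```

With the constant weights chosen here, $T_n$ is just the sample mean, which gives a quick correctness check on the implementation.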

We assume that $S_{n}$ and $T_{n}$ are well behaved in the sense of Stigler (1974), so that each is asymptotically normally distributed: $S_{n}\sim N(s,\text{var}_1)$ and $T_{n}\sim N(t,\text{var}_2)$. Following Stigler, we know how to find $\text{var}_1$ and $\text{var}_2$.

I believe they are jointly asymptotically normal (i.e., asymptotically bivariate normal). My question is: how can I obtain the asymptotic covariance of $S_{n}$ and $T_{n}$? Stigler's original proof covers only the univariate case, not the joint distribution.
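
For reference, the asymptotic variance functional in Stigler (1974) (also in Serfling's book) has, if I read it correctly, the form below. Since it is bilinear in the weight function and its kernel is symmetric in $(x,y)$, expanding it at $J_1+J_2$ suggests a natural candidate for the asymptotic covariance; I would be grateful if someone could confirm this is the right object:

```latex
% Stigler's asymptotic variance functional (as I understand it):
\sigma^2(J) = \int\!\!\int J(F(x))\,J(F(y))\,
              \bigl[F(\min(x,y)) - F(x)F(y)\bigr]\,dx\,dy .

% Bilinearity in J and symmetry of the kernel suggest the candidate covariance
\sigma(J_1, J_2) = \int\!\!\int J_1(F(x))\,J_2(F(y))\,
                   \bigl[F(\min(x,y)) - F(x)F(y)\bigr]\,dx\,dy .
```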

Thank you so much.

Reference: Stigler (1974), "Linear Functions of Order Statistics with Smooth Weight Functions," *Annals of Statistics*.

lzstat
  • I can't tell from the information given here, but would Stigler's method enable you to find the asymptotic variances of the *univariate* variables $S_n\pm T_n$ (both of which are linear combinations of order statistics)? If so, you could use the [polarization identity](http://stats.stackexchange.com/a/142472/919) to find the covariance. – whuber Dec 07 '15 at 15:33
  • 2
    @whuber, I think your reason is correct. Yes, the variance of $S_{n}+(-)T_{n}$ can be obtained in a similar manner as Stigler did or in Robert Serfling book "approximation theorem of mathematical statistics" pages 276. Then if they are asymptotic jointly normality, then it should be the covariance I need in the covariance matrix I think. Your reason provides a route to calculate it . thx – lzstat Dec 07 '15 at 17:06
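
As a quick numerical sanity check on whuber's suggestion, one can verify by Monte Carlo that the polarization identity recovers the covariance of the two L-statistics. The weight functions and the choice $F = N(0,1)$ are illustrative assumptions, not taken from the question:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative smooth weight functions (my own choices for this check):
def J1(u):
    return 4.0 * u - 2.0            # Gini-type weights

def J2(u):
    return 3.0 * u**2 - 1.0         # another smooth weight function

def L_stat(x, J):
    """n^{-1} * sum_i J(i/n) * X_(i)."""
    x = np.sort(x)
    i = np.arange(1, len(x) + 1)
    return np.mean(J(i / len(x)) * x)

# Monte Carlo: draw many samples, record the (S_n, T_n) pairs
n, reps = 400, 2000
S = np.empty(reps)
T = np.empty(reps)
for r in range(reps):
    x = rng.normal(size=n)          # F = N(0, 1), an arbitrary continuous choice
    S[r] = L_stat(x, J1)
    T[r] = L_stat(x, J2)

cov_direct = np.cov(S, T)[0, 1]
# Polarization identity: Cov(S, T) = [Var(S + T) - Var(S - T)] / 4
cov_polar = (np.var(S + T, ddof=1) - np.var(S - T, ddof=1)) / 4.0
print(cov_direct, cov_polar)
```

Note that the identity holds exactly for the sample moments (with matching `ddof`), so `cov_direct` and `cov_polar` agree up to floating-point error; the Monte Carlo aspect only concerns how well they approximate the asymptotic covariance.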

0 Answers