
Given i.i.d. data $X_1,\dots,X_n$ taking values in some space $\mathcal{X}$ and drawn from a distribution $P$, and symmetric functions $f,g: \mathcal{X} \times \mathcal{X} \to \mathbb{R}$, I want to estimate the covariance terms:

$$ \gamma = \text{Cov} (f(X,X'), f(X,X'')) $$

and

$$ \theta = \text{Cov} (f(X,X'), g(X,X'')) $$

where $X, X', X''$ are independent draws from $P$. What is the best way (unbiased or consistent estimation) to estimate these quantities from the given data?
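One standard approach (a sketch, not necessarily the unique "best" estimator) is the plug-in: form the leave-one-out conditional means $\bar f_i = \frac{1}{n-1}\sum_{j \neq i} f(X_i, X_j)$ (the empirical Hájek projections) and take their empirical covariance. Under the usual moment conditions this is consistent for covariance terms of the form above. A minimal NumPy sketch (function names are my own):

```python
import numpy as np

def hajek_projections(X, kernel):
    """Leave-one-out conditional means: fbar_i = (1/(n-1)) * sum_{j != i} kernel(X_i, X_j)."""
    n = len(X)
    F = np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])
    np.fill_diagonal(F, 0.0)  # exclude the j == i term
    return F.sum(axis=1) / (n - 1)

def cov_hat(X, f, g):
    """Plug-in estimate of Cov(f(X, X'), g(X, X'')): the empirical covariance
    of the Hajek projections of f and g. Consistent, though not exactly unbiased,
    under standard moment conditions."""
    fbar = hajek_projections(X, f)
    gbar = hajek_projections(X, g)
    return np.mean((fbar - fbar.mean()) * (gbar - gbar.mean()))
```

Taking $g = f$ in `cov_hat` gives an estimate of $\gamma$; distinct kernels give $\theta$.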

Some more context: $f$ and $g$ here are the symmetric kernels of my U-statistics, so $U_n := \frac{1}{\binom{n}{2}} \sum_{i<j} f(X_i, X_j)$, and similarly for $g$. My goal is to apply results on the asymptotic joint distribution of U-statistics (see my question here, in which I state this result), which require consistent estimation of the covariance quantities listed above. I would like to understand how such quantities may be estimated in practice, so that I can use these asymptotic results to build confidence intervals.
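To illustrate how such an estimate would feed into a confidence interval: for a degree-2 kernel, $\sqrt{n}(U_n - \theta_f) \to N(0, 4\zeta_1)$ with $\zeta_1 = \mathrm{Cov}(f(X,X'), f(X,X''))$, so plugging an estimate of $\zeta_1$ into the normal approximation gives a Wald-type interval. A sketch under those assumptions (the helper names are mine, and the plug-in $\hat\zeta_1$ is the empirical variance of the Hájek projections):

```python
import numpy as np
from itertools import combinations

def u_stat(X, kernel):
    """Degree-2 U-statistic: U_n = (1 / C(n,2)) * sum over i < j of kernel(X_i, X_j)."""
    vals = [kernel(X[i], X[j]) for i, j in combinations(range(len(X)), 2)]
    return float(np.mean(vals))

def ci_u_stat(X, kernel, z=1.96):
    """Approximate CI from sqrt(n)(U_n - theta) -> N(0, 4*zeta1), with zeta1
    estimated by the empirical variance of the leave-one-out conditional means."""
    n = len(X)
    F = np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])
    np.fill_diagonal(F, 0.0)
    fbar = F.sum(axis=1) / (n - 1)       # Hajek projections
    zeta1_hat = np.mean((fbar - fbar.mean()) ** 2)
    un = u_stat(X, kernel)
    half = z * np.sqrt(4.0 * zeta1_hat / n)
    return un - half, un + half
```

The same plug-in idea extends to the joint case: estimate the cross term $\theta$ analogously and use the joint normal limit for vectors of U-statistics.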

WeakLearner
  • It depends on $f,$ $g,$ $P,$ and what you mean by "best." Otherwise about all you can hope for is a recapitulation of the theory of estimation, which is too broad for this site. Could you provide those details, then? – whuber Jan 17 '21 at 18:47
  • @whuber well in the title I ask for a way to compute an unbiased estimator, I should not have said 'best' later on in the question - this question is based on computing the asymptotic variance for U statistics, the $f$ and $g$ are symmetric kernels for my U statistic of degree 2, and so I am looking for a general result that allows me to estimate these variance terms – WeakLearner Jan 17 '21 at 19:04
  • @whuber thanks for reopening, I've also added some more clarification in the question – WeakLearner Jan 17 '21 at 19:28

0 Answers