Given i.i.d. data $X_1,\dots,X_n$ taking values in some space $\mathcal{X}$ and drawn from a distribution $P$, and symmetric functions $f,g: \mathcal{X} \times \mathcal{X} \to \mathbb{R}$, I want to estimate the covariance terms:
$$ \gamma = \text{Cov} (f(X,X'), f(X,X'')) $$
and
$$ \theta = \text{Cov} (f(X,X'), g(X,X'')) $$
where $X, X', X''$ are independent draws from $P$. What is the best way to estimate these quantities from the given data (unbiased estimation, or at least consistent estimation)?
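To make the targets concrete, here is a quick Monte Carlo sketch of $\gamma$ and $\theta$ for a toy setup (Python/NumPy; the choices $P = N(1,1)$, $f(x,y) = xy$, $g(x,y) = |x-y|$ are purely illustrative and not part of my actual problem):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy symmetric kernels (illustrative only).
def f(x, y):
    return x * y

def g(x, y):
    return np.abs(x - y)

# Monte Carlo approximation of the targets
#   gamma = Cov(f(X, X'), f(X, X''))
#   theta = Cov(f(X, X'), g(X, X''))
# with X, X', X'' independent draws from P = N(1, 1).
m = 200_000
X, X1, X2 = rng.normal(1.0, 1.0, size=(3, m))

gamma = np.cov(f(X, X1), f(X, X2))[0, 1]
theta = np.cov(f(X, X1), g(X, X2))[0, 1]
print(gamma, theta)
```

Of course this requires knowing $P$ and being able to sample fresh triples from it, which I cannot do; I only have the single sample $X_1,\dots,X_n$.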
Some more context: $f$ and $g$ are the symmetric kernels of my U-statistics, so $U_n^f := \frac{1}{{n \choose 2}} \sum_{i<j} f(X_i, X_j)$ and similarly $U_n^g$ for $g$. My goal is to apply results about the asymptotic joint distribution of U-statistics (see my question here, in which I state this result), which requires consistent estimates of the covariance quantities listed above. I would simply like to understand how such quantities can be estimated in practice, so that I can use these asymptotic results to build confidence intervals.
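For concreteness, the kind of plug-in construction I have been considering is to center the leave-one-out row means $\bar f_i = \frac{1}{n-1}\sum_{j \ne i} f(X_i, X_j)$ around $U_n^f$ (and likewise for $g$) and average their products. Below is a minimal Python sketch of that idea; the helper names (`u_stat_and_row_means`, `cov_hat`), the toy kernels, and the distribution are all illustrative assumptions on my part, and I do not know whether this construction is consistent, which is essentially what I am asking.

```python
import numpy as np

def u_stat_and_row_means(K):
    """Given the n x n matrix K[i, j] = kernel(X_i, X_j) of a symmetric kernel,
    return the U-statistic and the leave-one-out row means
    h_i = (1/(n-1)) * sum_{j != i} K[i, j]."""
    n = K.shape[0]
    off_diag_sums = K.sum(axis=1) - np.diag(K)
    row_means = off_diag_sums / (n - 1)
    u = off_diag_sums.sum() / (n * (n - 1))  # same as averaging over pairs i < j
    return u, row_means

def cov_hat(Kf, Kg):
    """Candidate plug-in estimate of Cov(f(X, X'), g(X, X'')): average the
    products of the centred row means of the two kernel matrices."""
    uf, f_bar = u_stat_and_row_means(Kf)
    ug, g_bar = u_stat_and_row_means(Kg)
    return np.mean((f_bar - uf) * (g_bar - ug))

# Toy usage (illustrative kernels and distribution, not from my real problem).
rng = np.random.default_rng(0)
n = 2_000
X = rng.normal(1.0, 1.0, size=n)

Kf = np.outer(X, X)                    # f(x, y) = x * y
Kg = np.abs(X[:, None] - X[None, :])   # g(x, y) = |x - y|

gamma_hat = cov_hat(Kf, Kf)   # candidate estimate of gamma
theta_hat = cov_hat(Kf, Kg)   # candidate estimate of theta
print(gamma_hat, theta_hat)
```

Is something along these lines the standard approach, or is there a better (unbiased or provably consistent) estimator I should be using?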