
I am looking for an interpretable measure between two random variables $X$ and $Y$ which quantifies the dependence between the two but does not assume linearity.

Essentially, I am looking for a nonlinear variant of the shared variance. The shared variance is simply $Cor(X,Y)^2$, but it is only a good measure of dependence if one can assume that $Y = aX+b+\epsilon$ with $\epsilon \sim N(0,\sigma^2)$. I think most people with basic statistical training have some intuition about what it means if two variables share $x\%$ of their variance.
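To make the baseline concrete, here is a minimal sketch (in Python, with illustrative parameter values that are not from the question) of the shared variance under exactly this linear model; with $a = 2$, $b = 1$, and unit-variance noise it should come out near $4/5$:

```python
# Sketch of "shared variance" under Y = a*X + b + eps; values are illustrative.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 10_000
x = rng.normal(size=n)
y = 2.0 * x + 1.0 + rng.normal(scale=1.0, size=n)   # Y = aX + b + eps

r, _ = pearsonr(x, y)
print(f"shared variance Cor(X,Y)^2 = {r**2:.3f}")   # ~ a^2*Var(X)/Var(Y) = 4/5
```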

There are many nonlinear dependence measures, such as distance correlation, mutual information, or copulas. However, none of them seems to have a straightforward interpretation, and they seem to be advocated primarily for testing independence rather than for quantifying the extent of the dependence. As far as I understand mutual information and distance correlation, they both essentially measure a distance between the estimated joint distribution of $X, Y$ and the product of their marginals, i.e., their joint distribution under independence. In contrast to the shared variance, I doubt that readers with only basic statistical training will be able to grasp what that means.
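For illustration, the rough sketch below contrasts these measures on a purely nonlinear relationship ($Y = X^2 + \text{noise}$, my own toy example), where $Cor(X,Y)^2$ is near zero even though the dependence is strong. The distance-correlation estimator is the standard double-centering formula, and the mutual-information value comes from scikit-learn's k-NN estimator; all names and values are illustrative, not a claim about how these quantities should be reported.

```python
# Toy comparison: squared Pearson correlation vs. distance correlation
# vs. mutual information on a purely nonlinear relationship.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import pearsonr
from sklearn.feature_selection import mutual_info_regression

def distance_correlation(x, y):
    """Sample distance correlation via double-centered distance matrices."""
    a = squareform(pdist(x[:, None]))                    # pairwise |x_i - x_j|
    b = squareform(pdist(y[:, None]))
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()    # double centering
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

rng = np.random.default_rng(1)
x = rng.normal(size=2_000)
y = x**2 + 0.1 * rng.normal(size=2_000)                  # strong but nonlinear dependence

r, _ = pearsonr(x, y)
mi = mutual_info_regression(x[:, None], y, random_state=0)[0]
print(f"Cor(X,Y)^2         = {r**2:.3f}")                # near 0: misses the dependence
print(f"distance corr.     = {distance_correlation(x, y):.3f}")
print(f"mutual info (nats) = {mi:.3f}")                  # hard to read as a shared-variance %
```

The point of the sketch is the interpretability gap: the squared correlation has a "percent of variance" reading, whereas the distance correlation and mutual information values detect the dependence but do not map onto any comparably familiar scale.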

Julian Karch

0 Answers