
We know that $\operatorname{cov}(X,Y)=0$ does not guarantee that $X$ and $Y$ are independent. But if they are independent, their covariance must be $0$.

My question is: what kind of distribution must $X$ and $Y$ follow for there to be a proof that "$\operatorname{cov}(X,Y)=0$ implies $X$ and $Y$ are independent"? (That is, how would one prove "if $\operatorname{cov}(X,Y)=0$ then $X$ and $Y$ are independent"?)
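
As a quick numerical sketch of the first claim (not part of the original post; it uses the standard counterexample $X \sim N(0,1)$, $Y = X^2$):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = x**2                      # Y is a deterministic function of X, so clearly dependent

# Sample covariance is approximately zero because E[X^3] = 0 for a symmetric X
print(np.cov(x, y)[0, 1])     # ~0

# Yet knowing X pins down Y exactly, e.g. P(Y > 1 | |X| > 1) = 1
print(np.mean(y[np.abs(x) > 1] > 1))   # 1.0
```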

Peter Flom
  • Look [here](https://stats.stackexchange.com/a/12844/213806) for a primer with a specific example. It is not an answer to this question but it is a nice answer to a similar question. – ERT Aug 10 '18 at 00:02
  • It's not the case in general; you can't prove it when it isn't the case. There are some specific cases (e.g. bivariate normal) where if you know you have that particular joint distribution then covariance 0 implies independence. – Glen_b Aug 10 '18 at 01:20
  • The second paragraph seems to contradict the first paragraph. – Peter Flom Aug 10 '18 at 12:08
  • Possible duplicate of [Why zero correlation does not necessarily imply independence](https://stats.stackexchange.com/questions/179511/why-zero-correlation-does-not-necessarily-imply-independence) – kjetil b halvorsen Aug 11 '18 at 19:16

1 Answer


Early analysis of this question can be found in Lancaster (1951) and Leipnik (1961). In the latter paper the author analyses the conditions required for uncorrelatedness to imply independence in a bivariate continuous distribution. If you have access to scholarly journals (e.g., through a university) then I recommend starting with these papers. I will also give a bit of insight for a special case.


It is worth noting that independence of $X$ and $Y$ is equivalent to the following moment condition:

$$\mathbb{E}(h(X) g(Y)) = \mathbb{E}(h(X)) \mathbb{E}(g(Y)) \quad \text{for all bounded measurable } g \text{ and } h. $$

If both random variables have bounded support then the Stone-Weierstrass theorem allows us to uniformly approximate the functions $g$ and $h$ with polynomials, which gives the equivalent condition:

$$\mathbb{E}(X^n Y^m) = \mathbb{E}(X^n) \mathbb{E}(Y^m) \quad \text{for all } n \in \mathbb{N} \text{ and } m \in \mathbb{N}.$$

(The case $n=m=1$ is the zero-correlation condition; the other cases are conditions for higher-order moment separability.) Thus, for random variables with bounded support, zero correlation plus moment separability at all higher orders is equivalent to independence.
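
As an illustrative sketch (an addition, not part of the original answer), here is a numerical check of how the $n=m=1$ condition can hold while a higher-order condition fails, using $X \sim \mathrm{Uniform}(-1,1)$ and $Y = X^2$ (both have bounded support):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 1_000_000)    # bounded support
y = x**2                             # dependent on X, also bounded

# n = m = 1: covariance is ~0 because E[X^3] = 0 by symmetry
print(np.mean(x * y) - np.mean(x) * np.mean(y))        # ~0

# n = 2, m = 1: E[X^2 Y] = E[X^4] = 1/5, but E[X^2] E[Y] = (1/3)^2 = 1/9
print(np.mean(x**2 * y), np.mean(x**2) * np.mean(y))   # ~0.200 vs ~0.111
```

The failure of the $(n,m)=(2,1)$ condition is what detects the dependence that the covariance alone misses.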

Ben