
Under what conditions does the following statement hold:

X and Y are uncorrelated if and only if X and Y are independent.

I fully understand that this statement does not always hold, but I would like to know under what specific conditions (i.e., for what distributions of $X$ and $Y$) it is true.

amoeba
Saba

  • Do not forget the self-study tag. – zlon Feb 11 '17 at 23:34
  • It's difficult to determine what you are asking. What do you mean by a "special case"? What do you mean by "always holds"? You might be interested in http://stats.stackexchange.com/questions/4364. – whuber Feb 11 '17 at 23:35
  • Sorry about that; I was looking for a distribution under which this statement will always hold. – Saba Feb 12 '17 at 00:20

3 Answers


The statement that you are asking about has two parts:

  1. If $X$ and $Y$ are independent, then $X$ and $Y$ are uncorrelated.

  2. If $X$ and $Y$ are uncorrelated, then $X$ and $Y$ are independent.

Statement 1 is always true and imposes no additional constraints on $X$ and $Y$ other than what already has been assumed, viz. that they are independent random variables. Statement 2 does not hold in general, but it does hold if we constrain $X$ and $Y$ to be jointly Gaussian random variables. That is,

2'. If jointly Gaussian random variables $X$ and $Y$ are uncorrelated, then $X$ and $Y$ are independent.

is a true statement, and so

Jointly Gaussian random variables $X$ and $Y$ are uncorrelated if and only if they are independent

is a true statement but

"Random variables $X$ and $Y$ are uncorrelated if and only if they are independent"

does not hold in general. Nor is

"Gaussian random variables $X$ and $Y$ are uncorrelated if and only if they are independent"

a true statement. (Note that, in contrast to 2'., the word jointly is missing from this statement.) For example, suppose that $X\sim N(0,1)$ and that $Z$, independent of $X$, is a Bernoulli random variable with parameter $\frac 12$. Set $Y = (-1)^ZX = \pm X$ and note that $Y \sim N(0,1)$, just like $X$. But, $$E[XY] = E[(-1)^Z X^2] = E[(-1)^Z]E[X^2] = 0 = E[X]E[Y],$$ showing that $X$ and $Y$ are (marginally) Gaussian random variables that are uncorrelated. That they are not independent is easily seen: conditioned on the event $X = x_0$, $Y$ takes on only the values $x_0$ and $-x_0$, and is thus a discrete random variable instead of continuing to enjoy the standard Gaussian density, as it would have if $X$ and $Y$ were indeed independent. Note that $X$ and $Y$ do not have a jointly Gaussian density.
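The construction above is easy to check numerically. A minimal sketch (using NumPy; the sample size and seed are arbitrary choices of mine): it draws $X$ and $Z$ as described, forms $Y = (-1)^Z X$, and confirms that the sample correlation of $X$ and $Y$ is near zero, while $Y^2 = X^2$ holds exactly, which is impossible for independent continuous variables.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.standard_normal(n)          # X ~ N(0, 1)
z = rng.integers(0, 2, size=n)      # Z ~ Bernoulli(1/2), independent of X
y = (-1.0) ** z * x                 # Y = (-1)^Z X, also N(0, 1) marginally

# Uncorrelated: the sample correlation is near 0
corr = np.corrcoef(x, y)[0, 1]
print(f"corr(X, Y) = {corr:.4f}")

# Not independent: Y^2 equals X^2 everywhere, so |Y| is a deterministic
# function of |X| -- ruling out independence.
print("Y^2 == X^2 everywhere:", np.allclose(y**2, x**2))
```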

Finally, if $X$ and $Y$ are Bernoulli random variables or, more generally, discrete random variables that take on only two different values, then the statement

Bernoulli random variables (more generally, dichotomous random variables) $X$ and $Y$ are uncorrelated if and only if they are independent

is a true statement. See this question and its answers for some details.
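In the dichotomous case the equivalence can be verified directly: a joint pmf on $\{0,1\}\times\{0,1\}$ can be written with marginals $a = P(X=1)$, $b = P(Y=1)$ and a single interaction term $c$, and a short computation shows $\text{Cov}(X,Y) = c$, while independence holds exactly when $c = 0$. The sketch below (NumPy; the parametrization and helper names are mine, for illustration) checks both directions on random tables.

```python
import numpy as np

def joint(a, b, c):
    """Illustrative 2x2 joint pmf for X, Y in {0, 1} with P(X=1) = a,
    P(Y=1) = b and interaction term c; a short computation gives Cov(X,Y) = c."""
    return np.array([[(1 - a) * (1 - b) + c, (1 - a) * b - c],
                     [a * (1 - b) - c,       a * b + c      ]])

def covariance(p):
    # E[XY] - E[X]E[Y] = p11 - P(X=1) * P(Y=1)
    return p[1, 1] - p.sum(axis=1)[1] * p.sum(axis=0)[1]

def is_independent(p):
    # independence: every cell factors into the product of its marginals
    return np.allclose(p, np.outer(p.sum(axis=1), p.sum(axis=0)))

rng = np.random.default_rng(1)
for _ in range(200):
    a, b = rng.uniform(0.1, 0.9, size=2)
    # zero interaction => zero covariance and independence
    assert np.isclose(covariance(joint(a, b, 0.0)), 0.0)
    assert is_independent(joint(a, b, 0.0))
    # clearly nonzero interaction => nonzero covariance and dependence
    c = rng.choice([-1, 1]) * rng.uniform(0.002, 0.009)  # keeps all cells >= 0
    p = joint(a, b, c)
    assert np.isclose(covariance(p), c)
    assert not is_independent(p)
print("2x2 tables: Cov(X, Y) = 0 exactly when the pmf factorizes")
```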

Dilip Sarwate

For a joint distribution function (CDF) constructed as follows

$$H_{X,Y}(x,y)=F_X(x)G_Y(y)\left[1+\alpha\big(1-F_X(x)\big)\big(1-G_Y(y)\big)\right],\;\;\; |\alpha| \leq 1$$

where $F_X(x)$ and $G_Y(y)$ are any two marginal CDFs,

uncorrelatedness (zero covariance) is equivalent to independence.

This is the "Farlie-Gumbel-Morgenstern" family of joint distributions. For an analysis of the correlation structure, see

Schucany, W. R., Parr, W. C., & Boyer, J. E. (1978). Correlation structure in Farlie-Gumbel-Morgenstern distributions. Biometrika, 65(3), 650-653.
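This equivalence can be made concrete with Hoeffding's covariance identity, $\text{Cov}(X,Y) = \int\!\!\int \big[H(x,y) - F(x)G(y)\big]\,dx\,dy$: for the FGM family the integrand is $\alpha\,F(1-F)\,G(1-G)$, so the covariance is proportional to $\alpha$, and it vanishes exactly when $\alpha = 0$, i.e. when $H = FG$ (independence). A numerical sketch (NumPy; the choice of Uniform(0,1) marginals, for which the covariance works out to $\alpha/36$, is mine for illustration):

```python
import numpy as np

def fgm_cov_uniform(alpha, n=2000):
    """Cov(X, Y) under the FGM joint CDF with Uniform(0,1) marginals, via
    Hoeffding's identity Cov = integral of [H(u,v) - uv] du dv, where
    H(u,v) - uv = alpha * u(1-u) * v(1-v) separates as a product."""
    u = (np.arange(n) + 0.5) / n     # midpoint rule on (0, 1)
    I = np.mean(u * (1 - u))         # approximates int_0^1 u(1-u) du = 1/6
    return alpha * I * I             # approximately alpha / 36

for alpha in (-1.0, -0.5, 0.0, 0.5, 1.0):
    print(f"alpha = {alpha:+.1f}  ->  Cov = {fgm_cov_uniform(alpha):+.5f}")
```

The separable integrand is what makes the family special: the covariance carries all of the dependence, so zero covariance forces $\alpha = 0$ and hence full independence.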

Alecos Papadopoulos

The result is only guaranteed to hold when X and Y form a bivariate normal distribution. You will find this in most multivariate analysis texts as well as on some threads on this site.

Michael R. Chernick
  • Thank you very much. I was specifically looking for a distribution under which this statement always holds. – Saba Feb 12 '17 at 00:20
  • All I meant to say by my answer was that the bivariate normal is the only case where two univariate random variables X and Y are guaranteed to be independent when they are uncorrelated. Dilip Sarwate subsequently added more detail to that. I made a minor edit to my answer to clarify that. – Michael R. Chernick Mar 27 '17 at 12:16