
Let $X$ and $Y$ be two independent real-valued random variables, each with the same fixed distribution $\mathcal{D}$. Let $Z = XY$ be their product.

For which choices of $\mathcal{D}$ is $Z$ uncorrelated with the individual variables $X, Y$, so $\rho_{Z, X} = \frac{\operatorname{cov}(Z, X)}{\sigma_{Z} \sigma_{X}} = 0$?

As an example, for $X, Y \sim \mathcal{N}(0,1)$

> N <- 1000000
> set.seed(1)
> x <- rnorm(N)
> y <- rnorm(N)
> xy <- x*y
> cor(xy, x)
[1] -0.0001072026

this seems to be the case.

I would be grateful for pointers to (i) a derivation of this for the Gaussian case and (ii) any results regarding the characteristics of the density that determine whether this independence holds.

jmb
  • If the distribution is centred at zero and symmetric around zero, with $X$ and $Y$ independent, then $XY$ and $X$ are uncorrelated, but not necessarily independent. – Xi'an Apr 12 '18 at 11:38
  • @Xi'an thank you! do you have a reference by any chance? – jmb Apr 12 '18 at 11:39
  • It's clearly not the case that the product of two independent Gaussians is independent of either one: the variance of the product, conditional on one of its factors, obviously changes. Your question is confusing because although you use the term "independent," your formula expresses a condition of *zero correlation.* Which one are you trying to ask about? – whuber Apr 12 '18 at 13:45
  • There is no derivation for Gaussian variables because the result is rarely true for them. However, when $E(X)=E(Y)=0$ the result will always be (trivially) true, regardless of the distributions of $X$ and $Y$. – whuber Apr 12 '18 at 14:25

1 Answer


For iid random variables $X$ and $Y$ with finite variance, set $Z = XY$, and note that $Z$ also has finite variance, as can be deduced, for example, from the formulas in this question and its answers. Consequently, $\rho(Z,X)$ equals $0$ if and only if $\operatorname{cov}(Z,X)$ equals $0$. But
\begin{align}
\operatorname{cov}(Z,X) &= E[ZX]-E[Z]E[X]\\
&= E[X^2Y]-E[XY]E[X]\\
&= E[X^2]E[Y]-E[X]E[Y]E[X] &\scriptstyle{X,Y}~\text{independent}\implies X^2,Y~\text{also independent}\\
&= \left(E[X^2]-(E[X])^2\right)E[Y]\\
&= \sigma_X^2E[Y]\\
&= 0 ~~\text{if and only if}~~ \sigma_X^2 = 0 ~\text{or}~ E[Y]=0.
\end{align}
Similarly, $\operatorname{cov}(Z,Y)=0$ if and only if $\sigma_Y^2 = 0$ or $E[X]=0$. Of course, since $X$ and $Y$ are identically distributed, $\sigma_X^2=\sigma_Y^2$ and $E[X]=E[Y]$.

In short, the condition that @whuber's comment calls "(trivially) true", namely $E[X]=E[Y]=0$, is not just sufficient but also necessary, except in the special case when the common distribution has zero variance: then $X$ and $Y$ both almost surely equal the same constant (their common expected value), and that constant need not be $0$.
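As a quick numerical check of the covariance formula, here is a sketch in R; the choice of a $\mathcal{N}(2,1)$ distribution is arbitrary, simply something with a nonzero mean:

> set.seed(1)
> N <- 1000000
> x <- rnorm(N, mean = 2)   # sigma_X^2 = 1, E[X] = E[Y] = 2
> y <- rnorm(N, mean = 2)
> z <- x*y
> cov(z, x)                 # should be close to var(x)*mean(y), i.e. about 2
> cor(z, x)                 # clearly bounded away from 0

With the means set back to zero, the same script reproduces the near-zero correlation seen in the question.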

Dilip Sarwate
  • You don't seem to need symmetric distributions for your proof. However, Xi'an mentioned symmetric distributions as a requirement above in the comments and also this paper https://amstat.tandfonline.com/doi/pdf/10.1080/10691898.2011.11889620 (page 8) says that cor(X, XY) = 0 is only guaranteed for symmetric distributions X, Y (however, I cannot access the reference they provide for that). Could you comment on that? – jmb May 09 '18 at 17:17
  • @jmb Symmetric distribution is a _sufficient_ condition, and not a necessary one. It implies, among other things, that $E[X]=E[Y]=0$, which is a _weaker_ sufficient condition, as pointed out by whuber in a comment on your question. I too do not have access to the reference you cite, and so cannot say why the author(s) make that claim. – Dilip Sarwate May 09 '18 at 19:08
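A small simulation sketch in R supporting the point in the preceding comment, using a centred exponential, which has mean zero but is not symmetric (the specific distribution is an arbitrary choice for illustration):

> set.seed(1)
> N <- 1000000
> x <- rexp(N) - 1   # mean 0, variance 1, but skewed
> y <- rexp(N) - 1
> cor(x*y, x)        # close to 0 despite the asymmetry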