
Let X, Y and Z be three independent random variables. If X/Y has the same distribution as Z, is it true that X has the same distribution as YZ?

Juho Kokkala
user2808118
  • No. Consider the case of $X$ and $Y$ being standard normal and $Z$ a standard Cauchy random variable (with all three being independent as per the premise of the question). It is well-known that $X/Y$ has a standard Cauchy distribution (the same as that of $Z$), but $YZ$ does not have a standard normal distribution (since $E[YZ]$ does not exist). So you do need additional restrictions on $X, Y, Z$ (cf. Silverfish's answer) to have any hope of finding examples where the result might hold. – Dilip Sarwate Jan 15 '15 at 14:46 *(A simulation sketch of this example appears after these comments.)*
  • @Dilip I considered using that as my counterexample but shied away from it because I couldn't think of a brief explanation of why $E[YZ]$ doesn't exist. If you've got a neat argument, you should post it as an answer I think. (As you can probably tell, I quite deliberately avoided zeroes and infinities in my answer, so I was very keen to avoid something that isn't even infinite!) – Silverfish Jan 15 '15 at 14:56
  • @Silverfish Will the following work? "Since $Y$ and $Z$ are independent, $E[YZ]$, if it exists, must equal $E[Y]E[Z]$ (provided $E[Y]$ and $E[Z]$ exist)." – Dilip Sarwate Jan 15 '15 at 15:17
  • @Dilip Since $Z$ is Cauchy, so $E[Z]$ doesn't exist, it seems to me the proviso is not met and the statement says nothing about $E[YZ]$. For comparison: if $Z$ is Cauchy and $Y$ has the degenerate distribution $P(Y=0)=1$, then it would appear $E[YZ]$ exists (and equals zero) even though $E[Z]$ doesn't. – Silverfish Jan 15 '15 at 15:23
  • One of the simplest, and perhaps most intuitive, possible counterexamples is to let $X=1$ and $Y$ be any distribution with some chance of not being in $\{-1,0,1,\pm\infty\}$ (since $\pm 1$ are the fixed points of $y\to 1/y$ and $0,\infty,$ and $-\infty$ are problematic in the definition of $X/Y$ in any event). Then $YZ$ obviously is not constant while $X$ is. – whuber Jan 15 '15 at 16:11
  • @Silverfish $E[YZ]$ is defined only if $E[|YZ|]$ is finite. But, $E[|YZ|]=E[|Y|\cdot |Z|]=E[|Y|]E[|Z|]$ since $|Y|$ and $|Z|$ are independent random variables. But, since $E[|Z|]$ is not finite and $E[|Y|]>0$, we conclude that $E[|YZ|]$ is not finite (there are no issues about the value of $0\times\infty$). Consequently, $E[YZ]$ is not defined (or does not exist) whereas $E[X]$ very definitely does exist and has value $0$. – Dilip Sarwate Jan 15 '15 at 19:25
  • @Dilip Very neat! And worthy of an answer. (For any later readers: Dilip's point negates my counterexample to his prior post. $Y\sim N(0,\sigma^2) \implies E(|Y|)=\sigma \sqrt{2/\pi}>0$ - see [half-normal distribution](http://en.wikipedia.org/wiki/Half-normal_distribution), but what matters is it's >0. I had $P(Y=0)=1$ so $E(|Y|)=0$ which puts us in $0 \times \infty$ territory. For more on moment existence, see [Cardinal's great answer](http://stats.stackexchange.com/q/32706/22228) or this [handy moments cribsheet](http://www2.cirano.qc.ca/~dufourj/Web_Site/ResE/Dufour_2008_C_TS_Moments.pdf)) – Silverfish Jan 15 '15 at 22:54
  • @whuber I like that. I don't even think it's necessary to rule out $Y=\pm 1$. Suppose $P(X=1)=1$ and that $Y$ has positive probability of taking two finite non-zero values, $a$ and $b$. Then $X/Y$ has positive probability of taking $a^{-1}$ and $b^{-1}$ (also finite and non-zero) and so must $Z$. Hence $YZ$ has positive probability of taking the value $ab^{-1} \neq 1$. – Silverfish Jan 15 '15 at 23:27
  • @Silver Thank you. I just didn't want readers to even have to think about such cases, so it was simpler (in the limited space provided) to rule out any value that might seem to cause problems. My goal was to give a counterexample that was immediately obvious. – whuber Jan 16 '15 at 00:01
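
To make the normal/Cauchy counterexample from the comments above concrete, here is a minimal simulation sketch (my own illustration, not part of the original thread; it assumes NumPy, and compares sample quartiles because sample means are useless for Cauchy-tailed quantities):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.standard_normal(n)   # X ~ N(0, 1)
y = rng.standard_normal(n)   # Y ~ N(0, 1)
z = rng.standard_cauchy(n)   # Z ~ Cauchy(0, 1)

# X/Y is standard Cauchy, like Z: the quartiles of a standard Cauchy
# are -1, 0, 1, and both sample sets should be close to them.
print(np.percentile(x / y, [25, 50, 75]))  # roughly [-1, 0, 1]
print(np.percentile(z, [25, 50, 75]))      # roughly [-1, 0, 1]

# But YZ is nothing like the standard normal X: its tails are far too
# heavy, reflecting the fact that E[YZ] does not even exist.
print(np.percentile(np.abs(x), 99))      # about 2.6 for N(0, 1)
print(np.percentile(np.abs(y * z), 99))  # much larger
```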

1 Answer


It can happen: for instance, if $X$, $Y$ and $Z$ are independent Rademacher variables, i.e. each takes the values $1$ or $-1$ with equal probability. In this case $X/Y$ is also Rademacher, so it has the same distribution as $Z$, while $YZ$ is Rademacher, so it has the same distribution as $X$.
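
As a sanity check, the Rademacher case can be verified by exhaustive enumeration. Here is a minimal sketch (the code and its variable names are mine, not part of the original answer):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# X, Y, Z independent Rademacher: each is +1 or -1 with probability 1/2,
# so each of the 8 triples (x, y, z) has probability 1/8.
support = [1, -1]
p = Fraction(1, 8)

dist_X_over_Y = Counter()
dist_YZ = Counter()
for x, y, z in product(support, repeat=3):
    dist_X_over_Y[Fraction(x, y)] += p
    dist_YZ[y * z] += p

print(dist_X_over_Y)  # 1 and -1 each with probability 1/2: the law of Z
print(dist_YZ)        # 1 and -1 each with probability 1/2: the law of X
```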

But it won't happen in general. So long as the means exist, necessary (but not sufficient) conditions for $X/Y$ to have the same distribution as $Z$, and for $YZ$ to have the same distribution as $X$, would be: $$\mathbb{E}(Z) = \mathbb{E}(XY^{-1}) = \mathbb{E}(X)\mathbb{E}(Y^{-1})$$ $$\mathbb{E}(X) = \mathbb{E}(YZ) = \mathbb{E}(Y)\mathbb{E}(Z)$$

The second equalities follow from independence. Substituting gives: $$\mathbb{E}(Z) = \mathbb{E}(Y) \mathbb{E}(Z) \mathbb{E}(Y^{-1})$$

If $\mathbb{E}(Z) \neq 0$ then $1 = \mathbb{E}(Y) \mathbb{E}(Y^{-1})$, or equivalently, so long as $\mathbb{E}(Y) \neq 0$,

$$\mathbb{E}(Y^{-1}) = \frac{1}{\mathbb{E}(Y)}$$

This is not true in general. For example, let $Y$ be a translated Bernoulli variable which takes values $1$ or $2$ with equal probability, so $\mathbb{E}(Y)=1.5$. Then $Y^{-1}$ takes values $1$ or $0.5$ with equal probability, so $\mathbb{E}(Y^{-1})=0.75 \neq 1.5^{-1}$. (I leave it to the reader's imagination how dramatic an effect it would have had to use an untranslated Bernoulli variable instead, or one translated only slightly so that it is very close to $0$ with probability one half. Note that in the Rademacher example there was no problem here because all three expectations were zero; note further that this condition isn't a sufficient one.)
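
A quick numeric check of this example, using exact rational arithmetic (a sketch of my own, not from the original answer):

```python
from fractions import Fraction

# Y takes the values 1 and 2 with equal probability.
half = Fraction(1, 2)
E_Y = half * 1 + half * 2                                 # 3/2
E_inv_Y = half * Fraction(1, 1) + half * Fraction(1, 2)   # 3/4

print(E_Y, E_inv_Y, 1 / E_Y)  # 3/2, 3/4, 2/3: so E(1/Y) != 1/E(Y)
```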

We can explore how this $Y$ fails by constructing a more explicit counterexample. To keep things simple, suppose $X$ is a scaled Bernoulli variable which takes values $0$ or $2$ with equal probability. Then $X/Y$ is either $0/1$, $0/2$, $2/1$ or $2/2$ with equal probability. It's clear that $P(X/Y=0)=\frac{1}{2}$, $P(X/Y=1)=\frac{1}{4}$ and $P(X/Y=2)=\frac{1}{4}$. Let $Z$ be an independent variable drawn from the same distribution. What is the distribution of $YZ$? Is it the same as the distribution of $X$? We don't even have to work out the full probability distribution to see that it can't be: it suffices to remember that $X$ can only be zero or two, while $YZ$ can take any value obtained by multiplying one of $\{1,2\}$ by one of $\{0,1,2\}$.
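
The same kind of exhaustive enumeration makes this counterexample fully explicit. The sketch below (mine, not part of the original answer) computes the exact law of $X/Y$, takes $Z$ to be an independent copy of it, and then computes the law of $YZ$:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

half, quarter = Fraction(1, 2), Fraction(1, 4)

# X in {0, 2} and Y in {1, 2}, independent, each pair with probability 1/4.
dist_X_over_Y = Counter()
for x, y in product([0, 2], [1, 2]):
    dist_X_over_Y[Fraction(x, y)] += quarter
print(dist_X_over_Y)  # P(0) = 1/2, P(1) = 1/4, P(2) = 1/4

# Z is an independent copy of the law of X/Y; Y is as before.
dist_YZ = Counter()
for (z, pz), y in product(dist_X_over_Y.items(), [1, 2]):
    dist_YZ[y * z] += pz * half
print(dist_YZ)  # P(0) = 1/2, P(1) = 1/8, P(2) = 1/4, P(4) = 1/8:
                # support {0, 1, 2, 4}, not {0, 2}, so YZ can't have the law of X
```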

If you want a moral for this tale, then try playing around with scaled and translated Bernoulli variables (which include Rademacher variables). They can be a simple way to construct examples - and counterexamples. It helps to have fewer values in the supports, so that the distributions of various functions of the variables can easily be worked out by hand.

More extreme still, we can consider degenerate variables, which have only a single value in their support. If $X$ and $Y$ are degenerate (with $Y\neq 0$) then $Z=X/Y$ will be too, and the distribution of $YZ$ will match that of $X$. Like my Rademacher example, this is a situation showing your conditions can be satisfied. If instead, as @whuber suggests in the comments, we let $X$ be degenerate with $P(X=1)=1$, but allow $Y$ to vary, then constructing an even simpler counterexample is easy. If $Y$ can take two finite, non-zero values - $a$ and $b$, say - with positive probability, then $X/Y$, and hence $Z$, can take the values $a^{-1}$ and $b^{-1}$ with positive probability. Then $YZ$ has $ab^{-1} \neq 1$ in its support, so it can't follow the same distribution as $X$. This is similar to, but simpler than, my argument that the supports couldn't match in my original counterexample.
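
The degenerate counterexample can be enumerated the same way. In the sketch below (my own illustration) I take the particular values $a=2$ and $b=3$, which are arbitrary:

```python
from collections import Counter
from fractions import Fraction

# X = 1 with probability 1; Y is a or b, each with probability 1/2.
a, b = 2, 3
half = Fraction(1, 2)
dist_Z = {Fraction(1, a): half, Fraction(1, b): half}  # the law of X/Y

dist_YZ = Counter()
for y in (a, b):
    for z, pz in dist_Z.items():
        dist_YZ[y * z] += half * pz
print(dist_YZ)  # P(1) = 1/2, P(2/3) = 1/4, P(3/2) = 1/4: not degenerate at 1
```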

Silverfish
  • I hope the downvoting reflects dissatisfaction with my exposition, rather than there being a flaw in my reasoning. If anyone spots an error, let me know. – Silverfish Jan 15 '15 at 15:12
  • $\newcommand\E{\mathbb{E}}$Suppose that $\Pr(Y > 0) = 1$. Then, since $1/x$ is a convex function on $(0, \infty)$, Jensen's inequality tells us that the condition $\E Y = \E \frac{1}{Y}$ holds only if $Y$ is degenerate. The same is true if $\Pr(Y < 0) = 1$, in which case $1/x$ is concave. So if $Y$ is of fixed sign but not degenerate, the necessary condition cannot hold. – Danica Jan 15 '15 at 23:37 *(A numerical illustration appears after these comments.)*
  • @Dougal Thanks for mentioning this. When writing up, I thought about including it but felt the discussion of signs etc would break the flow. I thought about just saying "see Jensen's inequality" and adding a Wikipedia or similar link, but then decided that wasn't a good idea because I hadn't prefaced it by the convexity conditions I was trying to avoid. Instead, I had a look to see whether there is somewhere (maybe a CV thread) where expectation of non-linear functions of a RV is discussed in general, which would naturally lead a curious reader to Jensen, but I didn't spot anything I like yet. – Silverfish Jan 15 '15 at 23:44
  • @Dougal This is one of those times there's a bit of a clash between beautifully simple counterexamples - something very easily computed, so someone who laboured under a misapprehension can immediately see it is impossible or incorrect - and a more thorough, general treatment which actually helps show under what conditions something might actually hold (but which may be too hard for some readers to follow, and therefore less convincing to them). The RV on $\{1,2\}$ shows even a beginner why $E(1/Y)$ doesn't work as nicely as $E(aY+b)$ but Jensen says much more about why! – Silverfish Jan 15 '15 at 23:52
  • Yep, good point, though I am curious about conditions on when this (seemingly natural) relation can hold, which seem to be quite limited. Note that in my comment above, I miswrote the condition: it of course should be $\frac{1}{\E Y} = \E \frac{1}{Y}$. – Danica Jan 15 '15 at 23:56
  • @Dougal I think beyond degenerate RVs such relationships are not as "natural" as they first appear. Consider $Z$ has same distribution as $X+Y$ and $Y$ has same distribution as $Z-X$, and all three are independent ... Again it doesn't hold in general. – Silverfish Jan 18 '15 at 14:53
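
A numerical illustration of the Jensen's inequality point from the comments above (my sketch; the uniform distribution for $Y$ is an arbitrary choice of a positive, non-degenerate variable, and the code assumes NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)

# For a positive, non-degenerate Y, Jensen gives E(1/Y) > 1/E(Y) strictly.
y = rng.uniform(1.0, 2.0, size=1_000_000)  # Y ~ Uniform(1, 2), so Y > 0
print(1 / y.mean())    # about 0.667 (= 1/1.5)
print((1 / y).mean())  # about 0.693 (= ln 2), strictly larger

# A degenerate Y achieves equality.
y_deg = np.full(10, 1.5)
print(1 / y_deg.mean(), (1 / y_deg).mean())  # both 0.6667
```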