
I've seen the statement:

It's possible that random variables $X_i, X_j$ are independent for $i \neq j$, but $X_1, X_2, X_3$ are dependent.

I haven't been able to find an example of this, though.

Any examples?

  • This question has been answered many times on stats.se. Search for "pairwise independent" random variables. – Dilip Sarwate Nov 08 '15 at 12:46
  • Actually, I must withdraw my comment above. This question has not been explicitly asked previously, but some answers to other, related, questions (e.g. [this one](http://stats.stackexchange.com/q/21429/6633) and [this other one](http://stats.stackexchange.com/q/173192/6633)) do discuss this issue. – Dilip Sarwate Nov 09 '15 at 14:51
  • I gave an example which is essentially the same one as in the answer by JohnK. I have edited the answer to include another example; _standard normal random variables_ that are _pairwise_ independent but _not_ mutually independent: their joint density is _not_ the product of the univariate marginal densities. – Dilip Sarwate Nov 09 '15 at 22:58
  • $X_1,X_2\sim\text{i.i.d.}$ and each equal to $0$ or $1$ with equal probabilities, and $X_3$ is the mod-$2$ sum of $X_1$ and $X_2.$ Then $X_1, X_3 \sim \text{i.i.d.}$ and $X_2,X_3\sim\text{i.i.d.}$ but obviously $X_1, X_2, X_3$ are not independent. – Michael Hardy Nov 10 '18 at 16:36

2 Answers


Here is an example of this, attributed to S. Bernstein.

Let $X_1, X_2, X_3$ have the joint pmf

$$p\left(x_1, x_2, x_3 \right) =\begin{cases} \frac{1}{4} & \left(x_1, x_2, x_3 \right) \in \left\{ (1,0,0), (0,1,0), (0,0,1), (1,1,1) \right\} \\ 0 & \text{otherwise} \end{cases}$$

Then, by summing out the third variable, it is easy to see that the joint pmf of $X_i$ and $X_j$ for $i\neq j$ is

$$p_{ij} (x_i, x_j)= \begin{cases} \frac{1}{4} & (x_i, x_j) \in \left\{ (0,0), (1,0), (0,1), (1,1) \right\} \\ 0 & \text{otherwise} \end{cases} $$

Finally, the marginal pmf of $X_i$ is

$$p_i(x_i) = \begin{cases} \frac{1}{2} & x_i= 0, 1 \\ 0 & \text{otherwise} \end{cases}$$

Now, note that for $i\neq j$

$$p_{ij} (x_i, x_j) =p_i (x_i) p_j (x_j)$$

and thus $X_i$ and $X_j$ are independent. However

$$p(x_1, x_2, x_3) \neq p_1 (x_1) p_2 (x_2) p_3 (x_3)$$

and so $X_1, X_2, X_3$ are not independent. Thus pairwise independence does not imply mutual independence. The latter is a stronger condition and it's usually the one we use with random samples.
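
If you want to double-check the arithmetic by brute force, here is a minimal Python sketch that enumerates the pmf and verifies both factorization claims (the helpers `marginal` and `pair` are just ad hoc names, not from any library):

```python
from itertools import product

# The four equally likely outcomes of (X1, X2, X3) in Bernstein's example.
support = {(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)}
joint = {t: (0.25 if t in support else 0.0) for t in product((0, 1), repeat=3)}

def marginal(i):
    # pmf of the single coordinate X_i
    return {v: sum(p for t, p in joint.items() if t[i] == v) for v in (0, 1)}

def pair(i, j):
    # joint pmf of the pair (X_i, X_j)
    return {(a, b): sum(p for t, p in joint.items() if t[i] == a and t[j] == b)
            for a, b in product((0, 1), repeat=2)}

# Pairwise independence: p_ij(a, b) == p_i(a) * p_j(b) for every pair i != j.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    pij, pi, pj = pair(i, j), marginal(i), marginal(j)
    assert all(abs(pij[a, b] - pi[a] * pj[b]) < 1e-12 for (a, b) in pij)

# No mutual independence: p(1, 1, 0) = 0, but the product of marginals is 1/8.
p1, p2, p3 = marginal(0), marginal(1), marginal(2)
assert joint[(1, 1, 0)] == 0.0 and p1[1] * p2[1] * p3[0] == 0.125
print("pairwise independent, but not mutually independent")
```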

  • Why is the joint p.m.f. $p(x_1,x_2,x_3)$ not $\frac{1}{2}\frac{1}{2}\frac{1}{2}=\frac{1}{8}$? If one tried to "induce" it from the single-variable p.m.f.s? Also, what does "summing out" mean? – mavavilj Nov 08 '15 at 17:27
  • @mavavilj It's not $1/8$ because there are only 4 equally likely tuples. It would have been $1/8$ if we allowed all possible configurations, in which case there would be $2^3$ outcomes. – JohnK Nov 08 '15 at 17:30
  • But they aren't coherent then? I mean, if the image of $p_i(x_i)$ is $\{0,1\}$ and the image of $p_{ij}(x_i,x_j)$ is $\{0,1\} \times \{0,1\}$, then why is the image of $p_{ijk}(x_i,x_j, x_k)$ not $\{0,1\} \times \{0,1\}\times \{0,1\}$? I'm thinking in the "inductive" way here. – mavavilj Nov 08 '15 at 17:37
  • @mavavilj It doesn't have to be like that. You are not taking the dependence into account. – JohnK Nov 08 '15 at 17:45
  • If one starts from $p_i(x_i)$ and induces, then is there a dependence? – mavavilj Nov 08 '15 at 17:50
  • @mavavilj It seems to me that you are confused about the concept of a "marginal" distribution. I would urge you to study the relation between joint and marginal distributions a little more. – JohnK Nov 08 '15 at 17:53

Let $X$ and $Y$ be independent Bernoulli$(\frac 12)$ random variables and set $Z= X+Y-2XY$ (the mod-$2$ sum of $X$ and $Y$). These are three random variables that are pairwise independent but not mutually independent.

It is easy to show that $Z$ is also Bernoulli$(\frac 12)$ and that $(X,Z)$ and $(Y,Z)$ are pairs of independent random variables, and of course, $(X,Y)$ is a pair of independent random variables by assumption. (If you feel too lazy to carry this out for yourself, note that the answer by @JohnK essentially uses $X_1=X, X_2=Y, X_3 = 1-Z$.) Thus, $X,Y,Z$ are said to be pairwise independent random variables. However, for $X,Y,Z$ to be called mutually independent random variables, their joint probability mass function must factor into the product of the individual (marginal) probability mass functions: if $X, Y, Z$ take on values in the sets $\{x_i\}, \{y_j\}, \{z_k\}$ respectively, then for all choices of $x_i, y_j, z_k$ we must have $$P\{X=x_i, Y=y_j, Z = z_k\} = P\{X=x_i\}P\{Y = y_j\}P\{Z = z_k\}.$$

In the example above, it is easy to verify that $$P\{X=1,Y=1,Z=1\} = 0 \neq \frac 18 = P\{X=1\}P\{Y=1\}P\{Z=1\}$$ and so $X,Y,Z$ cannot be called mutually independent random variables.
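
A quick simulation makes the same point empirically. Here is a rough sketch, assuming `numpy` and an arbitrary seed, that estimates the pairwise joint frequencies and the probability of the all-ones outcome:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

x = rng.integers(0, 2, size=n)   # X ~ Bernoulli(1/2)
y = rng.integers(0, 2, size=n)   # Y ~ Bernoulli(1/2), independent of X
z = x + y - 2 * x * y            # Z = X + Y - 2XY, the mod-2 sum of X and Y

# Each pair hits (0,0), (0,1), (1,0), (1,1) with frequency close to 1/4,
# exactly what independence of two Bernoulli(1/2) variables predicts.
for a, b in [(x, y), (x, z), (y, z)]:
    print([round(np.mean((a == i) & (b == j)), 3) for i in (0, 1) for j in (0, 1)])

# But the outcome (X, Y, Z) = (1, 1, 1) never occurs, although mutual
# independence would give it probability 1/8.
print(np.mean((x == 1) & (y == 1) & (z == 1)))   # prints 0.0
```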


Lest you think that it is necessary to use discrete random variables to have examples such as the one above, consider three standard normal random variables $X,Y,Z$ whose joint probability density function $f_{X,Y,Z}(x,y,z)$ is not $\phi(x)\phi(y)\phi(z)$ where $\phi(\cdot)$ is the standard normal density, but rather

$$f_{X,Y,Z}(x,y,z) = \begin{cases} 2\phi(x)\phi(y)\phi(z) & ~~~~\text{if}~ x \geq 0, y\geq 0, z \geq 0,\\ & \text{or if}~ x < 0, y < 0, z \geq 0,\\ & \text{or if}~ x < 0, y\geq 0, z < 0,\\ & \text{or if}~ x \geq 0, y< 0, z < 0,\\ 0 & \text{otherwise.} \end{cases}\tag{1}$$ Note that $X$, $Y$, and $Z$ are not a set of three jointly normal random variables but as will be described below, any two of these is indeed a pair of independent normal random variables.

We can calculate the joint density of any pair of the random variables (say, $X$ and $Z$) by integrating out the joint density with respect to the unwanted variable, that is, $$f_{X,Z}(x,z) = \int_{-\infty}^\infty f_{X,Y,Z}(x,y,z)\,\mathrm dy. \tag{2}$$

  • If $x \geq 0, z \geq 0$ or if $x < 0, z < 0$, then $f_{X,Y,Z}(x,y,z) = \begin{cases} 2\phi(x)\phi(y)\phi(z), & y \geq 0,\\ 0, & y < 0,\end{cases}$ and so $(2)$ reduces to $$f_{X,Z}(x,z) = \phi(x)\phi(z)\int_{0}^\infty 2\phi(y)\,\mathrm dy = \phi(x)\phi(z). \tag{3}$$

  • If $x \geq 0, z < 0$ or if $x < 0, z \geq 0$, then $f_{X,Y,Z}(x,y,z) = \begin{cases} 2\phi(x)\phi(y)\phi(z), & y < 0,\\ 0, & y \geq 0,\end{cases}$ and so $(2)$ reduces to $$f_{X,Z}(x,z) = \phi(x)\phi(z)\int_{-\infty}^0 2\phi(y)\,\mathrm dy = \phi(x)\phi(z). \tag{4}$$

In short, $(3)$ and $(4)$ show that $f_{X,Z}(x,z) = \phi(x)\phi(z)$ for all $x, z \in (-\infty,\infty)$ and so $X$ and $Z$ are (pairwise) independent standard normal random variables. Similar calculations (left as an exercise for the bemused reader) show that $X$ and $Y$ are (pairwise) independent standard normal random variables, and $Y$ and $Z$ also are (pairwise) independent standard normal random variables. But $X,Y,Z$ are not mutually independent normal random variables. Nor are the three of them together a set of jointly normal random variables. Indeed, their joint density $f_{X,Y,Z}(x,y,z)$ does not equal the product $\phi(x)\phi(y)\phi(z)$ of their marginal densities for any choice of $x, y, z \in (-\infty,\infty)$.
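
For a numerical illustration, one way to sample from the density in $(1)$ is to draw independent half-normal magnitudes, choose the signs of $X$ and $Y$ uniformly at random, and set the sign of $Z$ to their product; on each of the four allowed orthants this gives density $2\phi(x)\phi(y)\phi(z)$, and $0$ elsewhere. A rough Monte Carlo sketch with `numpy` (illustrative, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Half-normal magnitudes for |X|, |Y|, |Z|; random signs for X and Y;
# the sign of Z is forced to be the product of the other two signs,
# which puts all the mass on the four orthants allowed by (1).
mag = np.abs(rng.standard_normal((3, n)))
sx = rng.choice([-1, 1], size=n)
sy = rng.choice([-1, 1], size=n)
sz = sx * sy
x, y, z = sx * mag[0], sy * mag[1], sz * mag[2]

# Checks consistent with each pair being two independent N(0,1) variables:
print(round(np.corrcoef(x, z)[0, 1], 3))                    # close to 0
print(round(np.mean((x > 0) & (z > 0)), 3))                 # close to 1/4
print(round(np.mean(x > 0), 3), round(np.mean(z > 0), 3))   # close to 1/2 each

# ... but the triple is not mutually independent: the event
# {X > 0, Y > 0, Z < 0} has probability 0 instead of 1/8.
print(np.mean((x > 0) & (y > 0) & (z < 0)))                 # prints 0.0
```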

  • Incredible and clear example with the normal distributions, thanks for sharing! It has a very intuitive geometric picture as well - for me it was easy to imagine why this example worked by observing how lines parallel to the coordinate axes hit the support of the density. I think it also suggests how to construct similar examples for $n$ normal random variables, with $(n-1)$-wise independence, but not full independence ... though it's not clear to me what Boolean logical formula to use to pick the support. Any suggestions? – Elle Najt May 01 '17 at 05:31