
Suppose I have $N$ mutually independent random variables $\{X_j\}_{j=1}^N$, and I define $Z = f(X_1,\cdots,X_N)$ for some function $f(\cdot)$. I want to know whether it is possible for $X_j$ to be independent of $X_k$ conditional on $Z$ for every $j\ne k$.

I have asked a similar question for the case $N=2$ and $f(X_1,X_2) = X_1+X_2$, where the statement is false. I want to know whether the statement can be true for general $N$ and some function $f(\cdot)$. Here I assume none of the $X_j$ is a constant random variable.

user1292919

1 Answer

Yes, it's possible.

Suppose the $X_i$ are Rademacher variables ($\pm 1$ with equal probability) and $Z=\sum_i X_i^2$. Then $Z=N$ identically, so it is constant and conditioning on it has no effect.
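As a quick sanity check of the trivial example, we can enumerate all outcomes and confirm that $Z=\sum_i X_i^2$ takes a single value (a small sketch; the choice $N=3$ is arbitrary):

```python
from itertools import product

# With Rademacher X_i in {-1, +1}, each X_i^2 = 1, so Z = sum of squares
# equals N on every outcome. Enumerate {-1, 1}^N to confirm Z is constant.
N = 3
values = {sum(x * x for x in w) for w in product([-1, 1], repeat=N)}
print(values)  # {3}
```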

Less trivially, keep the $X_i$ Rademacher, and suppose $Z=\prod_i X_i$ and $N>2$. For any specific $j, k$, $$P(Z=1\mid X_j,X_k)=1/2,$$ so the distribution of $(X_j,X_k)\mid Z$ is the same as their unconditional distribution, and in particular $X_j$ and $X_k$ are independent given $Z$. In this situation, any $N$ of $(X_1,X_2,\dots,X_N,Z)$ are mutually independent but the full set of $N+1$ variables is not.
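This can be verified by exact enumeration. The sketch below takes $N=3$ (an arbitrary choice) and checks that the joint distribution of $(X_1,X_2)$ given $Z=1$ matches the unconditional product of fair signs:

```python
from itertools import product

# N = 3 i.i.d. Rademacher variables, Z = X_1 * X_2 * X_3.
# All 2^N sign patterns are equally likely (probability 1/8 each).
N = 3
outcomes = list(product([-1, 1], repeat=N))

# Joint distribution of (X_1, X_2) conditional on Z = 1.
z1 = [w for w in outcomes if w[0] * w[1] * w[2] == 1]
cond = {}
for w in z1:
    cond[(w[0], w[1])] = cond.get((w[0], w[1]), 0) + 1 / len(z1)

# Unconditional joint distribution of two independent fair signs.
uncond = {(a, b): 0.25 for a in (-1, 1) for b in (-1, 1)}

print(cond == uncond)  # True: conditioning on Z does not couple X_1 and X_2
```

The same check with $Z=-1$, or with any other pair $(j,k)$, gives the same result by symmetry.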

As a slight extension, suppose $X_i\sim N(0,1)$ and $Z=\prod_i \operatorname{sign}(X_i)$. By the same argument as before, $Z$ is independent of any set of fewer than $N$ of the $X_i$.
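A Monte Carlo sketch of the Gaussian extension (the sample size and $N=4$ are arbitrary choices): if $Z$ is independent of $X_1$, then $P(X_1 > 0 \mid Z = 1)$ should be close to $1/2$.

```python
import random

# X_i ~ N(0,1) i.i.d., Z = prod_i sign(X_i).
# Estimate P(X_1 > 0 | Z = 1); independence predicts a value near 0.5.
random.seed(0)
N, trials = 4, 200_000
hits = total = 0
for _ in range(trials):
    xs = [random.gauss(0, 1) for _ in range(N)]
    z = 1
    for x in xs:
        z *= 1 if x > 0 else -1
    if z == 1:
        total += 1
        if xs[0] > 0:
            hits += 1
print(hits / total)  # should be close to 0.5
```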

I can't think of an example where the $X_i$ are all continuous and $Z$ is a continuous, non-constant function of all of them. I'd be a bit surprised if there was one, but not very surprised.

Thomas Lumley