Intuitively, two random variables $X$ and $Y$ are independent if knowing the value of one of them provides zero information about the other. The same holds for two random vectors $\mathbf{X}=(X_1,X_2,\dots, X_m)$ and $\mathbf{Y}=(Y_1,Y_2,\dots, Y_n)$. But does independence of the vectors also imply componentwise independence? That is, is $X_i$ independent of $Y_j$ for every $1\le i\le m$ and $1\le j\le n$?
2 Answers
Yes, $X_i$ is independent of $Y_j$. To see this, note that if $\mathbf{X}$ and $\mathbf{Y}$ are independent, then for any measurable functions $f$ and $g$, the random variables $f(\mathbf{X})$ and $g(\mathbf{Y})$ are also independent. See the discussion here for a proof of this statement.
So let $f$ be the coordinate projection that picks out the $i$th component of $\mathbf{X}$, that is, $f(\mathbf{X}) = X_i$, and similarly define $g(\mathbf{Y}) = Y_j$. Projections are measurable, so $X_i$ is independent of $Y_j$.
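To spell the projection argument out (a short sketch using the $f$ and $g$ defined above): for any Borel sets $A$ and $B$,
$$
\begin{aligned}
P(X_i \in A,\; Y_j \in B)
&= P\bigl(\mathbf{X} \in f^{-1}(A),\; \mathbf{Y} \in g^{-1}(B)\bigr)\\
&= P\bigl(\mathbf{X} \in f^{-1}(A)\bigr)\, P\bigl(\mathbf{Y} \in g^{-1}(B)\bigr)\\
&= P(X_i \in A)\, P(Y_j \in B),
\end{aligned}
$$
where the middle equality uses the independence of $\mathbf{X}$ and $\mathbf{Y}$.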

It is actually even more general than that.
For real-valued random vectors $X = (X_1,X_2,\dots,X_m)$ and $Y = (Y_1,Y_2,\dots,Y_n)$, independence implies
$$F_{X_1,\dots,X_m,Y_1,\dots,Y_n}(x_1,\dots,x_m,y_1,\dots,y_n)=F_{X_1,\dots,X_m}(x_1,\dots,x_m)\, F_{Y_1,\dots,Y_n}(y_1,\dots,y_n),$$
where $F$ denotes the joint cumulative distribution function of the subscripted variables.
Now, plugging $\infty$ in for any of the arguments marginalizes out the corresponding variables, so we can take any subset of the $X$-components and any subset of the $Y$-components and conclude that those subsets are independent of each other, not only individual component pairs. A concrete instance is worked out below.
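For example (an illustration of the marginalization step, with arbitrarily chosen indices): setting every argument to $\infty$ except $x_1$, $y_1$, and $y_2$ yields
$$F_{X_1,Y_1,Y_2}(x_1,y_1,y_2) = F_{X_1}(x_1)\, F_{Y_1,Y_2}(y_1,y_2),$$
so the component $X_1$ is independent of the pair $(Y_1, Y_2)$.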
