4

Let $\mathbf{X} = (X_1, \dots, X_p)^\top$ and $\mathbf{Y} = (Y_1, \dots, Y_p)^\top$ be independent. Does it then follow that $X_i$ is independent of $Y_j$, i.e. that $\operatorname{cov}(X_i, Y_j) = 0$?

gunes
Abdul Miah
  • Since (obviously) the components are functions of the vectors, the duplicate says it all. Please note that independence and zero covariance are not equivalent: independence implies zero covariance *if the covariance exists* and zero covariance does not generally imply independence. – whuber Feb 12 '19 at 20:25

2 Answers

9

If the two vectors are independent, we have $p(\mathbf{X},\mathbf{Y})=p(\mathbf{X})p(\mathbf{Y})$. Considering a specific pair $X_i$, $Y_j$ and marginalizing out all other components, $$\begin{align}p(X_i,Y_j) &=\int_{X_k,\,k\neq i}\int_{Y_m,\,m\neq j}p(\mathbf{X},\mathbf{Y})\\ &=\int_{X_k,\,k\neq i}\int_{Y_m,\,m\neq j}p(\mathbf{X})p(\mathbf{Y})\\ &=\int_{X_k,\,k\neq i}p(\mathbf{X})\int_{Y_m,\,m\neq j}p(\mathbf{Y}) \\ &=p(X_i)p(Y_j).\end{align}$$ So they are independent, which implies $\operatorname{cov}(X_i,Y_j)=0$ (provided the covariance exists). But having $\operatorname{cov}(X_i,Y_j)=0$ does not mean that the two are independent, as you asked.
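A quick simulation illustrates both directions of this. The example below (a sketch; the specific covariance matrices are arbitrary choices) draws two independent Gaussian vectors whose *internal* components are correlated, checks that a cross-component sample covariance is near zero, and then shows the converse failing with the classic pair $U$ and $U^2$, which have zero covariance but are clearly dependent.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Two independent random vectors; components *within* each vector are correlated.
X = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=n)
Y = rng.multivariate_normal([0, 0], [[1.0, -0.5], [-0.5, 1.0]], size=n)

# Cross-covariance between any component of X and any component of Y is ~ 0.
cov_x1_y2 = np.cov(X[:, 0], Y[:, 1])[0, 1]
print(f"cov(X_1, Y_2) = {cov_x1_y2:.4f}")  # close to 0

# Converse fails: cov(U, U^2) = E[U^3] = 0 for U ~ N(0,1), yet U^2 is a
# deterministic function of U, so they are certainly not independent.
U = rng.standard_normal(n)
V = U**2
print(f"cov(U, U^2)  = {np.cov(U, V)[0, 1]:.4f}")  # also close to 0
```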

gunes
7

In addition to the answer by @gunes, here it is better to use the definitions directly. Two random variables (or vectors, as in this case) $\mathbf{X}, \mathbf{Y}$ are independent if all events determined by $\mathbf{X}$ are independent of all events determined by $\mathbf{Y}$.$^\dagger$

But an event determined by $X_i$ is certainly (indirectly) determined by $\mathbf{X}$. So the conclusion follows directly from the definition, without any need for integration or summation.
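That inclusion can be spelled out in one line (a standard fact about generated $\sigma$-algebras, stated here for completeness): for any Borel set $B \subseteq \mathbb{R}$, writing $\pi_i$ for the $i$-th coordinate projection,
$$\{X_i \in B\} = \{\mathbf{X} \in \pi_i^{-1}(B)\} \in \sigma(\mathbf{X}),$$
and likewise $\{Y_j \in C\} \in \sigma(\mathbf{Y})$ for Borel $C$, so every such pair of events is independent by assumption.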

$^\dagger$ an event determined by $\mathbf{X}$ means *a member of the $\sigma$-algebra generated by $\mathbf{X}$*.

kjetil b halvorsen