
Suppose I have $X_1,\ldots,X_k$ independent of each other. I also have $Y_1,\ldots,Y_p$, which are independent of each other. If each of $X_1,\ldots,X_k$ is independent of each of $Y_1,\ldots,Y_p$, how can I formally prove that $\sum_{i=1}^k X_i$ is independent of each $Y_j$?

Intuitively, this must be true: the realized value of any $Y_j$ carries no information about the value of any $X_i$, so it should also carry no information about $\sum_{i=1}^k X_i$. How can this be formally proved?

T34driver

1 Answer


Counterexample: Suppose that $X_1$, $X_2$ and $Y$ take the values $$ \begin{array}{ccc} X_1 & X_2 & Y \\ \hline 0 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 1 \\ \end{array} $$ each with equal probability. Then $X_1$ and $X_2$ are both independent of $Y$, and $X_1$ and $X_2$ are also independent of each other. But if $Y=1$, then $X_1+X_2=1$ with probability one, whereas if $Y=0$, $X_1+X_2$ takes the values $0$ and $2$ with equal probability. So $X_1+X_2$ is not independent of $Y$.
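If it helps, here is a minimal Python sketch (not part of the original answer) that enumerates the four equally likely outcomes above, checks the three pairwise independence claims, and confirms that $X_1+X_2$ is not independent of $Y$:

```python
from itertools import product
from fractions import Fraction

# The four equally likely outcomes (X1, X2, Y) from the counterexample.
outcomes = [(0, 0, 0), (1, 1, 0), (1, 0, 1), (0, 1, 1)]
p = Fraction(1, 4)

def prob(event):
    """Probability of the set of outcomes satisfying `event`."""
    return sum(p for o in outcomes if event(o))

# Pairwise independence: P(A=a, B=b) = P(A=a) P(B=b) for every pair of values.
for i, j in [(0, 2), (1, 2), (0, 1)]:          # (X1, Y), (X2, Y), (X1, X2)
    for a, b in product([0, 1], repeat=2):
        joint = prob(lambda o: o[i] == a and o[j] == b)
        marg = prob(lambda o: o[i] == a) * prob(lambda o: o[j] == b)
        assert joint == marg

# But the sum X1 + X2 is not independent of Y:
# P(X1+X2 = 1 and Y = 1) = 1/2, while P(X1+X2 = 1) * P(Y = 1) = 1/4.
joint = prob(lambda o: o[0] + o[1] == 1 and o[2] == 1)
marg = prob(lambda o: o[0] + o[1] == 1) * prob(lambda o: o[2] == 1)
print(joint, marg)   # 1/2 vs 1/4, so X1 + X2 and Y are dependent
```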

Jarle Tufto
  • Thanks a lot! This completely solved my problem. Just one more question: where does my initial intuition go wrong? – T34driver Dec 12 '21 at 02:06