
Let $\alpha$ and $\beta$ be jointly distributed according to a bivariate normal distribution.

Let $A$ and $B$ be jointly distributed according to the standard bivariate normal distribution, where

$A=\frac{\alpha-\mu_\alpha}{\sigma_\alpha}$ and $B=\frac{\beta-\mu_\beta}{\sigma_\beta}$

Is it sufficient to show that $A+B$ and $A-B$ are independent given $\sigma_A=\sigma_B$, in order to establish that $\alpha+\beta$ and $\alpha-\beta$ are independent when $\sigma_\alpha=\sigma_\beta$?

My intuition is that it should be sufficient, since the transformations from $A$ to $\alpha$ and from $B$ to $\beta$ do not depend on each other. However, I can't find a rigorous proof or theory that justifies this. A quick simulation of the setup is below.
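
For concreteness, here is a minimal NumPy sketch of the setup (the means, common standard deviation, and correlation are arbitrary values chosen for illustration): it draws $(\alpha,\beta)$ from a bivariate normal with $\sigma_\alpha=\sigma_\beta$, standardizes to $(A,B)$, and checks the sample correlations of the sum and difference in both cases.

```python
import numpy as np

rng = np.random.default_rng(0)

mu_a, mu_b = 1.0, -2.0   # means of alpha and beta (arbitrary)
sigma = 3.0              # common standard deviation: sigma_alpha = sigma_beta
rho = 0.6                # correlation between alpha and beta (arbitrary)

# Covariance matrix of (alpha, beta) with equal variances.
cov = np.array([[sigma**2, rho * sigma**2],
                [rho * sigma**2, sigma**2]])
alpha, beta = rng.multivariate_normal([mu_a, mu_b], cov, size=1_000_000).T

# Standardize to obtain (A, B), which follow the standard bivariate normal.
A = (alpha - mu_a) / sigma
B = (beta - mu_b) / sigma

# Both pairs of sums and differences should be (close to) uncorrelated.
print(np.corrcoef(A + B, A - B)[0, 1])                 # ~ 0
print(np.corrcoef(alpha + beta, alpha - beta)[0, 1])   # ~ 0
```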

Carol Eisen
  • Is this question [self-study](https://stats.stackexchange.com/tags/self-study/info)? In any case, [this answer](https://stats.stackexchange.com/a/261381) may help. – GeoMatt22 Nov 14 '17 at 18:31

1 Answer


If $\alpha$ and $\beta$ are independent random variables, then $A$ and $B$ are also independent standard normal random variables, regardless of the variances of $\alpha$ and $\beta$; consequently $A+B$ and $A-B$ are independent as well. But, as you correctly observe, if $\alpha$ and $\beta$ are independent normal random variables, we cannot simply assert that $\alpha+\beta$ and $\alpha-\beta$ are independent, because $$\operatorname{cov}(\alpha+\beta, \alpha-\beta) = \sigma_\alpha^2 - \sigma_\beta^2 \neq 0 ~~ \text{unless}~~ \sigma_\alpha = \sigma_\beta.$$
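
A quick numerical check of that covariance identity (a minimal NumPy sketch; the standard deviations and sample size are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

sigma_alpha, sigma_beta = 2.0, 1.0   # deliberately unequal standard deviations
n = 1_000_000

# Independent normal draws for alpha and beta.
alpha = rng.normal(0.0, sigma_alpha, n)
beta = rng.normal(0.0, sigma_beta, n)

# Sample covariance of (alpha + beta, alpha - beta) vs. the identity above.
sample_cov = np.cov(alpha + beta, alpha - beta)[0, 1]
print(sample_cov, sigma_alpha**2 - sigma_beta**2)   # both ~ 3.0
```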

Dilip Sarwate