In this old CV post, there is the statement
"(...) I have also shown the transformations to preserve the independence, as the transformation matrix is orthogonal."
It refers to the $k$-dimensional linear transformation $\mathbf y = \mathbf A \mathbf x$, where the (normally distributed) random variables in $\mathbf x$ are assumed independent (the "orthogonal matrix" is $\mathbf A$).
- Does the statement mean that the elements of $\mathbf y$ are jointly independent? If not, what does it mean?
- Does the result hinge on the normality of the $\mathbf x$'s?
- Can somebody provide a proof and/or a literature reference for this result (even if it is restricted to linear transformations of normals)?
Some thoughts: Assume zero means. The variance-covariance matrix of $\mathbf y$ is
$${\rm Var}(\mathbf y) = \mathbf A \mathbf \Sigma \mathbf A'$$
where $\Sigma $ is the diagonal variance-covariance matrix of the $\mathbf x$'s. Now, if the variables in $\mathbf x$ have the same variance, $\sigma^2$, and so $\Sigma = \sigma^2 I$, then
$${\rm Var}(\mathbf y) = \sigma^2 \mathbf A \mathbf A' = \sigma^2 I$$
due to the orthogonality of $\mathbf A$, which gives $\mathbf A \mathbf A' = I$.
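As a quick numerical sanity check of this equal-variance case, here is a minimal sketch (my own, not from the linked post) using numpy, with a randomly generated orthogonal $\mathbf A$ and my own choices of $k$, sample size, and $\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
k, n, sigma2 = 4, 200_000, 2.0

# A random orthogonal matrix: the Q factor of a QR decomposition
A, _ = np.linalg.qr(rng.standard_normal((k, k)))

# Independent normal x's with common variance sigma^2
x = rng.normal(scale=np.sqrt(sigma2), size=(n, k))
y = x @ A.T                      # each row is y = A x

# Sample covariance of y: should be close to sigma^2 * I
print(np.cov(y, rowvar=False).round(2))
```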
If, moreover, the variables in $\mathbf x$ are normally distributed, then $\mathbf y$ is jointly normal (being a linear transformation of a normal vector), and its diagonal variance-covariance matrix is enough to imply joint independence of its elements.
Does the result then hold only in this special case (same variance, normally distributed), or can it be generalized, I wonder... My hunch is that the "same variance" condition cannot be dropped, but that the "normally distributed" condition can be generalized to "any joint distribution where zero covariance implies independence".
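For the "same variance" part of the hunch, the $2 \times 2$ case already seems to settle it (my own example: a rotation by an angle $\theta$, with $\sigma_1^2, \sigma_2^2$ the variances of $x_1, x_2$). Taking $\mathbf A = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$, which is orthogonal,
$${\rm Cov}(y_1, y_2) = a_{11}a_{21}\sigma_1^2 + a_{12}a_{22}\sigma_2^2 = \cos\theta \sin\theta \, (\sigma_1^2 - \sigma_2^2),$$
which vanishes for every $\theta$ only if $\sigma_1^2 = \sigma_2^2$. So with unequal variances the elements of $\mathbf y$ are in general not even uncorrelated, let alone independent.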