
In this old CV post, there is the statement

"(...) I have also shown the transformations to preserve the independence, as the transformation matrix is orthogonal."

It refers to the $k$-dimensional linear transformation $\mathbf y = \mathbf A \mathbf x$ with the (normally distributed) random variables in $\mathbf x$ being assumed independent (the "orthogonal matrix" is $\mathbf A$).

  • Does the statement mean that the elements of $\mathbf y$ are jointly independent? If not, what?
  • Does the result hinge on the normality of the $\mathbf x$'s?
  • Can somebody provide a proof and/or a literature reference for this result (even if it is restricted to linear transformations of normals)?

Some thoughts: Assume zero means. The variance-covariance matrix of $\mathbf y$ is

$${\rm Var}(\mathbf y) = \mathbf A \mathbf \Sigma \mathbf A'$$

where $\Sigma $ is the diagonal variance-covariance matrix of the $\mathbf x$'s. Now, if the variables in $\mathbf x$ have the same variance, $\sigma^2$, and so $\Sigma = \sigma^2 I$, then

$${\rm Var}(\mathbf y) = \sigma^2 \mathbf A \mathbf A' = \sigma^2 I$$

due to orthogonality of $\mathbf A$.
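This collapse of the covariance matrix is easy to verify numerically. A minimal sketch (the rotation angle and the value of $\sigma^2$ are arbitrary illustrative choices, not from the post):

```python
import numpy as np

# A 2x2 rotation matrix is orthogonal: A @ A.T = I.
theta = 0.3  # arbitrary angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

sigma2 = 2.0                # common variance of the x's
Sigma = sigma2 * np.eye(2)  # Var(x) = sigma^2 I

# Var(y) = A Sigma A' reduces to sigma^2 A A' = sigma^2 I.
Var_y = A @ Sigma @ A.T
print(np.allclose(Var_y, sigma2 * np.eye(2)))  # True
```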

If moreover the variables in $\mathbf x$ are normally distributed, then the diagonal variance-covariance matrix of $\mathbf y$ is enough for joint independence.

Does the result then hold only in this special case (same variance, normally distributed), or can it be generalized, I wonder... my hunch is that the "same variance" condition cannot be dropped, but the "normally distributed" condition can be generalized to "any joint distribution where zero covariance implies independence".
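The hunch about the "same variance" condition can be checked directly: with a diagonal but non-scalar $\Sigma$, the same computation $\mathbf A \mathbf \Sigma \mathbf A'$ generally leaves nonzero off-diagonal entries, so the $y$'s are correlated. A minimal numerical illustration (the rotation angle and the two variances are arbitrary choices):

```python
import numpy as np

theta = 0.3  # arbitrary rotation angle; A is orthogonal
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Diagonal but unequal variances: Sigma is not a scalar multiple of I.
Sigma = np.diag([1.0, 4.0])

# Var(y) = A Sigma A' is no longer diagonal.
Var_y = A @ Sigma @ A.T
print(Var_y[0, 1])  # nonzero off-diagonal covariance
```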

Alecos Papadopoulos
  • (1) Are you making a distinction between "jointly independent" and "independent"? (2) Your second and third bullets are answered in many places on this site--a search might help. That's fine, because it narrows your question to the one in the last paragraph. (+1) That sounds remarkably close to assertions made by the [Herschel-Maxwell theorem](http://stats.stackexchange.com/a/24591/919) – whuber Sep 18 '15 at 17:00
    @whuber I think that's the relevant theorem here indeed, thanks. I think I will prepare an answer that details the theorem. It appears to be perhaps the most "natural" characterization of the normal distribution. – Alecos Papadopoulos Sep 18 '15 at 17:20

1 Answer


It generalizes to the case where the variances are not all the same (heteroskedastic). In that case, write $\Sigma = DD'$, where $D$ is a diagonal matrix. You can then eventually reach the conclusion that ${\rm Var}(\mathbf y) = \Sigma$.

This result also holds for any joint distribution in which uncorrelatedness implies independence.