I'm looking to generate correlated random variables, and I have a symmetric, positive definite covariance matrix. I know that you can use the Cholesky decomposition for this, however I keep being told that it only works for Gaussian random variables?! Is that true?
Furthermore, how does this compare to using the eigendecomposition? For example, using the Cholesky decomposition we can write a random parameter as:
$x = \bar{x} + Lz$
where $L$ is the Cholesky factor (a lower triangular matrix such that $LL^T$ is the covariance matrix) and $z$ is a vector of uncorrelated random variables. So one can sample the $z$'s and build up a pdf of $x$.
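To make that concrete, here is a minimal sketch of what I have in mind (NumPy, with a made-up mean and covariance matrix; using Gaussian $z$'s is exactly the part I'm unsure about):

```python
import numpy as np

# Made-up mean and symmetric positive definite covariance matrix, for illustration only
xbar = np.array([10.0, 5.0])
Sigma = np.array([[4.0, 1.2],
                  [1.2, 1.0]])

L = np.linalg.cholesky(Sigma)              # lower triangular, L @ L.T == Sigma

# z: independent draws -- here standard normal, which is exactly what I'm unsure about
z = np.random.standard_normal((2, 100_000))
x = xbar[:, None] + L @ z                  # correlated samples, one column per draw

print(np.cov(x))                           # should be close to Sigma
```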
Now we could also use the eigendecomposition and write $x$ as:

$x = \bar{x} + U\Lambda^{1\over2}z$
where $\Lambda$ is the diagonal matrix of eigenvalues and $U$ is the matrix whose columns are the corresponding eigenvectors. So we could also build up a pdf of $x$ this way. But if we equate these $x$'s we seem to find that $L = U\Lambda^{1\over2}$, which can't be right, since $L$ is triangular and $U\Lambda^{1\over2}$ generally is not. (A numerical sketch of what I mean is below.) So I'm really, really confused.
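Here is the numerical check that has me confused, using the same made-up matrix as above: both factorisations reproduce the covariance matrix, yet the factors themselves are clearly different.

```python
import numpy as np

# Made-up symmetric positive definite matrix, just to compare the two factorisations
Sigma = np.array([[4.0, 1.2],
                  [1.2, 1.0]])

L = np.linalg.cholesky(Sigma)              # triangular square root
eigvals, U = np.linalg.eigh(Sigma)         # Sigma = U @ diag(eigvals) @ U.T
A = U @ np.diag(np.sqrt(eigvals))          # non-triangular square root

print(np.allclose(L @ L.T, Sigma))         # True
print(np.allclose(A @ A.T, Sigma))         # True
print(np.allclose(L, A))                   # False: the factors themselves differ
```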
So, to clarify the questions:

1) For the Cholesky decomposition, does the vector $z$ have to be Gaussian?

2) How does the eigendecomposition compare with the Cholesky decomposition? They are clearly different factorisation techniques, so I don't see how the $x$'s above can be equivalent?
Thanks, as always, guys.