Suppose I have a $d \times n$ data matrix $\mathbf X$ (each of the $n$ data points has $d$ dimensions), and after some manipulation of the data (i.e. summarizing $\mathbf X$) I get its $d \times d$ symmetric, square Pearson correlation matrix $\rho$.
Then, by the definition of the SVD, we know that the matrix $\mathbf X$ can be decomposed into the product of three matrices:
$\mathbf X = \mathbf U \mathbf \Sigma \mathbf V^\top$
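For concreteness, here is a minimal numpy sketch of this decomposition (with hypothetical random data; the sizes $d = 3$, $n = 5$ are arbitrary):

```python
import numpy as np

# Hypothetical data: d = 3 dimensions, n = 5 points.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 5))

# Thin SVD: U is d x d, s holds the d singular values, Vt is d x n.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
print(np.allclose(X, U @ np.diag(s) @ Vt))  # True: X == U Sigma V^T
```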
Hence my question: is it correct to suppose that
$\mathbf X \mathbf X^\top = \rho$
then, doing some algebra (substituting the SVD and using $\mathbf X^\top = \mathbf V \mathbf \Sigma \mathbf U^\top$), we get:
$\rho = (\mathbf U \mathbf \Sigma \mathbf V^\top)(\mathbf V \mathbf \Sigma \mathbf U^\top)$
and, since $\mathbf V$ has orthonormal columns ($\mathbf V^\top \mathbf V = \mathbf I$),
$\mathbf X \mathbf X^\top = \rho = \mathbf U \mathbf \Sigma^2 \mathbf U^\top$
I want to know whether this equivalence is correct, or in which cases it is correct. Can anyone give an explanation?
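As a numerical experiment (a numpy sketch with hypothetical random data; the row-wise standardization in the second check is my assumption about what is needed to relate $\mathbf X \mathbf X^\top$ to $\rho$):

```python
import numpy as np

# Hypothetical data: d = 3 dimensions, n = 100 points.
rng = np.random.default_rng(1)
X = rng.standard_normal((3, 100))
n = X.shape[1]

# The algebraic identity X X^T = U Sigma^2 U^T holds for any X:
U, s, Vt = np.linalg.svd(X, full_matrices=False)
print(np.allclose(X @ X.T, U @ np.diag(s**2) @ U.T))  # True

# Relating X X^T to the Pearson correlation matrix seems to require
# standardizing each row (zero mean, unit variance) and dividing by n:
Z = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
print(np.allclose(Z @ Z.T / n, np.corrcoef(X)))  # True
```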
Note that I am using the correlation matrix instead of the covariance matrix (we know we can get eigenvalues and eigenvectors from a correlation matrix, and that would solve PCA); a sketch of this is below.
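For reference, a sketch of that last point (again numpy with hypothetical data): the eigendecomposition of $\rho$ and the SVD of the standardized data should agree, if the reasoning above is right.

```python
import numpy as np

# Hypothetical data: d = 3 dimensions, n = 100 points.
rng = np.random.default_rng(2)
X = rng.standard_normal((3, 100))
n = X.shape[1]

# PCA route 1: eigendecomposition of the correlation matrix.
rho = np.corrcoef(X)
eigvals, eigvecs = np.linalg.eigh(rho)  # ascending eigenvalue order

# PCA route 2: SVD of the row-standardized data.
Z = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(Z, full_matrices=False)

# Up to ordering (and column signs of the eigenvectors), the eigenvalues
# of rho should equal the squared singular values divided by n:
print(np.allclose(np.sort(s**2 / n), eigvals))  # True
```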