In principal component analysis, why does the symmetric covariance matrix's eigenvector point in the most "significant" direction of the data set?
- What exactly don't you understand? Do you know what an eigenvector is? It's not just *any* eigenvector that gives the "**most** significant direction." It's an eigenvector associated with the largest eigenvalue. Possible duplicate: http://math.stackexchange.com/questions/23596/why-is-the-eigenvector-of-a-covariance-matrix-equal-to-a-principal-component?rq=1 – symplectomorphic May 14 '16 at 01:55
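A minimal numerical sketch of that comment's point (my own illustration, not from the thread): the variance of the data projected onto the eigenvector paired with the largest eigenvalue of the covariance matrix equals that eigenvalue, and it is at least as large as the variance along any other unit direction. The synthetic data and the random comparison direction below are assumptions made only for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D data deliberately stretched along one axis.
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
Xc = X - X.mean(axis=0)                      # center the data

C = np.cov(Xc, rowvar=False)                 # symmetric covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)         # eigh returns ascending eigenvalues

top = eigvecs[:, -1]                         # eigenvector of the largest eigenvalue

# Variance along the top eigenvector matches the largest eigenvalue ...
var_top = np.var(Xc @ top, ddof=1)
print(var_top, eigvals[-1])                  # approximately equal

# ... and exceeds the variance along an arbitrary unit direction.
random_dir = rng.normal(size=2)
random_dir /= np.linalg.norm(random_dir)
print(var_top >= np.var(Xc @ random_dir, ddof=1))   # True
```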
- @symplectomorphic: the answers in your link don't say that it is because the first $k$ eigenvectors are the solution of $\min_P \|X - X P P^T \|^2_F$ subject to $P : n \times k$ (with $X : m \times n$ being the matrix of your $m$ points). – reuns May 14 '16 at 02:06
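A rough numerical check of the minimization claim in that comment (a sketch under my own assumptions: the data are centered before forming the covariance, and the comparison is against random orthonormal $n \times k$ matrices rather than a proof over all $P$). Taking $P$ as the top-$k$ eigenvectors gives the smallest Frobenius reconstruction error among the candidates tried.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 200, 5, 2

# Correlated synthetic data, m points in n dimensions.
X = rng.normal(size=(m, n)) @ rng.normal(size=(n, n))
Xc = X - X.mean(axis=0)                                  # center the data

C = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)                     # ascending eigenvalues
P_pca = eigvecs[:, -k:]                                  # top-k eigenvectors, n x k

def recon_error(P):
    """Frobenius reconstruction error ||Xc - Xc P P^T||_F^2."""
    return np.linalg.norm(Xc - Xc @ P @ P.T, 'fro') ** 2

# Compare against random orthonormal n x k matrices (via QR).
errors = []
for _ in range(100):
    Q, _ = np.linalg.qr(rng.normal(size=(n, k)))
    errors.append(recon_error(Q))

print(recon_error(P_pca) <= min(errors))                 # True
```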