As you may know, the scikit-learn library uses singular value decomposition (SVD) of the data matrix X to produce the eigenvectors for PCA. I decided to implement PCA myself using eigendecomposition of the covariance matrix A, and then compared the eigenvectors the two approaches produce. All the numerical values agree in magnitude, but some components of the eigenvectors are positive in sklearn's output and negative in my PCA, or vice versa.
To illustrate what I mean by opposite directions, here is one eigenvector computed by each method:
Eigendecomposition: [-8.11734515e-06  3.29264421e-03  1.57754708e-04  2.77006684e-05 -9.99994324e-01 -6.40764448e-04  2.05179008e-04 -1.76522227e-04]
SVD:                [ 8.11734515e-06 -3.29264421e-03 -1.57754708e-04 -2.77006684e-05  9.99994324e-01  6.40764448e-04 -2.05179008e-04  1.76522227e-04]
I could not figure out the reason for this. Does anyone know why this happens?
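For reference, here is a minimal sketch of the comparison I am doing (the random X and its shape are placeholders, not my actual data):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))  # placeholder data matrix

# PCA via eigendecomposition of the covariance matrix
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]     # eigh returns ascending eigenvalues
eig_components = eigvecs[:, order].T  # rows are principal axes

# PCA via sklearn (SVD-based)
svd_components = PCA(n_components=8).fit(X).components_

# The axes agree, but only up to a per-component sign flip
for v_eig, v_svd in zip(eig_components, svd_components):
    print(np.allclose(v_eig, v_svd) or np.allclose(v_eig, -v_svd))
```

Every comparison prints True, i.e. each component matches either exactly or with its sign flipped.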