I am comparing the output of the singular value decomposition with the eigendecomposition of a covariance matrix (a symmetric matrix). I expected the eigenvector matrix and the left singular vector matrix of the SVD to be the same, but they are not: I have to reverse the column order and multiply one column by -1 to make them match. Principal components can be computed with either the eigendecomposition or the SVD, so if the two decompositions give different output for the same covariance matrix, won't they also yield different PCA results?
import numpy as np

A = np.array([[ 1,  3,  2],
              [ 4,  8,  2],
              [ 3,  9,  7],
              [22, 11, 17],
              [55, 33, 66]], dtype=float)
A = A - np.mean(A, 0)                      # center the columns
print(np.dot(A.T, A) / (A.shape[0] - 1))   # covariance computed by hand
co = np.cov(A.T)
print(co)                                  # matches the manual computation
#%%
D, UI = np.linalg.eigh(co)                 # eigenvalues in ascending order
aa, bb, cc = np.linalg.svd(co)             # singular values in descending order
print(UI)
print(aa)
UI
array([[-0.15348562, -0.77609875, -0.61164768],
       [-0.85628748,  0.41338399, -0.30965372],
       [ 0.49316722,  0.47621886, -0.72801215]])
aa
array([[-0.61164768,  0.77609875, -0.15348562],
       [-0.30965372, -0.41338399, -0.85628748],
       [-0.72801215, -0.47621886,  0.49316722]])
aa[:,::-1]
array([[-0.15348562,  0.77609875, -0.61164768],
       [-0.85628748, -0.41338399, -0.30965372],
       [ 0.49316722, -0.47621886, -0.72801215]])