
Suppose we are given an $M \times N$ data matrix $D = (x_1, x_2, \cdots, x_M)^{T}$, whose rows are the data points $x_i^{T}$.

Applying the singular value decomposition to $D$ (assuming $M > N$, so the bottom block of $S$ is zero) yields $$D = USV^{T} = (u_1, u_2, \cdots, u_M) \begin{pmatrix} s_1 & & \\ & \ddots & \\ & & s_N \\ \hline & 0 & \end{pmatrix}_{M\times N} (v_1, v_2, \cdots, v_N)^{T}$$

Keeping only the first $d$ singular values, we have

$$D' = (u_1, u_2, \cdots, u_M) \begin{pmatrix} s_1 & & \\ & \ddots & \\ & & s_d \\ \hline & 0 & \end{pmatrix}_{M \times d} = (f_1, f_2, \cdots, f_M)^{T},$$ in which $D'$ is an $M \times d$ matrix whose rows are the $f_i^{T}$.

My question is:

What is the relation between $D'$ and $D$? To be specific, can each $f_i$ be interpreted as a reduced version of $x_i$? If so, where does the effect of the orthogonal matrix $V$ come in?

As I understand it, SVD can also be used in a different way: to tune the "resolution" of a matrix.

Since $D = USV^{T} = \sum_i s_i u_i v_i^{T}$, the more singular values we take into account, the better the matrix is reconstructed.

meTchaikovsky
  • Are you looking for the Eckart–Young theorem? – Sycorax Jul 08 '18 at 02:04
  • @Sycorax I don't know the theorem, so I'm not sure. I just read a paper in which the original data matrix $D$ is transformed into $D'$ using SVD, as I explained above. PCA is then applied to the resulting $D'$. So I think SVD is being used as a dimensionality reduction technique. – meTchaikovsky Jul 08 '18 at 02:09
