Given an $M \times N$ data matrix $D = (x_1, x_2, \cdots, x_M)^{T}$, where each row $x_i^{T}$ is a data point.
Applying singular value decomposition to $D$ yields $$D = USV^{T} = (u_1, u_2, \cdots, u_M) \begin{pmatrix} s_1 & & \\ & \ddots & \\ & & s_N \\ \hline & 0 & \end{pmatrix}_{M\times N} (v_1, v_2, \cdots, v_N)^{T} $$
Keeping only the $d$ largest singular values, we have
$$D' = (u_1, u_2, \cdots, u_M) \begin{pmatrix} s_1 & &\\ & \ddots &\\ & & s_d\\ \hline & 0 \end{pmatrix}_{M \times d} = (f_1, f_2, \cdots, f_M)^{T}$$ in which $D'$ is an $M \times d$ matrix.
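This truncation can be checked numerically. The sketch below (a small random example, with dimensions chosen arbitrarily) forms $D'$ by scaling the first $d$ left singular vectors, and verifies that the same matrix is obtained by projecting the rows of $D$ onto the first $d$ right singular vectors, i.e. $D' = D V_d$:

```python
import numpy as np

# Small illustrative example: M = 5 data points, N = 3 features, keep d = 2.
rng = np.random.default_rng(0)
M, N, d = 5, 3, 2
D = rng.standard_normal((M, N))

# Full SVD: U is M x M, s holds the N singular values (descending), Vt is N x N.
U, s, Vt = np.linalg.svd(D, full_matrices=True)

# D' = U * S_truncated: column j of U scaled by s_j, for j = 1..d (M x d).
D_prime = U[:, :d] * s[:d]

# The same D' results from projecting the rows of D onto the first d
# right singular vectors: D V_d = U S V^T V_d = U_d S_d.
assert np.allclose(D_prime, D @ Vt[:d].T)
```

This identity is exactly where $V$ "hides": each reduced row $f_i$ is the coordinate vector $V_d^{T} x_i$ of $x_i$ in the basis of right singular vectors.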
My question is:
What is the relation between $D'$ and $D$? To be specific, can $f_i$ be interpreted as a reduced version of $x_i$? If so, where does the effect of the orthogonal matrix $V$ come in?
As I understand it, SVD can also be used to tune the "resolution" of a matrix in another way: since $D = USV^{T} = \sum_i u_i s_i v_i^{T}$, the more singular values we take into account, the better the matrix is reconstructed.
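The rank-$k$ reconstruction view can be sketched as follows (again a small random example of my own choosing): as $k$ grows, the Frobenius error of $\sum_{i \le k} u_i s_i v_i^{T}$ is non-increasing, and the full-rank sum recovers $D$ exactly.

```python
import numpy as np

rng = np.random.default_rng(1)
D = rng.standard_normal((6, 4))
U, s, Vt = np.linalg.svd(D, full_matrices=False)

# Rank-k reconstruction: sum of the first k rank-1 terms u_i s_i v_i^T.
errors = []
for k in range(1, len(s) + 1):
    D_k = (U[:, :k] * s[:k]) @ Vt[:k]
    errors.append(np.linalg.norm(D - D_k))

# More singular values -> smaller (or equal) reconstruction error...
assert all(errors[i] >= errors[i + 1] for i in range(len(errors) - 1))
# ...and keeping all of them recovers D up to floating-point error.
assert np.isclose(errors[-1], 0)
```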