Let $X$ be a $14\times5$ matrix of 14 data points with 5 factor values per point. After subtracting the column means from each matrix element in the respective column, we are left with the zero-column-centered matrix $X_0$.
The eigenvectors of the $5\times 5$ matrix $X_0^TX_0$ (which is proportional to the sample covariance matrix) are the principal components, and as you indicate, they are also the column vectors of the $5\times 5$ matrix $V$ arising from the singular value decomposition of the centered matrix, viz., $X_0=U\Sigma V^T.$
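A quick numerical check of that equivalence, using NumPy on some made-up random data (the $14\times5$ shape and the data itself are just for illustration): the eigenvectors of $X_0^TX_0$ agree with the columns of $V$ up to sign and ordering, and the eigenvalues are the squared singular values.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(14, 5))        # 14 data points, 5 factors (illustrative)
X0 = X - X.mean(axis=0)             # zero-column-centered matrix

# Eigendecomposition of the 5x5 matrix X0^T X0
evals, evecs = np.linalg.eigh(X0.T @ X0)

# Singular value decomposition X0 = U Sigma V^T
U, s, Vt = np.linalg.svd(X0, full_matrices=False)
V = Vt.T

# eigh returns ascending eigenvalues; SVD returns descending singular values
order = np.argsort(evals)[::-1]

# Eigenvalues of X0^T X0 are the squared singular values of X0
assert np.allclose(evals[order], s**2)

# Eigenvectors match the columns of V, up to an arbitrary sign flip per column
assert np.allclose(np.abs(evecs[:, order]), np.abs(V))
```

The sign ambiguity is expected: each eigenvector/singular vector is only determined up to a factor of $\pm1$.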
Your data came to you in the original basis, which we make explicit by writing $X_0=X_0^{orig}.$ Now you seek to express the zero-centered $X_0^{orig}$ in the PCA basis as $X_0^{PCA}$. The change of basis is accomplished by
$$
\begin{aligned}
X_0^{PCA} &= (V^T X_0^{orig,T})^T = X_0^{orig} V \\
(14\times5)&= ((5 \times 5) (5 \times 14))^T= (14\times5)(5\times5).\\
\end{aligned}
$$
That is how you project the data matrix onto all five principal components and report the projections in the PCA basis.
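The change of basis above is one matrix product; a minimal NumPy sketch (random data purely for illustration) confirms that $X_0^{orig}V$, its transposed form $(V^TX_0^{orig,T})^T$, and $U\Sigma$ from the SVD all give the same scores:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(14, 5))
X0 = X - X.mean(axis=0)             # zero-column-centered matrix

U, s, Vt = np.linalg.svd(X0, full_matrices=False)
V = Vt.T

# Change of basis: the data expressed in the PCA basis
X0_pca = X0 @ V                     # (14x5)(5x5) -> (14x5)

# Equivalent forms
assert np.allclose(X0_pca, (Vt @ X0.T).T)   # (V^T X0^T)^T
assert np.allclose(X0_pca, U * s)           # U Sigma, since Sigma is diagonal
```

The last identity follows directly from $X_0V = U\Sigma V^TV = U\Sigma$.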
If you want to project the centered matrix $X_0$ onto the first two principal components while leaving the result in the original basis, you simply set the singular values associated with the other components to zero:
$$
\Sigma'_{ii} =
\begin{cases}
\Sigma_{ii}, &i=1,2 \\
0, &i=3,4,5.
\end{cases}
$$
The truncated SVD then delivers the projected data
$$
X'_0 = U\Sigma'V^T.
$$
This reconstructs $X'_0$, the rank-2 approximation in the original basis. To return to the scale of the original data matrix, don't forget to add the column means back!
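Putting the truncation and reconstruction together in NumPy (again on illustrative random data, keeping the first two components):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(14, 5))
mu = X.mean(axis=0)                 # column means, saved for reconstruction
X0 = X - mu

U, s, Vt = np.linalg.svd(X0, full_matrices=False)

# Sigma': keep the first two singular values, zero the rest
s_trunc = np.where(np.arange(5) < 2, s, 0.0)

# Rank-2 projection of the centered data, still in the original basis
X0_proj = U @ np.diag(s_trunc) @ Vt

# Add the column means back to return to the original location of the data
X_proj = X0_proj + mu

assert np.linalg.matrix_rank(X0_proj) == 2
```

Note that the projection of the centered data still has zero column means, so `X_proj` has the same column means as the original `X`.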
This was all nicely discussed in @amoeba's link, which I just noticed. :)