It seems to me that the following is the mathematically simplest way to partial out variables from a correlated set of items.
Consider a correlation matrix R for 5 items, from which we want to partial out the first two variables. This is the initial correlation matrix:
$$ \text{ R =} \small \begin{bmatrix} \begin{array} {rrrrr}
1.00& -0.15& 0.27& 0.53& 0.24\\
-0.15& 1.00& -0.09& -0.50& -0.34\\
0.27& -0.09& 1.00& 0.22& 0.19\\
0.53& -0.50& 0.22& 1.00& 0.47\\
0.24& -0.34& 0.19& 0.47& 1.00 \end{array}
\end{bmatrix} $$
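For concreteness, the example matrix can be entered with numpy (a sketch; the variable name `R` is just my choice):

```python
import numpy as np

# The 5x5 example correlation matrix R from above
R = np.array([
    [ 1.00, -0.15,  0.27,  0.53,  0.24],
    [-0.15,  1.00, -0.09, -0.50, -0.34],
    [ 0.27, -0.09,  1.00,  0.22,  0.19],
    [ 0.53, -0.50,  0.22,  1.00,  0.47],
    [ 0.24, -0.34,  0.19,  0.47,  1.00],
])
```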
Now we want to partial out the first item. We take the vector of correlations of all variables with it; this gives the vector $f_1$ (which is just the first column of R):
$$ f_1 = \small \begin{bmatrix} \begin{array} {r}
1.00\\
-0.15\\
0.27\\
0.53\\
0.24
\end{array} \end{bmatrix}
$$
Then build the matrix $R_1 = f_1 \cdot f_1^\tau$
$$ \text{ R}_1 =\small \begin{bmatrix} \begin{array} {rrrrr}
1.00& -0.15& 0.27& 0.53& 0.24\\
-0.15& 0.02& -0.04& -0.08& -0.04\\
0.27& -0.04& 0.07& 0.14& 0.06\\
0.53& -0.08& 0.14& 0.28& 0.12\\
0.24& -0.04& 0.06& 0.12& 0.06
\end{array} \end{bmatrix}
$$ and subtract this from the original matrix to get $R_{ \; \cdot 1}$
$$ \text{ R}_{\ \cdot 1} =\small \begin{bmatrix} \begin{array} {rrrrr}
0.00& 0.00& 0.00& 0.00& 0.00\\
0.00& 0.98& -0.05& -0.42& -0.30\\
0.00& -0.05& 0.93& 0.07& 0.13\\
0.00& -0.42& 0.07& 0.72& 0.35\\
0.00& -0.30& 0.13& 0.35& 0.94
\end{array}
\end{bmatrix}
$$
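In numpy this first step can be sketched as follows (names are my own; no normalization of $f_1$ is needed here because the diagonal entry $R[1,1]$ is 1):

```python
import numpy as np

R = np.array([
    [ 1.00, -0.15,  0.27,  0.53,  0.24],
    [-0.15,  1.00, -0.09, -0.50, -0.34],
    [ 0.27, -0.09,  1.00,  0.22,  0.19],
    [ 0.53, -0.50,  0.22,  1.00,  0.47],
    [ 0.24, -0.34,  0.19,  0.47,  1.00],
])

f1 = R[:, [0]]       # correlations of all items with item 1, as a column vector
R1 = f1 @ f1.T       # rank-one matrix f1 . f1^T
R_dot1 = R - R1      # residual matrix R_{.1}; row and column 1 are now zero
```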
Now we construct the partial vector $f_{2 \cdot 1}$. First we extract the second column of the remaining covariance matrix $R_{\ \cdot 1}$; call it $c$. In order that $R_{2 \cdot 1} = f_{2 \cdot 1} \cdot f_{2 \cdot 1}^\tau$ reproduces the correct value in row and column 2, we must normalize by the square root of the second entry, defining $f_{2 \cdot 1} = c / \sqrt{ c[2]}$. Thus we get:
$$ f_{2 \cdot 1}= \small \begin{bmatrix} \begin{array} {r}
0.00\\
0.99\\
-0.05\\
-0.42\\
-0.31
\end{array} \end{bmatrix} $$
Then $ \text{ R }_{2 \cdot 1} = f_{2 \cdot 1} \cdot f_{2 \cdot 1}^\tau $ and we find
$$ \text{ R }_{2 \cdot 1} = \small \begin{bmatrix} \begin{array} {rrrrr}
0.00& 0.00& 0.00& 0.00& 0.00\\
0.00& 0.98& -0.05& -0.42& -0.30\\
0.00& -0.05& 0.00& 0.02& 0.01\\
0.00& -0.42& 0.02& 0.18& 0.13\\
0.00& -0.30& 0.01& 0.13& 0.09
\end{array}
\end{bmatrix} $$
and after removing that covariance as well by $ \text{ R }_{ \cdot 12}= \text{ R }_{ \cdot 1}- \text{ R }_{ 2\cdot 1} $ we get
$$ \text{ R }_{ \cdot 12} =\small \begin{bmatrix} \begin{array} {rrrrr}
0.00& 0.00& 0.00& 0.00& 0.00\\
0.00& 0.00& 0.00& 0.00& 0.00\\
0.00& 0.00& 0.93& 0.05& 0.11\\
0.00& 0.00& 0.05& 0.54& 0.22\\
0.00& 0.00& 0.11& 0.22& 0.85
\end{array}
\end{bmatrix} $$
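As a sketch in numpy (my own names again), the second step normalizes the second column of $R_{\ \cdot 1}$ before the rank-one subtraction:

```python
import numpy as np

R = np.array([
    [ 1.00, -0.15,  0.27,  0.53,  0.24],
    [-0.15,  1.00, -0.09, -0.50, -0.34],
    [ 0.27, -0.09,  1.00,  0.22,  0.19],
    [ 0.53, -0.50,  0.22,  1.00,  0.47],
    [ 0.24, -0.34,  0.19,  0.47,  1.00],
])

R_dot1 = R - R[:, [0]] @ R[[0], :]   # first step, as before (R[0,0] = 1)

c = R_dot1[:, [1]]                   # second column of R_{.1}
f21 = c / np.sqrt(R_dot1[1, 1])      # normalize so row/column 2 comes out right
R_dot12 = R_dot1 - f21 @ f21.T       # residual matrix R_{.12}
```

Note that the resulting nonzero block agrees (up to rounding) with the Schur complement of the leading 2x2 block of R, which is the standard closed-form expression for these residual covariances.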
This can be iterated analogously for the next variable(s) to be partialled out. You can then analyze the remaining nonzero block as residual covariances; rescaled to unit diagonal, these are the "partial correlations" when the "partialled-out" variables are, so to say, "held constant".
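The whole iteration can be collected into a small helper (a sketch under my own naming; the final rescaling to unit diagonal is what turns the residual covariances into partial correlations):

```python
import numpy as np

def partial_out(R, idx):
    """Sweep the variables in idx out of the correlation matrix R
    by repeated rank-one subtraction, as described in the text."""
    S = np.array(R, dtype=float)
    for k in idx:
        f = S[:, [k]] / np.sqrt(S[k, k])  # normalized column f_{k.(...)}
        S = S - f @ f.T                   # subtract its rank-one covariance
    return S

R = np.array([
    [ 1.00, -0.15,  0.27,  0.53,  0.24],
    [-0.15,  1.00, -0.09, -0.50, -0.34],
    [ 0.27, -0.09,  1.00,  0.22,  0.19],
    [ 0.53, -0.50,  0.22,  1.00,  0.47],
    [ 0.24, -0.34,  0.19,  0.47,  1.00],
])

S = partial_out(R, [0, 1])                 # residual covariances R_{.12}
keep = [2, 3, 4]                           # the variables not partialled out
d = np.sqrt(np.diag(S)[keep])
P = S[np.ix_(keep, keep)] / np.outer(d, d) # partial correlation matrix
```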