Context
This question emerged while trying to solve Problem 5.1 of Wooldridge, Econometric Analysis of Cross Section and Panel Data.
The problem asks one to show the equivalence of the estimates obtained via a two-stage least squares (2SLS) procedure and those obtained via an OLS regression in which the residuals of the auxiliary (first-stage) regression from the 2SLS are added as regressors.
In attempting to solve the problem, I stumbled upon the question below. I eventually found another way to solve the problem that does not require answering this question, but I am still interested in the question itself. Here it is.
Question
Suppose the column vectors of the matrix $X$ form a basis of some space. Decompose $X$ into two submatrices
$$ X = ( X_1~X_2) .$$
Assume that the vector $y$ is orthogonal to every column of $X_1$. Suppose that we want to project $y$ onto the space orthogonal to the column space of $X$ (i.e., the null space of $X'$). This yields the vector
$$ p =( I - X(X'X)^{-1} X') y.$$
I have a strong geometric intuition that, because $y$ is already orthogonal to the columns of $X_1$, this is the same as projecting $y$ onto the space orthogonal to the column space of $X_2$ (i.e., the null space of $X_2'$). Formally, this would mean
$$ ( I - X(X'X)^{-1} X') y = ( I - X_2(X_2'X_2)^{-1} X_2') y. $$
I found some examples where it works, but I could not find an algebraic proof or a counterexample (I tried decomposing $(X'X)^{-1}$ into blocks depending only on $X_1$ and $X_2$, but to no avail). Is this true? Any idea how to prove or disprove this algebraically?
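For concreteness, here is how I have been generating test cases numerically (a minimal sketch using NumPy; the helper name `annihilator` and the dimensions are my own choices, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

def annihilator(A):
    """Return M_A = I - A (A'A)^{-1} A', the projector onto the
    orthogonal complement of the column space of A."""
    n = A.shape[0]
    return np.eye(n) - A @ np.linalg.solve(A.T @ A, A.T)

n, k1, k2 = 6, 2, 2
X1 = rng.standard_normal((n, k1))
X2 = rng.standard_normal((n, k2))
X = np.hstack([X1, X2])

# Make y orthogonal to col(X1) by residualizing a random vector on X1.
y = annihilator(X1) @ rng.standard_normal(n)

p_full = annihilator(X) @ y   # left-hand side: project off col(X)
p_sub = annihilator(X2) @ y   # right-hand side: project off col(X2) only

# Zero iff the conjectured identity holds for this draw.
print(np.linalg.norm(p_full - p_sub))
```

Both residual vectors satisfy the orthogonality conditions they are supposed to ($X' p_{\text{full}} = 0$ and $X_2' p_{\text{sub}} = 0$), so comparing them directly tests the conjectured identity on random instances.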