Yes, as per the derivation below. (There are still things I am not entirely clear about, e.g., what if each regressor explains more than 50% of the variance: can we then show that the regressors can no longer be uncorrelated, since the $R^2$s evidently cannot add up to more than one? I did play around with higher $R^2$s, such as those resulting from y <- 4*x1.c + 5*x2 + rnorm(n, sd=.01) in the setup below, and this suggests the problem does not occur. See also Thomas Lumley's helpful comment below!)
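For instance, here is a minimal sketch of that experiment; the seed and the larger sample size are my illustrative choices, not part of the original setup:

set.seed(1)                                # illustrative seed, for reproducibility
n    <- 50
x1   <- rnorm(n)
x2   <- rnorm(n)
x1.c <- resid(lm(x1 ~ x2))                 # regressor uncorrelated with x2
y    <- 4*x1.c + 5*x2 + rnorm(n, sd = .01) # both regressors explain much variance
summary(lm(y ~ x1.c))$r.squared + summary(lm(y ~ x2))$r.squared # sum of separate R^2s
summary(lm(y ~ x1.c + x2))$r.squared       # joint R^2 agrees, and cannot exceed 1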
Also, the derivation demonstrates that the result goes through for general multiple regressions.
Let us suppose your regressions of $y$ on $X_1$, on $X_2$, or on both contain a constant (if they do not, a similar result can be established for the uncentered $R^2$).
By the Frisch-Waugh-Lovell (FWL) theorem, this is equivalent to regressing the demeaned dependent variable on the demeaned regressors.
Call these $\tilde y=y-\bar{y}$, $\tilde X_1=X_1-\bar{X}_1$ and $\tilde X_2=X_2-\bar{X}_2$, with suitably defined $\bar{X}_j$ containing the column means of the regressor matrices. In the absence of correlation, we then have $\tilde X_1'\tilde X_2=0$.
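A quick sketch of this equivalence (data and seed are illustrative): the slope from a regression containing a constant equals the slope from regressing the demeaned variables on each other without a constant.

set.seed(2)
n  <- 20
x1 <- rnorm(n)
y  <- 1 + 2*x1 + rnorm(n)
coef(lm(y ~ x1))[2]          # slope with a constant included
y.t  <- y  - mean(y)         # tilde y
x1.t <- x1 - mean(x1)        # tilde X_1
coef(lm(y.t ~ x1.t - 1))     # identical slope, constant dropped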
The formula for $R^2$ is
$$
R^{2}:=1-\frac{e'e}{\tilde{y}'\tilde{y}}
$$
for a residual vector $e$. When regressing $\tilde y$ on, e.g., $\tilde X_1$, we have $e=M_{\tilde X_1}\tilde y$, with $M_{\tilde X_1}=I-\tilde X_1(\tilde X_1'\tilde X_1)^{-1}\tilde X_1'$ the usual residual-maker matrix.
Hence, when adding the $R^2$s of the separate regressions, we have
$$
R_1^2+R_2^2=1-\frac{\tilde y'M_{\tilde X_1}\tilde y}{\tilde{y}'\tilde{y}}+1-\frac{\tilde y'M_{\tilde X_2}\tilde y}{\tilde{y}'\tilde{y}}=\frac{2\tilde y'\tilde y-\tilde y'M_{\tilde X_1}\tilde y-\tilde y'M_{\tilde X_2}\tilde y}{\tilde y'\tilde y}
$$
Using $\tilde y'M_{\tilde X_j}\tilde y=\tilde y'\tilde y-\tilde y'P_{\tilde X_j}\tilde y$, $j=1,2$, we obtain
$$
R_1^2+R_2^2=\frac{\tilde y'P_{\tilde X_1}\tilde y+\tilde y'P_{\tilde X_2}\tilde y}{\tilde y'\tilde y},
$$
with $P_{\tilde X_j}=\tilde X_j(\tilde X_j'\tilde X_j)^{-1}\tilde X_j'$ the corresponding projection matrices.
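This expression can be checked numerically; the following sketch does so with a small helper P() for the projection matrix (the helper and the data are mine, for illustration):

set.seed(3)
n <- 10
x1 <- rnorm(n)
x2 <- rnorm(n)
x1.c <- resid(lm(x1 ~ x2))   # mean zero and orthogonal to x2
y <- rnorm(n)
y.t  <- y - mean(y)
X1.t <- as.matrix(x1.c)      # already demeaned by construction
X2.t <- as.matrix(x2 - mean(x2))
P <- function(X) X %*% solve(crossprod(X)) %*% t(X) # projection onto col(X)
(t(y.t) %*% P(X1.t) %*% y.t + t(y.t) %*% P(X2.t) %*% y.t) / sum(y.t^2)
summary(lm(y ~ x1.c))$r.squared + summary(lm(y ~ x2))$r.squared # same value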
Next, consider the OLS estimator from the joint regression of $\tilde{y}$ on both $\tilde{X}_1$ and $\tilde{X}_2$ when the regressors are orthogonal:
$$
\begin{eqnarray*}
b&=&\left(%
\begin{array}{cc}
\tilde{X}_1'\tilde{X}_1 & 0 \\
0 & \tilde{X}_2'\tilde{X}_2 \\
\end{array}%
\right)^{-1}\left(%
\begin{array}{c}
\tilde{X}_1'\tilde{y} \\
\tilde{X}_2'\tilde{y} \\
\end{array}%
\right)\\
&=&
\left(%
\begin{array}{cc}
(\tilde{X}_1'\tilde{X}_1)^{-1} & 0 \\
0 & (\tilde{X}_2'\tilde{X}_2)^{-1} \\
\end{array}%
\right)\left(%
\begin{array}{c}
\tilde{X}_1'\tilde{y} \\
\tilde{X}_2'\tilde{y} \\
\end{array}%
\right)\\
&=&\left(%
\begin{array}{c}
(\tilde{X}_1'\tilde{X}_1)^{-1}\tilde{X}_1'\tilde{y} \\
(\tilde{X}_2'\tilde{X}_2)^{-1}\tilde{X}_2'\tilde{y} \\
\end{array}%
\right)
\end{eqnarray*}
$$
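In other words, orthogonality makes the joint slopes coincide with the slopes from the separate regressions. A brief check (again with illustrative data and seed):

set.seed(4)
n <- 10
x1 <- rnorm(n)
x2 <- rnorm(n)
x1.c <- resid(lm(x1 ~ x2))   # orthogonal to x2 and mean zero
y <- rnorm(n)
coef(lm(y ~ x1.c + x2))[-1]  # slopes from the joint regression
c(coef(lm(y ~ x1.c))[2], coef(lm(y ~ x2))[2]) # slopes from the separate regressions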
Thus, the residuals are
$$
\tilde{e}=\tilde{y}-(\tilde{X}_1:\tilde{X}_2)b=\tilde{y}-(P_{\tilde{X}_1}\tilde{y}+P_{\tilde{X}_2}\tilde{y}),
$$
so that, using symmetry and idempotency of $P$ as well as $P_{\tilde{X}_1}P_{\tilde{X}_2}=0$ (which follows from $\tilde X_1'\tilde X_2=0$),
$$
\tilde{e}'\tilde{e}=\tilde{y}'\tilde{y}-\tilde{y}'P_{\tilde{X}_1}\tilde{y}-\tilde{y}'P_{\tilde{X}_2}\tilde{y},
$$
so that
$$
R^2=1-\frac{\tilde{e}'\tilde{e}}{\tilde{y}'\tilde{y}}=\frac{\tilde{y}'P_{\tilde{X}_1}\tilde{y}+\tilde{y}'P_{\tilde{X}_2}\tilde{y}}{\tilde{y}'\tilde{y}}=R_1^2+R_2^2.
$$
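As a step-level check of the residual decomposition before the full illustration (illustrative data; P() defined as above):

set.seed(5)
n <- 10
x1 <- rnorm(n)
x2 <- rnorm(n)
x1.c <- resid(lm(x1 ~ x2))
y <- rnorm(n)
e <- resid(lm(y ~ x1.c + x2)) # residuals of the joint regression
y.t  <- y - mean(y)
X1.t <- as.matrix(x1.c)
X2.t <- as.matrix(x2 - mean(x2))
P <- function(X) X %*% solve(crossprod(X)) %*% t(X)
sum(e^2)                      # e'e
sum(y.t^2) - t(y.t) %*% P(X1.t) %*% y.t - t(y.t) %*% P(X2.t) %*% y.t # matches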
Numerical illustration:
n <- 5
x1 <- rnorm(n)
x2 <- rnorm(n)
x1.c <- resid(lm(x1~x2)) # residualize x1 on x2 to get a regressor uncorrelated with x2
y <- rnorm(n)
Output:
> # centered case
> summary(lm(y~x1.c))$r.squared + summary(lm(y~x2))$r.squared
[1] 0.2187793
> summary(lm(y~x1.c+x2))$r.squared
[1] 0.2187793
> # uncentered case
> summary(lm(y~x1.c-1))$r.squared + summary(lm(y~x2-1))$r.squared
[1] 0.1250624
> summary(lm(y~x1.c+x2-1))$r.squared
[1] 0.1250624