Let us first recall the test statistics:
Wald test in the linear regression model
For $\mathcal{W}$ we need an estimator of the southeast block (the block corresponding to $\beta_2$) of the variance-covariance matrix of the coefficients,
\begin{eqnarray}
\widehat{V}_{\mathcal{W}}&=&\left[\mathcal{I}_{22}(\widehat{\theta})-\mathcal{I}_{21}(\widehat{\theta})\mathcal{I}_{11}(\widehat{\theta})^{-1}\mathcal{I}_{12}(\widehat{\theta})\right]^{-1}\notag\\
&=&\left[\frac{1}{n\widehat{\sigma}^2}[X_2'X_2-X_2'X_1(X_1'X_1)^{-1}X_1'X_2]\right]^{-1}\notag\\
&=&n\widehat{\sigma}^2\left[X_2'M_{X_1}X_2\right]^{-1}\label{vw}
\end{eqnarray}
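Here the second line uses the blocks of the average information matrix for $\beta$ in the conditionally normal regression model, evaluated at $\widehat{\theta}$; under normality the information matrix is block diagonal between $\beta$ and $\sigma^2$, so the $\sigma^2$-part plays no role in this quantity. Spelled out for completeness (not an additional assumption),
$$
\frac{X'X}{n\widehat{\sigma}^2}
=\frac{1}{n\widehat{\sigma}^2}\begin{pmatrix}X_1'X_1 & X_1'X_2\\ X_2'X_1 & X_2'X_2\end{pmatrix},
\qquad\text{so that}\qquad
\mathcal{I}_{ij}(\widehat{\theta})=\frac{X_i'X_j}{n\widehat{\sigma}^2}.
$$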
Hence, using the Frisch-Waugh-Lovell (FWL) theorem, $\widehat{\beta}_2=(X_2'M_{X_1}X_2)^{-1}X_2'M_{X_1}y$, which is substituted in the third line below,
\begin{eqnarray*}
\mathcal{W}&=&n\widehat{\beta}_2'\left[n\widehat{\sigma}^2\left[X_2'M_{X_1}X_2\right]^{-1}\right]^{-1}\widehat{\beta}_2\\
&=&\frac{\widehat{\beta}_2'X_2'M_{X_1}X_2\widehat{\beta}_2}{\widehat{\sigma}^2}\\
&=&\frac{y'M_{X_1}X_2(X_2'M_{X_1}X_2)^{-1}X_2'M_{X_1}X_2(X_2'M_{X_1}X_2)^{-1}X_2'M_{X_1}y}{\widehat{\sigma}^2}\\
&=&\frac{y'M_{X_1}X_2(X_2'M_{X_1}X_2)^{-1}X_2'M_{X_1}y}{\widehat{\sigma}^2}\\
&=&\frac{y'P_{M_{X_1}X_2}y}{\widehat{\sigma}^2}\\
&=:&\frac{y'P_{X_{2\bot1}}y}{\widehat{\sigma}^2}\\
&=&n\frac{y'P_{X_{2\bot1}}y}{y'(I-P_{X})y}
\end{eqnarray*}
Notice we use the ML estimator of the error variance, $\widehat{\sigma}^2=\frac{1}{n}y'(I-P_{X})y$, not the unbiased estimator that corrects for degrees of freedom.
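As a quick numerical sanity check (not part of the derivation), the sketch below compares the quadratic-form and projection expressions for $\mathcal{W}$ on simulated data. The data-generating process, dimensions, and variable names are illustrative assumptions only; numpy is used throughout.

```python
import numpy as np

# Illustrative simulated data (beta_2 = 0, so H0 holds); all choices here are arbitrary.
rng = np.random.default_rng(0)
n, k1, k2 = 200, 3, 2
X1 = np.column_stack([np.ones(n), rng.normal(size=(n, k1 - 1))])
X2 = rng.normal(size=(n, k2))
X = np.hstack([X1, X2])
y = X1 @ np.ones(k1) + rng.normal(size=n)

def proj(A):
    """Orthogonal projection matrix onto the column space of A."""
    return A @ np.linalg.solve(A.T @ A, A.T)

M_X1 = np.eye(n) - proj(X1)
beta2_hat = np.linalg.solve(X.T @ X, X.T @ y)[k1:]   # unrestricted estimate of beta_2
sigma2_hat = y @ (np.eye(n) - proj(X)) @ y / n       # ML error variance (no df correction)

W_quad = beta2_hat @ (X2.T @ M_X1 @ X2) @ beta2_hat / sigma2_hat  # quadratic form in beta_2
W_proj = y @ proj(M_X1 @ X2) @ y / sigma2_hat                     # projection form derived above
print(np.isclose(W_quad, W_proj))                    # expect: True
```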
Score test in the linear regression model
For the score statistic we need the average score with respect to $\beta_2$, evaluated at the restricted estimator $\widehat{\theta}_R$:
\begin{eqnarray}
E_n[L_{\theta_2}(\widehat{\theta}_R)]&=&\frac{1}{\widehat{\sigma}^2_Rn}X'_2(y-X\widehat{\beta}_R)\notag\\
&=&\frac{1}{\widehat{\sigma}^2_Rn}X'_2(y-X_1\widehat{\beta}_{R1}-X_2\cdot 0)\notag\\
&=&\frac{1}{\widehat{\sigma}^2_Rn}X'_2(y-X_1\widehat{\beta}_{R1})\notag\\
&=&\frac{1}{\widehat{\sigma}^2_Rn}X'_2M_{X_1}y\label{scorelinreg2}
\end{eqnarray}
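The last equality uses the restricted ML/OLS estimator of $\beta_1$, i.e. the coefficient vector from regressing $y$ on $X_1$ alone:
$$
\widehat{\beta}_{R1}=(X_1'X_1)^{-1}X_1'y
\quad\Longrightarrow\quad
y-X_1\widehat{\beta}_{R1}=\left[I-X_1(X_1'X_1)^{-1}X_1'\right]y=M_{X_1}y.
$$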
For the estimated variance of the score we obtain, analogously to the Wald case,
\begin{equation}\label{scorevar}
\widehat{V}_{\mathcal{S}}=\frac{X_2'M_{X_1}X_2}{n\widehat{\sigma}^2_R}
\end{equation}
Putting these two expressions together, the score test statistic becomes
\begin{eqnarray}
\mathcal{S}&=&n\frac{1}{\widehat{\sigma}^2_Rn}y'M_{X_1}X_2\widehat{\sigma}^2_Rn[X_2'M_{X_1}X_2]^{-1}\frac{1}{\widehat{\sigma}^2_Rn}X'_2M_{X_1}y\notag\\
&=&y'M_{X_1}X_2[X_2'M_{X_1}X_2]^{-1}\frac{1}{\widehat{\sigma}^2_R}X'_2M_{X_1}y\notag\\
&=&\frac{y'P_{X_{2\bot1}}y}{\widehat{\sigma}^2_R}\notag\\
&=&n\frac{y'P_{X_{2\bot1}}y}{y'(I-P_{X_1})y},\label{scorelinreg3}
\end{eqnarray}
where the last line follows from the definition of the estimated restricted error variance, $\widehat{\sigma}^2_R=e_R'e_R/n$, together with
$$e_R'e_R=y'M_{X_1}y\quad\text{ and }\quad
M_{X_1}=I-X_1(X_1'X_1)^{-1}X_1'.$$
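A numerical counterpart of the Wald check above: the sketch below verifies, on the same kind of illustrative simulated data, that the quadratic form in the average score equals the projection expression for $\mathcal{S}$ just derived (the setup and names are hypothetical).

```python
import numpy as np

# Illustrative simulated data; the setup mirrors the earlier sketch.
rng = np.random.default_rng(0)
n, k1, k2 = 200, 3, 2
X1 = np.column_stack([np.ones(n), rng.normal(size=(n, k1 - 1))])
X2 = rng.normal(size=(n, k2))
y = X1 @ np.ones(k1) + rng.normal(size=n)

def proj(A):
    """Orthogonal projection matrix onto the column space of A."""
    return A @ np.linalg.solve(A.T @ A, A.T)

M_X1 = np.eye(n) - proj(X1)
sigma2_R = y @ M_X1 @ y / n                          # restricted ML error variance

score = X2.T @ M_X1 @ y / (n * sigma2_R)             # average score w.r.t. beta_2 at theta_R
V_S = X2.T @ M_X1 @ X2 / (n * sigma2_R)              # estimated variance of the score

S_quad = n * score @ np.linalg.solve(V_S, score)     # quadratic form in the score
S_proj = n * (y @ proj(M_X1 @ X2) @ y) / (y @ M_X1 @ y)  # projection form derived above
print(np.isclose(S_quad, S_proj))                    # expect: True
```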
Likelihood ratio test in the linear regression model
Inserting the restricted and unrestricted estimators into the sample log-likelihood yields, using
\begin{eqnarray*}
E_n[L(\widehat{\theta})]&=&-\frac{1}{2}\log\left(2\pi\frac{(y-X\widehat{\beta})'(y-X\widehat{\beta})}{n}\right)-\frac{(y-X\widehat{\beta})'(y-X\widehat{\beta})/n}{2(y-X\widehat{\beta})'(y-X\widehat{\beta})/n}\\
&=&-\frac{1}{2}\left[\log\left(2\pi\frac{(y-X\widehat{\beta})'(y-X\widehat{\beta})}{n}\right)+1\right],
\end{eqnarray*}
and analogously for $E_n[L(\widehat{\theta}_R)]$, the following expression for the $\mathcal{L}\mathcal{R}$-test statistic:
\begin{eqnarray}\mathcal{L}\mathcal{R}&=&-n\left\{\log\left[\frac{2\pi(y-X\widehat{\beta})'(y-X\widehat{\beta})}{n}\right]+1\right\}\notag\\&&
+n\,\left\{\log\left[\frac{2\pi(y-X\widehat{\beta}_R)'(y-X\widehat{\beta}_R)}{n}\right]+1\right\}\notag\\
&=&n\log\left[\frac{(y-X\widehat{\beta}_R)'(y-X\widehat{\beta}_R)}{(y-X\widehat{\beta})'(y-X\widehat{\beta})}\right]\label{lrlinregml}
\end{eqnarray}
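As with the other two statistics, the equality between twice the log-likelihood difference and $n$ times the log ratio of residual sums of squares can be confirmed numerically; the sketch below again uses a purely illustrative simulated setup.

```python
import numpy as np

# Illustrative simulated data, as in the earlier sketches.
rng = np.random.default_rng(0)
n, k1, k2 = 200, 3, 2
X1 = np.column_stack([np.ones(n), rng.normal(size=(n, k1 - 1))])
X2 = rng.normal(size=(n, k2))
X = np.hstack([X1, X2])
y = X1 @ np.ones(k1) + rng.normal(size=n)

def rss(Z):
    """Residual sum of squares from regressing y on Z."""
    e = y - Z @ np.linalg.solve(Z.T @ Z, Z.T @ y)
    return e @ e

def avg_loglik(Z):
    """Maximized average Gaussian log-likelihood with sigma^2 profiled out."""
    return -0.5 * (np.log(2 * np.pi * rss(Z) / n) + 1)

LR_direct = 2 * n * (avg_loglik(X) - avg_loglik(X1))   # 2 * (unrestricted - restricted) log-likelihood
LR_rss = n * np.log(rss(X1) / rss(X))                  # expression derived above
print(np.isclose(LR_direct, LR_rss))                   # expect: True
```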
Theorem:
The classical test statistics for $$H_0:\beta_{02}=0$$ satisfy $$\mathcal{W}\geqslant\mathcal{L}\mathcal{R}\geqslant\mathcal{S}$$ in the conditionally normal linear regression model.
Proof:
As an intermediate result, we show that the test statistics can be written as follows.
\begin{eqnarray}
\mathcal{S}&=&n\frac{y'(I-P_{X_1})y-y'(I-P_{X})y}{y'(I-P_{X_1})y}\label{scoreproj}\\
\mathcal{L}\mathcal{R}&=&n\log\frac{y'(I-P_{X_1})y}{y'(I-P_{X})y}\label{lrproj}\\
\mathcal{W}&=&n\frac{y'(I-P_{X_1})y-y'(I-P_{X})y}{y'(I-P_{X})y}\label{waldproj}
\end{eqnarray}
The numerator of the score test statistic is obtained as follows. We first show that
$$
P_{X}=P_{X_1}+P_{X_{2\bot1}},
$$
since any partition of $X$, $$X=(X_{A}\vdots X_{B}),$$ into mutually orthogonal blocks $X_{A}$, $X_{B}$ ($X_{A}'X_{B}=0$) satisfies
$$
P_{X}=P_{A}+P_{B},
$$
because orthogonality makes $X'X$ block diagonal, so that $P_{X}=X(X'X)^{-1}X'=X_{A}(X_{A}'X_{A})^{-1}X_{A}'+X_{B}(X_{B}'X_{B})^{-1}X_{B}'$.
We can apply this result to $X_1$ and $X_{2\bot1}:=M_{X_1}X_2$, since $X_{2\bot1}'X_1=0$ and $(X_1\vdots X_{2\bot1})$ spans the same column space as $X$. Hence, $$P_{X_{2\bot1}}=P_{X}-P_{X_1}.$$ Adding and subtracting $y'Iy$ in the numerator $y'P_{X_{2\bot1}}y$ of the score statistic derived above yields the numerator of $\mathcal{S}$ in the theorem. The Wald statistic follows completely analogously, with the corresponding estimator of the error variance. Finally, for the likelihood ratio statistic, the numerator inside the logarithm is the restricted residual sum of squares $y'(I-P_{X_1})y$, i.e. the denominator of the score statistic, and the denominator inside the logarithm is the unrestricted residual sum of squares $y'(I-P_{X})y$, i.e. the denominator of the Wald statistic.
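A minimal numerical illustration of this projection decomposition (dimensions and data are arbitrary and purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k1, k2 = 50, 3, 2
X1 = rng.normal(size=(n, k1))
X2 = rng.normal(size=(n, k2))

def proj(A):
    """Orthogonal projection matrix onto the column space of A."""
    return A @ np.linalg.solve(A.T @ A, A.T)

P_X1 = proj(X1)
X2_perp1 = (np.eye(n) - P_X1) @ X2                # X_{2 perp 1} = M_{X1} X2
P_X = proj(np.hstack([X1, X2]))

# Decomposition used in the proof: P_X = P_{X1} + P_{X_{2 perp 1}}
print(np.allclose(P_X, P_X1 + proj(X2_perp1)))    # expect: True
```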
The claim now follows from the bound $$\log x\leqslant x-1.$$ Applying it with
$$x:=\frac{y'(I-P_{X_1})y}{y'(I-P_{X})y}$$ gives
$$\mathcal{L}\mathcal{R}/n=\log x\leqslant x-1=\mathcal{W}/n,\quad\text{hence}\quad\mathcal{W}\geqslant\mathcal{L}\mathcal{R}.$$
The bound can also be written as $$1-x\leqslant -\log x.$$ Let
$$
x:=\frac{y'(I-P_{X})y}{y'(I-P_{X_1})y}
$$
Then,
\begin{eqnarray*}
\frac{\mathcal{S}}{n}=1-x&\leqslant&-\log\left[\frac{y'(I-P_{X})y}{y'(I-P_{X_1})y}\right]\\
&=&\log\left[\frac{y'(I-P_{X})y}{y'(I-P_{X_1})y}\right]^{-1}\\
&=&\log\left[\frac{y'(I-P_{X_1})y}{y'(I-P_{X})y}\right]\\
&=&\frac{\mathcal{L}\mathcal{R}}{n}
\end{eqnarray*}
and hence $\mathcal{L}\mathcal{R}\geqslant\mathcal{S}$, which completes the proof.
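To see the theorem at work, the sketch below computes all three statistics from their projection representations on illustrative simulated data and checks the ordering, together with the exact links $\mathcal{W}/n=e^{\mathcal{L}\mathcal{R}/n}-1$ and $\mathcal{S}/n=1-e^{-\mathcal{L}\mathcal{R}/n}$ that follow directly from those representations. The setup and variable names are assumptions for illustration, not part of the argument above.

```python
import numpy as np

# Illustrative simulated data; any design with X = (X1, X2) of full column rank works here.
rng = np.random.default_rng(0)
n, k1, k2 = 200, 3, 2
X1 = np.column_stack([np.ones(n), rng.normal(size=(n, k1 - 1))])
X2 = rng.normal(size=(n, k2))
X = np.hstack([X1, X2])
y = X1 @ np.ones(k1) + rng.normal(size=n)

def proj(A):
    """Orthogonal projection matrix onto the column space of A."""
    return A @ np.linalg.solve(A.T @ A, A.T)

rss_U = y @ (np.eye(n) - proj(X)) @ y      # y'(I - P_X)y
rss_R = y @ (np.eye(n) - proj(X1)) @ y     # y'(I - P_{X1})y

W  = n * (rss_R - rss_U) / rss_U
LR = n * np.log(rss_R / rss_U)
S  = n * (rss_R - rss_U) / rss_R

print(W >= LR >= S)                                      # expect: True
print(np.isclose(W, n * (np.exp(LR / n) - 1)),
      np.isclose(S, n * (1 - np.exp(-LR / n))))          # expect: True True
```

The ordering holds in every sample, not merely on average, which is exactly what the proof above establishes.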