Consider the regression equations below:
\begin{align} Y_i &= \beta_0 + \beta_1 X_{i1} + \varepsilon_i \\ Y_j &= \beta_0 + \beta_1 X_{j1} + \beta_2 X_{j2} + \varepsilon_j \end{align}
where $Y_i$, $X_{i1}$, $\varepsilon_i$, $Y_j$, $X_{j1}$, $X_{j2}$, and $\varepsilon_j$ are vectors, and the subscripts $i$ and $j$ index distinct sets of observations. The $i$ respondents did not meet a qualification criterion and hence were not asked the question that corresponds to $X_2$.
The dependent variable and the first independent variable are the same in both regression equations, but the second regression equation has an independent variable that is not present in the first. Obviously, I can estimate the two regressions separately, but that will not be efficient. Therefore, I was considering re-writing the first one as:
$$ Y_i = \beta_0 + \beta_1 X_{i1} + \beta_2 X_{i2} + \varepsilon_i $$
where $X_{i2}$ is a vector of $0$s.
Then I can obtain the parameter estimates by applying OLS to the stacked equation below:
$$\left[ \begin{array}{c} Y_i\\ Y_j\end{array} \right] = \left[ \begin{array}{ccc} {\bf 1} & X_{i1} & X_{i2} \\ {\bf 1} & X_{j1} & X_{j2}\end{array} \right] \left[ \begin{array}{c} \beta_0\\ \beta_1\\ \beta_2\end{array} \right] + \left[ \begin{array}{c} \varepsilon_i\\ \varepsilon_j\end{array} \right]$$
In the above equation, ${\bf 1}$ stands for a vector of $1$s of the appropriate dimension.
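For concreteness, here is a minimal numerical sketch of the stacked estimation I have in mind, using simulated data with arbitrary true coefficients (the group sizes, coefficient values, and noise scale are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical group sizes and true coefficients, chosen for illustration.
n_i, n_j = 50, 80
beta0, beta1, beta2 = 1.0, 2.0, -0.5

# Group i: only X1 observed; X2 is structurally zero for these respondents.
X_i1 = rng.normal(size=n_i)
Y_i = beta0 + beta1 * X_i1 + rng.normal(scale=0.1, size=n_i)

# Group j: both X1 and X2 observed.
X_j1 = rng.normal(size=n_j)
X_j2 = rng.normal(size=n_j)
Y_j = beta0 + beta1 * X_j1 + beta2 * X_j2 + rng.normal(scale=0.1, size=n_j)

# Stacked design matrix: intercept column, X1, and X2 (zeros for group i),
# matching the block matrix in the question.
X = np.column_stack([
    np.ones(n_i + n_j),
    np.concatenate([X_i1, X_j1]),
    np.concatenate([np.zeros(n_i), X_j2]),
])
Y = np.concatenate([Y_i, Y_j])

# A single pooled OLS fit over both groups.
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta_hat)
```

With the zero-filled column, the group-$i$ rows contribute nothing to the estimation of $\beta_2$, so the fit for $\beta_2$ is driven entirely by the group-$j$ observations, while both groups inform $\beta_0$ and $\beta_1$.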
Is the above a standard approach to obtaining efficient estimates? Is there a name to this way of estimation?