Background
There are various rules for linear combinations of integrated series. Let's consider just the $I(0)$ and $I(1)$ cases. For example, if $x_{t} \sim I(1)$ and $y_{t} \sim I(0)$, then $ax_{t} + by_{t} \sim I(1)$; here $I(1)$ acts as the dominant property. Similarly, if $x_{t}$ and $y_{t}$ are both $I(1)$, then in general $ax_{t} + by_{t} \sim I(1)$, unless the series are cointegrated, in which case there exists an $I(0)$ linear combination. These rules imply that a stationary variable cannot be explained by a nonstationary variable, and vice versa. For this reason, one often hears of balanced and unbalanced regressions: balanced meaning that the variables on each side of the regression are either all $I(0)$ or all $I(1)$, and unbalanced meaning that the variables are a mixture of $I(0)$ and $I(1)$.
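To make the dominance rule concrete, here is a minimal simulation (my own illustration, not part of the original question) using `numpy` and the ADF unit-root test from `statsmodels`. The specific coefficients and seed are arbitrary assumptions; the point is only that the combination of an $I(1)$ and an $I(0)$ series inherits the unit root:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller  # ADF unit-root test

rng = np.random.default_rng(0)
T = 1000

x = np.cumsum(rng.normal(size=T))   # I(1): a random walk
z = rng.normal(size=T)              # I(0): white noise
combo = 2.0 * x + 3.0 * z           # arbitrary linear combination

# ADF null hypothesis: the series has a unit root.
for name, s in [("x, I(1)", x), ("z, I(0)", z), ("2x + 3z", combo)]:
    pval = adfuller(s)[1]
    print(f"{name:8s}  ADF p-value: {pval:.3f}")

# Expected pattern: large p-values for x and for the combination
# (unit root not rejected), small p-value for z (consistent with
# stationarity) -- i.e., I(1) dominates the combination.
```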
Given that the above covers the bivariate case rather well, I'm curious how the theory applies when there are more than two variables. For example, is it appropriate to estimate an unbalanced regression such as $$ y_{t} = \alpha + \beta_{1}x_{t} + \beta_{2}z_{t} + \epsilon_{t} $$ where $y_{t} \sim I(1)$, $x_{t} \sim I(1)$, $z_{t} \sim I(0)$, and $\epsilon_{t}$ is a suitably defined error term?
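For reference, here is a sketch of one data-generating process consistent with this setup (again my own assumed construction, not from the question): $y_{t}$ and $x_{t}$ are cointegrated, $z_{t}$ is stationary, and the regression is estimated by OLS with `statsmodels`:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
T = 1000

x = np.cumsum(rng.normal(size=T))   # I(1) random walk
z = rng.normal(size=T)              # I(0) white noise

# Construct y so that (y, x) are cointegrated:
# y - 1.5*x is stationary by design.
y = 1.5 * x + 0.5 * z + rng.normal(size=T)

# OLS of y on a constant, x, and z.
X = sm.add_constant(np.column_stack([x, z]))
res = sm.OLS(y, X).fit()
print(res.params)   # estimates of (alpha, beta1, beta2)

# Under this construction the residual
# y - alpha - beta1*x - beta2*z is I(0), so the regression can be
# "balanced" in the relevant sense despite mixing I(1) and I(0)
# regressors. Whether this is sound in general is the question below.
```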
Question
Is the inclusion of the $I(0)$ variable in the above regression sound from a time-series perspective? Or does it depend on the possible linear combinations of the variables in the regression, e.g., if $y_{t}$ and $x_{t}$ are cointegrated, does the $I(0)$ variable fit into the regression without trouble?