My intuition was that if an explanatory variable is independent of the response, then its $\beta$ in a multiple regression should be zero.
Consider, however, the following very simple example: the distribution of $\left(Y,X_1,X_2\right)$ is multivariate normal, with mean vector $\mathbf{0}$ and covariance matrix $$\begin{pmatrix}10&2&0\\2&5&1\\0&1&1/2\end{pmatrix}.$$ Here the regression coefficients are $$\boldsymbol{\beta}=\begin{pmatrix}2&0\end{pmatrix}\begin{pmatrix}5&1\\1&1/2\end{pmatrix}^{-1}=\begin{pmatrix}\frac{2}{3}&-\frac{4}{3}\end{pmatrix},$$ i.e., $X_2$ has a non-zero $\beta$ despite being uncorrelated with $Y$, and hence, by joint normality, independent of $Y$.
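For a quick numerical sanity check (just a sketch; the sample size and variable names are my own choices), one can compute the population coefficients directly and confirm them by OLS on a large simulated sample:

```python
import numpy as np

rng = np.random.default_rng(0)

# Covariance of (Y, X1, X2) from the example above
Sigma = np.array([[10.0, 2.0, 0.0],
                  [ 2.0, 5.0, 1.0],
                  [ 0.0, 1.0, 0.5]])

# Population coefficients: beta = Sigma_{Y,X} Sigma_{X,X}^{-1}
beta_pop = Sigma[0, 1:] @ np.linalg.inv(Sigma[1:, 1:])
print(beta_pop)                      # [ 0.6667 -1.3333]

# Monte Carlo check: OLS (no intercept, since all means are zero)
data = rng.multivariate_normal(np.zeros(3), Sigma, size=500_000)
y, X = data[:, 0], data[:, 1:]
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_ols)                      # close to [ 0.6667 -1.3333]
```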
How can I imagine this?
I understand that this is a multivariate situation, so pairwise correlations are not conclusive on their own (the full multivariate structure matters), but I thought that for a multivariate normal, a zero between the response and an explanatory variable in the covariance matrix (with all variables included in the regression) would simply mean that the corresponding $\beta$ has to be zero.
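Writing out the standard two-predictor formula for $\beta_2$ in terms of the covariances makes the mechanism explicit (plugging in the numbers from the example above):

$$\beta_2=\frac{\operatorname{Cov}(Y,X_2)\operatorname{Var}(X_1)-\operatorname{Cov}(Y,X_1)\operatorname{Cov}(X_1,X_2)}{\operatorname{Var}(X_1)\operatorname{Var}(X_2)-\operatorname{Cov}(X_1,X_2)^2}=\frac{0\cdot 5-2\cdot 1}{5\cdot\frac{1}{2}-1^2}=-\frac{4}{3},$$

so $\beta_2$ vanishes only when $\operatorname{Cov}(Y,X_2)\operatorname{Var}(X_1)=\operatorname{Cov}(Y,X_1)\operatorname{Cov}(X_1,X_2)$, not merely when $\operatorname{Cov}(Y,X_2)=0$.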
Corollary question: if my intuition is not correct, is the following statement true instead: "In a multivariate normal model, a $\beta$ is zero iff the variable is uncorrelated with the response and also uncorrelated with all the remaining explanatory variables"?
That would be interesting, because it would mean that of the two conditions under which omitted variable bias does not occur (the omitted variable has a zero $\beta$, or it is uncorrelated with all the other explanatory variables), the first actually implies the second (in a multivariate normal model, of course).
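As a small check of that connection (again just a sketch, reusing the covariance matrix from the example): in the population, regressing $Y$ on $X_1$ alone gives $\operatorname{Cov}(Y,X_1)/\operatorname{Var}(X_1)=2/5$, which equals $\beta_1+\beta_2\,\operatorname{Cov}(X_1,X_2)/\operatorname{Var}(X_1)$; the bias term vanishes precisely when $\beta_2=0$ or $\operatorname{Cov}(X_1,X_2)=0$:

```python
import numpy as np

Sigma = np.array([[10.0, 2.0, 0.0],
                  [ 2.0, 5.0, 1.0],
                  [ 0.0, 1.0, 0.5]])

# Long regression: Y on both X1 and X2 (population coefficients)
beta_full = Sigma[0, 1:] @ np.linalg.inv(Sigma[1:, 1:])    # [2/3, -4/3]

# Short regression: Y on X1 alone
beta_short = Sigma[0, 1] / Sigma[1, 1]                      # 2/5

# Omitted-variable-bias identity: short = beta1 + beta2 * Cov(X1, X2) / Var(X1)
bias = beta_full[1] * Sigma[1, 2] / Sigma[1, 1]
print(beta_short, beta_full[0] + bias)                      # 0.4 0.4
```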