This is in fact called multivariate regression. I just think that it's not a commonly-used term because it's not a commonly-used model. Note also that it is very easy to confuse with "multiple linear regression."
The reason why multivariate regression is so relatively unpopular (and not implemented explicitly in any major statistical package) is actually buried in the comments to one of the answers to Multivariate linear regression in R, which I'll repeat here:
User603's answer is correct. Given a model $Y=XB+E$ and assuming $E \sim N(0,\Sigma)$ (so the covariance matrix need not be diagonal), the maximum likelihood estimator for $B$ is simply $B_{OLS}=(X^TX)^{-1}X^TY$, which amounts to performing separate ordinary least squares estimates for each of the $q$ response variables and does not depend on $\Sigma$. ($\Sigma$ sometimes appears as $\Omega^{-1}$ in the literature, $\Omega$ being the precision matrix.)
This is why `lm` is designed that way: the coefficient estimates are actually equivalent.
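This equivalence is easy to check numerically. Below is a minimal sketch (in Python/numpy rather than R, with made-up data) comparing the joint estimator $(X^TX)^{-1}X^TY$ against column-by-column OLS fits:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 100, 3, 2                       # observations, predictors, responses
X = rng.normal(size=(n, p))
B_true = rng.normal(size=(p, q))
Y = X @ B_true + rng.normal(size=(n, q))  # multivariate response matrix

# Joint multivariate OLS: B = (X'X)^{-1} X'Y, all responses at once
B_joint = np.linalg.solve(X.T @ X, X.T @ Y)

# Separate OLS fits, one response column at a time
B_sep = np.column_stack([np.linalg.lstsq(X, Y[:, j], rcond=None)[0]
                         for j in range(q)])

print(np.allclose(B_joint, B_sep))  # True: identical point estimates
```

The two estimates agree to machine precision, which is exactly why fitting the responses jointly buys you nothing at the point-estimate stage.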
Therefore differences arise only when you're conducting statistical tests on the parameter estimates using their theoretical standard errors (as opposed to, say, bootstrapping), or when you're estimating the distribution of model predictions (which, with an improper flat prior, are equivalent to posterior predictions).
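To make that concrete, here is a hedged numpy sketch of where $\Sigma$ actually enters: under the multivariate normal model, the covariance of $\mathrm{vec}(B_{OLS})$ is $\Sigma \otimes (X^TX)^{-1}$, and its off-diagonal blocks (the cross-response covariances) are what separate per-response fits ignore. The data here is simulated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, q = 100, 3, 2
X = rng.normal(size=(n, p))
Y = X @ rng.normal(size=(p, q)) + rng.normal(size=(n, q))

B = np.linalg.solve(X.T @ X, X.T @ Y)
E = Y - X @ B                       # residual matrix, n x q

# Unbiased estimate of the error covariance across responses
Sigma_hat = E.T @ E / (n - p)

# Covariance of vec(B) (columns stacked): kron(Sigma, (X'X)^{-1}).
# The diagonal gives the usual per-coefficient variances; the
# off-diagonal blocks capture dependence between responses.
XtX_inv = np.linalg.inv(X.T @ X)
cov_vecB = np.kron(Sigma_hat, XtX_inv)

# Standard errors per (response, coefficient) pair
se = np.sqrt(np.diag(cov_vecB)).reshape(q, p)
```

The diagonal of `cov_vecB` reproduces what the separate fits would report; the cross-response blocks are the extra structure a joint test (or joint predictive distribution) would use.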
If you're interested in making use of the correlation structure in the dependent variables, Breiman and Friedman (1997) [1] have a very interesting paper in which they develop something they call the "Curds & Whey" procedure for improving prediction accuracy in multivariate linear regression problems.
I also have some personal experience with these kinds of models, and it was unpleasant and mostly unfruitful. I attempted to fit one directly in Stan by specifying a multivariate normal error distribution for each data point. I didn't have a damn clue what I was doing at the time, and I kept layering extensions onto the model, so it turned into a mess that wouldn't converge and I dropped it entirely. However, I think there is some merit to the basic idea, and I'd be tempted to try it again at some point.
[1]: Breiman, L. and Friedman, J. H. (1997). Predicting multivariate responses in multiple linear regression. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 59(1), 3-54. Available (gated) at: http://onlinelibrary.wiley.com/doi/10.1111/1467-9868.00054/pdf. Available (free) at: you know where to look