Assume you have a regression with two regressors (plus a constant),
$$y_i = a + b_1x_{1i}+b_2x_{2i} + u_i$$
Then, if you work through the normal equations (a bit tedious), you will find that
$$\hat b_1 = \frac {\operatorname {\hat Var}(X_2)\cdot \operatorname {\hat Cov}(Y,X_1) - \operatorname {\hat Cov}(X_1,X_2)\cdot \operatorname {\hat Cov}(Y,X_2)}{\operatorname {\hat Var}(X_1)\cdot\operatorname {\hat Var}(X_2)\cdot [1-\hat \rho_{1,2}^2]} $$
where the hats denote sample variances and covariances (computed without the bias-correction factor), and $\hat \rho_{1,2}$ is the sample correlation coefficient between the two regressors.
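For the record, the "tedious" step: after centering all variables, the normal equations for $(\hat b_1, \hat b_2)$ reduce to the $2\times 2$ system
$$\hat b_1\operatorname {\hat Var}(X_1) + \hat b_2\operatorname {\hat Cov}(X_1,X_2) = \operatorname {\hat Cov}(Y,X_1)$$
$$\hat b_1\operatorname {\hat Cov}(X_1,X_2) + \hat b_2\operatorname {\hat Var}(X_2) = \operatorname {\hat Cov}(Y,X_2)$$
and solving it (e.g. by Cramer's rule) gives the expression above; note that the determinant of this system is exactly the denominator.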
The denominator is positive whenever the regressors are not perfectly collinear, so the sign of the estimated coefficient is the sign of the numerator. Then if you have (for example)
$$0 < \operatorname {\hat Cov}(Y,X_1) < \frac {\operatorname {\hat Cov}(X_1,X_2)\cdot \operatorname {\hat Cov}(Y,X_2)}{\operatorname {\hat Var}(X_2)}$$
which is perfectly possible, then the dependent variable and regressor $X_1$ are positively correlated pair-wise, while the coefficient of this regressor in the multiple regression is negative. In other words, if one examines the dependent variable and $X_1$ alone, they tend to move together (i.e. a simple regression will produce a positive slope coefficient), but once regressor $X_2$ is present, the marginal effect of $X_1$ on the dependent variable emerges as negative. This is an instance of the famous "sign reversal paradox", which is not really a paradox.

Intuition (for this case)? If $X_2$ correlates strongly and positively with both the dependent variable and $X_1$, then in the simple regression the apparent positive relation between $Y$ and $X_1$ is really due to the underlying effect of $X_2$, which is absent from the specification. When $X_2$ enters the specification, it takes on this positive effect and "reveals" that the "pure" effect of $X_1$ on the dependent variable is, after all, negative. Note that the simple regression here would constitute a case of "omitted-variable bias" in the estimation, since $X_2$ does belong to the specification.
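To see the reversal happen numerically, here is a minimal self-contained sketch (not part of the original argument; the data-generating process, coefficients, and variable names are all illustrative): $X_2$ is built to drive both $Y$ and $X_1$ positively while the "pure" effect of $X_1$ on $Y$ is negative, so the inequality above holds in the sample and the sign flips between the simple and the multiple regression.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# X2 drives both Y and X1 positively, while the "pure" effect of X1 on Y
# is negative (-1.0). All coefficients here are illustrative.
x2 = rng.normal(size=n)
x1 = 0.9 * x2 + 0.3 * rng.normal(size=n)   # X1 strongly positively correlated with X2
y = 1.0 - 1.0 * x1 + 2.0 * x2 + rng.normal(size=n)

# Sample moments. np.cov uses the bias-corrected 1/(n-1) factor, while the
# hats above divide by n; the common factor cancels in the ratios below.
S = np.cov(x1, x2)
var1, var2, cov12 = S[0, 0], S[1, 1], S[0, 1]
cov_y1 = np.cov(y, x1)[0, 1]
cov_y2 = np.cov(y, x2)[0, 1]
rho12_sq = cov12**2 / (var1 * var2)

# The inequality above: 0 < Cov(Y,X1) < Cov(X1,X2)*Cov(Y,X2)/Var(X2)
print(0 < cov_y1 < cov12 * cov_y2 / var2)          # True for this draw

# b1 via the covariance formula, and via least squares with a constant.
b1_formula = (var2 * cov_y1 - cov12 * cov_y2) / (var1 * var2 * (1 - rho12_sq))
X = np.column_stack([np.ones(n), x1, x2])
b1_ols = np.linalg.lstsq(X, y, rcond=None)[0][1]

# Simple-regression slope of Y on X1 alone: Cov(Y,X1)/Var(X1).
b1_simple = cov_y1 / var1

print(f"simple regression slope: {b1_simple:+.3f}")   # positive (~ +1.0)
print(f"multiple regression b1:  {b1_formula:+.3f}")  # negative (~ -1.0)
print(f"lstsq cross-check:       {b1_ols:+.3f}")      # agrees with the formula
```

The cross-check against `np.linalg.lstsq` confirms that the covariance formula and the usual least-squares fit of the full specification give the same $\hat b_1$.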
In some fields, $X_2$ is called a "confounder". Analogous results of course hold for the other coefficient, and for more than two regressors.
See also: Positive correlation and negative regressor coefficient sign.