You should not base conclusions about regression coefficient estimates on the predictive performance of your model. A model can have a low $R^{2}$ and still produce an unbiased estimate of the relationship you are studying.
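Here is a minimal simulation sketch of that point (NumPy, with made-up numbers): the true slope on $X_{1}$ is 2, but the noise term is large, so $R^{2}$ is low even though the OLS slope estimate is centered on the true value.

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_slope, n_sims = 500, 2.0, 2000

slopes, r2s = [], []
for _ in range(n_sims):
    x1 = rng.normal(size=n)
    # noise dwarfs the signal, so R^2 will be low
    y = true_slope * x1 + rng.normal(scale=10.0, size=n)
    X = np.column_stack([np.ones(n), x1])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    slopes.append(beta[1])
    r2s.append(1 - resid.var() / y.var())

print(f"mean slope estimate: {np.mean(slopes):.3f} (true = {true_slope})")
print(f"mean R^2: {np.mean(r2s):.3f}")  # small, yet the slope is unbiased
```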
Whether a coefficient estimate is unbiased and represents the "true" relationship between $X_{1}$ and $Y$ depends on whether your model controls for all confounding factors, that is, variables correlated with both $Y$ and $X_{1}$. For example, years of work experience, years of schooling, and seniority of job position are all confounders of the relationship between age and income.
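To illustrate the confounding point (the data-generating process and variable names like `experience` are invented for the example): here income depends on both age and experience, and experience is correlated with age, so omitting it biases the age coefficient upward.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
age = rng.normal(40, 10, size=n)
experience = 0.8 * age + rng.normal(size=n)                 # confounder: tied to age
income = 1.0 * age + 2.0 * experience + rng.normal(scale=5, size=n)

def ols_slope(predictors, y):
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    return np.linalg.lstsq(X, y, rcond=None)[0][1]          # coefficient on age

print("age coefficient, confounder omitted :", round(float(ols_slope([age], income)), 2))
print("age coefficient, confounder included:", round(float(ols_slope([age, experience], income)), 2))
# omitted  -> roughly 1 + 2*0.8 = 2.6 (biased)
# included -> roughly 1.0 (unbiased)
```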
On the other hand, if you wanted to build a regression that predicts $Y$ well, it would be important to include variables correlated with $Y$ even if they are not correlated with $X_{1}$. With the income example, if you predicted income as a function of only the four variables mentioned above (age plus the three confounders), you would likely obtain a low $R^{2}$ because those variables do not explain all of the variation in income. The country or state you live in, the industry you work in, and even race and gender are good predictors of income, but those variables are probably not correlated with age, so they are not needed in a model whose goal is causal inference about the relationship between age and income.
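A sketch in the same spirit (again with invented names and numbers): `industry_effect` stands in for a predictor of income that is unrelated to age, so including it raises $R^{2}$ substantially while leaving the age coefficient essentially unchanged.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
age = rng.normal(40, 10, size=n)
industry_effect = rng.normal(scale=20, size=n)              # predicts income, unrelated to age
income = 1.0 * age + industry_effect + rng.normal(scale=5, size=n)

def fit(predictors, y):
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta[1], 1 - resid.var() / y.var()               # (age coefficient, R^2)

for label, preds in [("age only", [age]), ("age + industry_effect", [age, industry_effect])]:
    slope, r2 = fit(preds, income)
    print(f"{label:22s} slope on age = {slope:.2f}, R^2 = {r2:.2f}")
# both fits recover a slope near 1.0; only the R^2 changes
```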
(Also, $R^{2}$ and predictive performance depend not just on which predictors you include but also on the functional form of your outcome/target variable, e.g., income in levels versus log income.)
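A quick sketch of that parenthetical, using simulated log-normal income (the numbers are arbitrary): the same data yield different $R^{2}$ values depending on whether the outcome is modelled in levels or in logs.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
age = rng.normal(40, 10, size=n)
log_income = 0.05 * age + rng.normal(scale=0.5, size=n)
income = np.exp(log_income)

def r2(x, y):
    X = np.column_stack([np.ones(len(y)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1 - (y - X @ beta).var() / y.var()

# the two R^2 values differ even though the underlying data are identical
print("R^2, income in levels:", round(float(r2(age, income)), 3))
print("R^2, income in logs  :", round(float(r2(age, np.log(income))), 3))
```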