This is somewhat related to a question I posed earlier, "Different usage of the term 'Bias' in stats/machine learning," regarding the various usages of "bias."
I was asked the following question a couple of months ago:
In simple linear regression, we have $Y = WX + b + e$ where $e$ is the standard normal error. What effect does $e$ have on the bias of the model? What if instead we have $Y = W(X + e) + b$?
When I was asked this question, I assumed they meant "bias" in the sense of the bias-variance tradeoff.
I know that the (squared) bias of an estimate is defined as
$$ \left( E[\hat{f}(x)] - f(x) \right)^2 $$
where $f$ is the true, unobserved model and $\hat{f}$ is the model fitted by linear regression. But $f$ and $\hat{f}$ don't seem to be affected by the unobserved error $e$, so it seems that in both situations the unobserved error has no effect on the bias?
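For what it's worth, here is a minimal Monte Carlo sketch I could use to sanity-check this reasoning (Python/NumPy; the true values $W = 2$, $b = 1$, the sample size, and the number of repetitions are arbitrary choices of mine, not part of the original question). It fits OLS on the observed $X$ in both setups and averages the fitted slope $\hat{W}$ over many repetitions:

```python
import numpy as np

rng = np.random.default_rng(0)
W, b = 2.0, 1.0            # true parameters (arbitrary for this sketch)
n, n_trials = 200, 2000    # sample size and Monte Carlo repetitions

def ols_slope(x, y):
    """Slope of a simple least-squares fit of y on x (with intercept)."""
    X = np.column_stack([x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[0]

slopes_case1, slopes_case2 = [], []
for _ in range(n_trials):
    x = rng.normal(size=n)
    e = rng.normal(size=n)    # standard normal error, as in the question
    y1 = W * x + b + e        # case 1: Y = WX + b + e
    y2 = W * (x + e) + b      # case 2: Y = W(X + e) + b, regressing on observed x
    slopes_case1.append(ols_slope(x, y1))
    slopes_case2.append(ols_slope(x, y2))

# The gap between the mean fitted slope and W approximates the bias in each case
print("case 1: mean slope =", np.mean(slopes_case1))
print("case 2: mean slope =", np.mean(slopes_case2))
```

Note that the second model can be rewritten as $Y = WX + b + We$, i.e., additive noise scaled by $W$, so if my reasoning above is right, both averages should come out close to $W$.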