In the frequentist paradigm, regression analysis in its most general form is given by:
$$y_i=E(y_i|X)+\epsilon_i$$
where $E(y_i|X)$ denotes the conditional expectation of $y_i$ given $X$, and $X$ is some set of explanatory variables.
As far as I understand, this formula rests on the fact that if we have a relation $$y_i=h(X)+\epsilon_i$$ then the conditional expectation is the minimizer of the expected squared error: $$E(y_i|X)=\arg \min_h E\left[(y_i-h(X))^2\right]$$
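To illustrate the claim numerically, here is a small simulation (my own sketch, not part of the question's setup): I assume a toy relation $y = x^2 + \epsilon$ and check that predicting with the true conditional mean gives a smaller mean squared error than alternative predictors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = x^2 + noise, so the true conditional mean E[y|x] is x^2.
x = rng.uniform(-2, 2, size=100_000)
y = x**2 + rng.normal(scale=1.0, size=x.size)

def mse(pred):
    """Mean squared error of a predictor evaluated on the sample."""
    return np.mean((y - pred) ** 2)

print(mse(x**2))                       # conditional mean: close to the noise variance 1.0
print(mse(x**2 + 0.5))                 # shifted predictor: larger MSE
print(mse(np.full_like(x, y.mean())))  # constant predictor: larger still
```

The conditional mean attains (approximately) the noise variance, which is the smallest achievable expected squared error; any other predictor does worse.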
However, I think Bayesians would prefer not to think in terms of expectations, since these are not fundamental, and would instead work with $$P(y|X)$$ which they would then analyse using Bayes' theorem and related machinery. It seems to me that the whole concept of an "error" as used in regression is alien to Bayesians. Am I right?
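To make the Bayesian object concrete, here is a hedged sketch of what I mean by "analysing $P(y|X)$ via Bayes' theorem" (the model and numbers are my own assumptions): a grid posterior over the slope $b$ of a toy linear model $y = bx + \epsilon$ with known noise standard deviation 1 and a flat prior on the grid.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data from y = 2x + noise (noise sd = 1, assumed known).
x = rng.normal(size=50)
y = 2.0 * x + rng.normal(size=50)

# Grid of candidate slopes with a flat prior.
b_grid = np.linspace(0.0, 4.0, 401)
log_prior = np.zeros_like(b_grid)

# Gaussian log-likelihood of the data for each candidate slope.
log_lik = np.array([-0.5 * np.sum((y - b * x) ** 2) for b in b_grid])

# Bayes' theorem on the grid: posterior ∝ prior × likelihood.
log_post = log_prior + log_lik
post = np.exp(log_post - log_post.max())
post /= post.sum()

b_mean = np.sum(b_grid * post)  # posterior mean of the slope
print(b_mean)                   # near the true slope 2.0
```

The point being that the Bayesian output is a whole distribution over the unknowns (and hence over $y$ given $X$), rather than a decomposition into a conditional mean plus an error term.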
So what do Bayesians think about the equation $y_i=E(y_i|X)+\epsilon_i$?