I've got a problem where
$$y = a + b $$
I observe y, but neither $a$ nor $b$. I want to estimate
$$b = f(x) + \epsilon$$
I can estimate $a$ using some sort of regression model, which gives $\hat a$ and hence $\hat b = y - \hat a$. I could then fit
$$\hat b = f(x) + \epsilon$$
First problem: a regression model to predict $a$ could lead to $\hat b = y - \hat a$ being negative, which wouldn't make any sense. I'm not sure how to get around this (it's not the sort of problem I've dealt with much), but it seems like the kind of thing others handle routinely. Some sort of non-Gaussian GLM?
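To make the negativity issue concrete, here is a toy simulation (Python/numpy for brevity; the logic carries over to R). Everything below is made up for illustration: both components are generated positive, and the first-stage estimate $\hat a$ is simulated as a noisy multiple of the true $a$ rather than coming from an actual fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
w = rng.normal(size=n)                        # covariate driving a (assumed)
x = rng.normal(size=n)                        # covariate driving b
a = np.exp(0.5 + 0.8 * w)                     # positive latent component
b = np.exp(0.2 + 0.5 * x)                     # positive latent component, b = f(x)
y = a + b                                     # only y (plus w, x) is observed

# Stand-in for stage 1: a_hat is a noisy estimate of a (e.g. from a model
# fit on auxiliary data where a is observed -- an assumption for this sketch)
a_hat = a * np.exp(rng.normal(scale=0.3, size=n))
b_hat = y - a_hat                             # plug-in estimate of b

# Even though a and b are both positive, the plug-in b_hat can go negative
# whenever a_hat overshoots y
frac_negative = (b_hat < 0).mean()
print(frac_negative)
```

So the negativity is not a quirk of one model: any stage-1 estimator whose error can push $\hat a$ above $y$ produces negative $\hat b$ for some observations.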
The main problem is how to account, in the second model, for the uncertainty that comes from estimating $\hat b$. I've used multiple imputation (MI) before for missing covariates, but this is a missing "latent" quantity. Alternatively, it can be viewed as outcome data, which seems fine to impute. However, I often hear of EM being used for latent parameters; I'm not sure why, nor whether EM is any better in these contexts. MI is intuitive to understand, implement, and communicate. EM is intuitive to understand but seems harder to implement (and I haven't done it).
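The MI approach I have in mind would look something like the following sketch (Python/numpy for brevity). The per-observation stage-1 uncertainty `sd_a` is a placeholder assumption, not something the question pins down: each "imputation" draws a plausible $\hat b$, refits the stage-2 regression, and the estimates are pooled with Rubin's rules.

```python
import numpy as np

rng = np.random.default_rng(1)
n, M = 500, 50                                 # sample size, number of imputations

x = rng.normal(size=n)
w = rng.normal(size=n)
a = np.exp(0.5 + 0.8 * w)                      # unobserved in practice
b = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=n)   # true stage-2 model
y = a + b                                      # observed

# Stand-in for stage 1: a_hat plus a (hypothetical) per-observation noise sd
a_hat = a * np.exp(rng.normal(scale=0.1, size=n))
sd_a = 0.1 * a_hat                             # assumed stage-1 uncertainty

X = np.column_stack([np.ones(n), x])
slopes, variances = [], []
for m in range(M):
    # draw one plausible version of b, refit the stage-2 OLS
    b_imp = y - (a_hat + rng.normal(scale=sd_a))
    beta, *_ = np.linalg.lstsq(X, b_imp, rcond=None)
    resid = b_imp - X @ beta
    sigma2 = resid @ resid / (n - 2)
    slopes.append(beta[1])
    variances.append(sigma2 / ((x - x.mean()) ** 2).sum())

qbar = np.mean(slopes)                         # pooled slope estimate
W = np.mean(variances)                         # within-imputation variance
Bv = np.var(slopes, ddof=1)                    # between-imputation variance
T = W + (1 + 1 / M) * Bv                       # Rubin's total variance
print(qbar, np.sqrt(T))
```

The pooled standard error $\sqrt{T}$ exceeds the naive within-imputation one precisely because of the between-imputation term, which is the stage-1 uncertainty I'm worried about losing with a plug-in $\hat b$.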
First, is EM superior for the sort of problem I've got above, and if so, why? Second, how does one implement it in R for a linear model, or for a semiparametric (GAM) model?