One of the key OLS assumptions is strict exogeneity, i.e. $E(\epsilon_i \mid X) = 0, \forall i$. I'm interested in testing this assumption empirically, particularly in the context of time series.
Strict exogeneity is known to rarely hold in time series, but assuming the model is otherwise well specified (e.g. it contains no lagged dependent variables), is there a formal test (in R) that could indicate there is a problem? What is the usual approach to testing this assumption?
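For concreteness, here is a minimal sketch of the kind of situation I have in mind (simulated data; the variable names, the ad-hoc instrument `z`, and the use of the Durbin-Wu-Hausman diagnostic from `AER::ivreg` are just my assumptions about what the usual route might be, not a claim that this is the correct test):

```r
library(AER)  # for ivreg(); its summary() reports a Wu-Hausman diagnostic

set.seed(1)
n <- 200
z <- rnorm(n)                  # candidate instrument (assumed exogenous)
u <- rnorm(n)                  # common shock that creates endogeneity
x <- z + u + rnorm(n)          # regressor correlated with the error term
y <- 1 + 2 * x + u + rnorm(n)  # error contains u, so E(eps | x) != 0

ols <- lm(y ~ x)               # OLS is inconsistent here

# One approach I have seen mentioned: Durbin-Wu-Hausman via an IV regression,
# which requires at least one instrument correlated with x but not with the error.
iv <- ivreg(y ~ x | z)
summary(iv, diagnostics = TRUE)  # includes a "Wu-Hausman" test among the diagnostics
```

Is something along these lines (which presupposes a valid instrument) the standard approach, or is there a test that works without instruments?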