3

One of the important OLS assumptions is strict exogeneity, i.e. $E(\epsilon_i \mid X) = 0 \ \forall i$. I'm interested in testing this assumption empirically, particularly in the context of time series.

It is known that strict exogeneity rarely holds in time series, but assuming the model is otherwise well specified, e.g. contains no lagged dependent variables, is there a formal test (in R) that could indicate there is a problem? What is the usual approach to testing this assumption?

dwolfeu
johnny

1 Answer

4

To test for any kind of exogeneity, you would have to show that there is no variable in the world that is correlated with both your outcome and any of your included regressors. The variables you would worry about are usually left out of the model precisely because you don't have data on them, which means you can't test the proposition directly.

There are some tests for exogeneity (e.g., the Hausman test), but they require strong assumptions and additional data. Even then, I'm not especially enamored with them.
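For what it's worth, here is a minimal sketch of how a Hausman-type (Durbin-Wu-Hausman) test is often run in R, assuming you can find an instrument `z` that is correlated with the suspect regressor but not with the error, which is itself a strong and untestable assumption. The data are simulated and the variable names are purely illustrative.

```r
library(AER)  # provides ivreg() and its Wu-Hausman diagnostic

set.seed(1)
n <- 200
z <- rnorm(n)                   # instrument: correlated with x, not with the error
u <- rnorm(n)                   # unobserved confounder
x <- z + u + rnorm(n)           # regressor made endogenous by u
y <- 1 + 2 * x + u + rnorm(n)   # outcome; the error effectively contains u
dat <- data.frame(y, x, z)

# IV regression of y on x, instrumenting x with z
fit <- ivreg(y ~ x | z, data = dat)

# diagnostics = TRUE prints, among other things, the Wu-Hausman test:
# a small p-value rejects exogeneity of x (OLS and IV estimates differ)
summary(fit, diagnostics = TRUE)
```

Note that the test only works as well as the instrument does: if `z` is itself correlated with the error, the diagnostic is uninformative.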

Charlie
  • Thanks for your reply. So that means exogeneity basically has to be justified by a theoretical argument? Which implies it should very rarely be assumed in time series. – johnny Mar 07 '12 at 09:37
  • Yes, it has to be justified theoretically, but it is still an assumption that is made frequently. You just need to understand that it's a limitation of your analysis and should consider whether there are any obvious reasons why it would not hold. – Charlie Mar 07 '12 at 14:48
  • My understanding is that with this kind of assumption you try your best to prove it false by looking for correlations (see the sketch below). After all, $E[\epsilon X] = E[E[\epsilon X \mid X]] = E[X\,E[\epsilon \mid X]] = 0$ and $E[\epsilon] = E[E[\epsilon \mid X]] = 0$, so strict exogeneity implies nothing is correlated with the error. – Daniel Parry Sep 24 '14 at 17:34
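Following up on the last comment, here is a hypothetical falsification check along those lines, assuming you happen to observe a candidate confounder `w` that was left out of the model: under strict exogeneity the errors are uncorrelated with it, so a clear correlation between the OLS residuals and `w` is evidence against the assumption. Again, the data and names are simulated and illustrative.

```r
# Falsification check: correlate OLS residuals with an observed
# candidate confounder w that is not in the model.
set.seed(2)
n <- 200
u <- rnorm(n)                   # unobserved driver of both x and y
x <- u + rnorm(n)
y <- 1 + 2 * x + u + rnorm(n)
w <- u + rnorm(n)               # observed proxy for the confounder

ols <- lm(y ~ x)
cor.test(resid(ols), w)         # a small p-value suggests endogeneity
```

Of course, this only helps for variables you actually observe; as the answer notes, the troublesome ones are usually exactly those you don't.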