I am currently investigating the relationship between S&P 500 spot and futures prices. The series contain daily closing prices for the past four years. Residuals from a regression of the form Spot ~ Futures test as stationary (the Engle-Granger step), and the Johansen procedure also confirms at least one cointegrating relationship. I then constructed a VECM, but I find that when I include too many lags (more than 4) the error-correction term (ECT) becomes insignificant, so the VECM essentially collapses to a VAR. I also fitted a VAR to the data, which yields a higher R^2 than the VECM. Granger-causality tests within the VAR are strongly significant in both directions, so one cannot conclude that only one series drives the other (which means that both are endogenous, am I correct?).
So my questions are:
Do you have any idea why the ECT becomes more and more insignificant the more lags I include? (It becomes insignificant after 3 lags when I use two-step OLS (2OLS) as the estimation method, and after 5 lags when using ML.)
Should I then prefer the VAR over the VECM, even though I am interested in the long-run relationship (which becomes insignificant once I include more than 5 lags)?
Any help here is highly appreciated. I also hope my question is not too trivial; I am not very experienced with time series analysis.
Thanks for your support!