
I have a few variables, $A$, $B$, $C$, $D$, and $E$. To find their cointegration coefficients, $A$ is regressed against $B$, $C$, $D$, and $E$: $$ A = W_b B + W_c C + W_d D + W_e E + W_0 $$ (where $W_b$, $W_c$, $W_d$, $W_e$ are the cointegration coefficients and $W_0$ is the intercept). The residuals are then tested for stationarity.
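For concreteness, here is a minimal sketch of that two-step procedure (regress, then test the residuals for a unit root), assuming Python with statsmodels; the simulated series and coefficients below are purely illustrative, not my actual data.

```python
# A minimal sketch of the two-step procedure described above, on simulated
# I(1) data (series, coefficients, and seed are illustrative only).
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
n = 500

# B, C, D, E are random walks; A is a linear combination plus stationary
# noise, so a cointegrating relation exists by construction.
B, C, D, E = (np.cumsum(rng.normal(size=n)) for _ in range(4))
A = 0.5 * B - 1.0 * C + 2.0 * D + 0.3 * E + rng.normal(scale=0.5, size=n)

design = sm.add_constant(np.column_stack([B, C, D, E]))  # intercept plus B..E
fit = sm.OLS(A, design).fit()
print("estimated [W_0, W_b, W_c, W_d, W_e]:", fit.params)

# Test the residuals for a unit root. The standard ADF p-values are not
# strictly valid for residuals of an estimated cointegrating regression;
# statsmodels.tsa.stattools.coint packages the same idea with the
# appropriate Engle-Granger critical values.
adf_stat, pvalue, *_ = adfuller(fit.resid)
print("ADF statistic on residuals:", adf_stat, "p-value:", pvalue)
```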

The problem is that the cointegration coefficients are sensitive to the ordering of the variables. For example, if $B$ is regressed against $A$, $C$, $D$, and $E$, I get a totally different set of coefficients.
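To see the sensitivity in the simplest case, the sketch below reduces the problem to two simulated cointegrated series and fits both normalizations, $A$ on $B$ and $B$ on $A$ (Python with statsmodels assumed; the data are illustrative). The two slopes are reciprocal only when the fit is perfect, since their product equals the squared correlation between the series, which anticipates the points made in the comments below.

```python
# Two simulated cointegrated random walks: regressing A on B and B on A
# gives slopes that are reciprocal only when the fit is perfect; their
# product equals the squared correlation between A and B.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
B = np.cumsum(rng.normal(size=n))            # random walk
A = 2.0 * B + rng.normal(scale=3.0, size=n)  # cointegrated with B, noisy

slope_A_on_B = sm.OLS(A, sm.add_constant(B)).fit().params[1]
slope_B_on_A = sm.OLS(B, sm.add_constant(A)).fit().params[1]

print("slope of A on B:        ", slope_A_on_B)
print("1 / slope of B on A:    ", 1.0 / slope_B_on_A)
print("product of the slopes:  ", slope_A_on_B * slope_B_on_A)
print("squared correlation r^2:", np.corrcoef(A, B)[0, 1] ** 2)
```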

How do I solve this problem?

Can I create a new variable, $X$, which has a constant value (either 0 or nonzero), and regress it against $A$, $B$, $C$, $D$, $E$? $$ X = W_a A + W_b B + W_c C + W_d D + W_e E + W_0 $$ Or should $X$ be a trending vector $(1, 2, 3, 4, 5, 6, 7, 8, \dots)$?

Can the introduction of an artificial variable solve my problem, or will it create more problems, such as spurious regression?
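For intuition, here is a rough numerical check of the constant-response idea, again assuming Python with statsmodels and simulated random walks (all names and values are illustrative):

```python
# A rough numerical check of the constant-response idea, on simulated
# random walks (names and values are illustrative only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400
A, B, C, D, E = (np.cumsum(rng.normal(size=n)) for _ in range(5))

X = np.full(n, 5.0)                             # the proposed constant "variable"
design = sm.add_constant(np.column_stack([A, B, C, D, E]))
fit = sm.OLS(X, design).fit()

print("slopes W_a..W_e:", fit.params[1:])       # essentially zero
print("intercept W_0:  ", fit.params[0])        # absorbs the constant, ~5.0
print("residuals ~ 0:  ", np.allclose(fit.resid, 0.0))
# The intercept absorbs the constant exactly, the slopes collapse to zero,
# and the residuals are identically zero, so there is nothing left to test
# for stationarity. Using a deterministic trend as the response instead is
# the classic setup in which spurious-regression problems arise.
```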

  • So far as I can see, this is not about the *order* of the variables. $A=W_bB+W_cC+\cdots$ and $A=W_cC+W_bB+\cdots$ are different orderings. You're talking about using a different *response variable*. Although regressing A on B, & B on A may look similar, they are not. I discussed this issue [here](http://stats.stackexchange.com/questions/22718/what-is-the-difference-between-doing-linear-regression-on-y-with-x-versus-x-with/22721#22721). – gung - Reinstate Monica Jul 27 '12 at 03:45
  • Gung's point is easy to recognize when you compare simple regressions: in one case you regress B on A, in the other A on B. If the noise component is very small you would expect one coefficient to be almost the reciprocal of the other, but even with a moderate noise component this will not be the case. Remember that implicit in least squares is the assumption that the IV(s) are measured without error and all the error is an additive error in the DV. If you address the error-in-variables problem you will get different coefficients than OLS. – Michael R. Chernick Jul 27 '12 at 08:52
  • @gung yes, you are right. I'm talking about regressing A on B vs regressing B on A. The results are different. Can I solve this problem by introducing an artificial variable $X$ as a response variable and regressing $X$ on $A$ and $B$? $X = W_a A + W_b B + W_0$. – user1437139 Jul 27 '12 at 16:07
  • Not that I know of; they're apples & oranges. If you haven't yet, you should read the answer I linked to before. You need to decide which of those is the question you want to answer & go w/ that 1. I should also say that I know relatively less about time-series, but if I remember correctly, cointegration is related to vector autoregression in which several models are fit *simultaneously*. That may be closer to the question you want to answer, but someone else will have to explain how these things are related to each other. – gung - Reinstate Monica Jul 27 '12 at 16:21
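If I understand the last comment's pointer to vector autoregression correctly, the standard system counterpart is the Johansen procedure, which treats all the series symmetrically rather than singling one out as the response. A hedged sketch, assuming Python with statsmodels and simulated data (the `det_order` and `k_ar_diff` settings are illustrative choices, not a recommendation):

```python
# A hedged sketch of the system approach the last comment alludes to: the
# Johansen procedure estimates cointegrating relations from a VECM and
# treats A..E symmetrically, so no series is singled out as the response.
# Data are simulated; det_order and k_ar_diff are illustrative choices.
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(3)
n = 500
B, C, D, E = (np.cumsum(rng.normal(size=n)) for _ in range(4))
A = 0.5 * B - 1.0 * C + 2.0 * D + 0.3 * E + rng.normal(scale=0.5, size=n)
data = np.column_stack([A, B, C, D, E])

res = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:", res.lr1)             # compare against res.cvt
print("first cointegrating vector:", res.evec[:, 0])
# The cointegrating vectors are identified only up to scale; dividing the
# first vector by its A-component gives coefficients comparable to the
# single-equation regression, but the estimation itself does not depend on
# which variable is written on the left.
```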
