
I am trying to find a general expression for the mean and variance of a stationary VAR model, starting with the VAR(1) case, but I am stuck and cannot find it in the literature. Can anyone help me?

allo

2 Answers


Taking the variance of both sides of the equation $$ y_t = \nu + A_1 y_{t-1} + u_t $$ leads to $$ \operatorname{Var}y_t = A_1\operatorname{Var}y_{t-1}A_1^T+\Sigma_u. $$ Stationarity implies that $\operatorname{Var}y_t =\operatorname{Var}y_{t-1}=\Gamma_0$, so you need to solve the matrix equation $$ \Gamma_0 = A_1\Gamma_0 A_1^T+\Sigma_u. $$ Applying the vec-operator, this can be rewritten (see Wikipedia) as $$ \operatorname{vec}\Gamma_0 = (A_1\otimes A_1) \operatorname{vec}\Gamma_0 + \operatorname{vec}\Sigma_u $$ and solved for the unknown covariances using standard methods, giving $$ \operatorname{vec}\Gamma_0 = (I-A_1\otimes A_1)^{-1} \operatorname{vec}\Sigma_u. $$ So you don't need to work out the infinite sum from the MA$(\infty)$-representation.
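In case a numerical check is helpful, here is a minimal sketch of this vec/Kronecker solution in Python/NumPy (the matrices `A1` and `Sigma_u` are made-up illustrative values, not taken from the question):

```python
import numpy as np

# Made-up illustrative VAR(1) coefficient matrix and error covariance.
A1 = np.array([[0.5, 0.1],
               [0.2, 0.3]])
Sigma_u = np.array([[1.0, 0.3],
                    [0.3, 0.8]])

# Stationarity: all eigenvalues of A1 must lie strictly inside the unit circle.
assert np.max(np.abs(np.linalg.eigvals(A1))) < 1

K = A1.shape[0]

# vec(Gamma_0) = (I - A1 kron A1)^{-1} vec(Sigma_u), with column-stacking vec (order='F').
vec_Gamma0 = np.linalg.solve(np.eye(K * K) - np.kron(A1, A1),
                             Sigma_u.ravel(order="F"))
Gamma0 = vec_Gamma0.reshape(K, K, order="F")

# Check that Gamma_0 solves Gamma_0 = A1 Gamma_0 A1' + Sigma_u.
assert np.allclose(Gamma0, A1 @ Gamma0 @ A1.T + Sigma_u)
print(Gamma0)
```

If SciPy is available, `scipy.linalg.solve_discrete_lyapunov(A1, Sigma_u)` solves the same matrix equation directly.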

Jarle Tufto
    If I understand correctly, the result is the same but your representation is more convenient as it readily yields an empirically feasible solution (unlike the infinite sum). – Richard Hardy Dec 15 '16 at 10:19
  • Yes, it's just two different ways of expressing the same covariance matrix. – Jarle Tufto Dec 15 '16 at 10:21
  • Thanks for the answer. Quite what I was looking for. Not sure why the answer is not accepted. It should be. – Xbel Jul 27 '21 at 07:17

According to Lütkepohl (2005), p. 14-15, if we have a $K$-variate VAR(1) process of the form $$ y_t = \nu + A_1 y_{t-1} + u_t, $$ then the unconditional mean is $$ (I_K-A_1)^{-1}\nu $$ (where $I_K$ is an identity matrix of dimension $K\times K$) and the unconditional covariance for lag $h$ (i.e. $\text{Cov}(y_t,y_{t-h})$) is $$ \sum_{i=0}^\infty A_1^{h+i}\Sigma_u {A_1^i}' $$ where $\Sigma_u$ is the covariance matrix of the error term $u_t$. Then the unconditional variance can be obtained by taking $h=0$ in the above expression.
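As a rough numerical illustration of these formulas (Python/NumPy assumed; the matrices below are made up, not taken from Lütkepohl), the mean can be computed directly and $\Gamma_h$ approximated by truncating the infinite sum:

```python
import numpy as np

# Made-up illustrative VAR(1): y_t = nu + A1 y_{t-1} + u_t with Var(u_t) = Sigma_u.
A1 = np.array([[0.5, 0.1],
               [0.2, 0.3]])
nu = np.array([1.0, 2.0])
Sigma_u = np.array([[1.0, 0.3],
                    [0.3, 0.8]])
K = A1.shape[0]

# Unconditional mean: (I_K - A1)^{-1} nu.
mu = np.linalg.solve(np.eye(K) - A1, nu)

def Gamma(h, n_terms=500):
    """Approximate Cov(y_t, y_{t-h}) = sum_{i>=0} A1^(h+i) Sigma_u (A1^i)' by truncating the sum.

    The terms decay geometrically because all eigenvalues of A1 lie inside the unit circle,
    so a few hundred terms are more than enough here.
    """
    out = np.zeros((K, K))
    for i in range(n_terms):
        out += np.linalg.matrix_power(A1, h + i) @ Sigma_u @ np.linalg.matrix_power(A1, i).T
    return out

Gamma0 = Gamma(0)  # unconditional variance (h = 0)
Gamma1 = Gamma(1)  # lag-1 autocovariance; for a VAR(1) this equals A1 @ Gamma0
```

Up to truncation error, `Gamma0` agrees with the exact vec/Kronecker solution in the other answer.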

The same applies to VAR($p$) after having expressed the process in its alternative $Kp$-dimensional VAR(1) representation.
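A minimal sketch of that companion-form construction (Python assumed; `companion` is my own illustrative helper, not code from the book):

```python
import numpy as np

def companion(A_list):
    """Kp x Kp coefficient matrix of the VAR(1) representation of a K-variate VAR(p).

    A_list = [A_1, ..., A_p] are the K x K coefficient matrices of the original VAR(p).
    """
    K, p = A_list[0].shape[0], len(A_list)
    top = np.hstack(A_list)                       # [A_1  A_2  ...  A_p]
    bottom = np.hstack([np.eye(K * (p - 1)),      # identity blocks shift the stacked lags down
                        np.zeros((K * (p - 1), K))])
    return np.vstack([top, bottom])
```

The error covariance of the stacked process is $\Sigma_u$ padded with zeros, and the top-left $K\times K$ block of the companion process's $\Gamma_0$ is the unconditional variance of the original VAR($p$).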

These results are obtained using the vector moving-average (VMA) representation of the VAR(1) process.
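In case the intermediate step helps: under stationarity (all eigenvalues of $A_1$ inside the unit circle), repeated substitution yields the VMA($\infty$) form $$ y_t = (I_K - A_1)^{-1}\nu + \sum_{i=0}^\infty A_1^i u_{t-i}, $$ so the mean follows by taking expectations (the $u_{t-i}$ have zero mean), and since the $u_t$ are serially uncorrelated, $$ \operatorname{Cov}(y_t, y_{t-h}) = \sum_{i=0}^\infty A_1^{h+i}\Sigma_u {A_1^i}'. $$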

References

  • Lütkepohl, Helmut. New Introduction to Multiple Time Series Analysis. Springer Science & Business Media, 2005.
Richard Hardy
  • Is there a way to derive the unconditional variance? – Bonsaibubble Jan 28 '17 at 09:22
  • @Bonsaibubble, the answer by Jarle Tufto does that. – Richard Hardy Jan 28 '17 at 09:51
  • But how does this correspond to what Lütkepohl defines? – Bonsaibubble Jan 28 '17 at 10:07
  • @Bonsaibubble, both answers agree on the substance (they do not imply different things), they just use different approach and notation. – Richard Hardy Jan 28 '17 at 10:09
  • OK, I don't understand Tufto's approach then – Bonsaibubble Jan 28 '17 at 10:13
  • Is a proof of Lütkepohl approach available somewhere? – Bonsaibubble Jan 28 '17 at 10:16
  • @Bonsaibubble, p. 15 in Lütkepohl's textbook has it. – Richard Hardy Jan 28 '17 at 10:46
  • It defines the mean vector but a proof is not given – Bonsaibubble Jan 28 '17 at 11:02
  • It is not given in detail, but there are pointers to Appendices etc. that should make up for that. But if you cannot follow it, then of course another source would be more helpful. – Richard Hardy Jan 28 '17 at 11:05
  • @RichardHardy I don't own the book you are referring to. Could you define $h$? Why does $h=0$ correspond to the unconditional variance? What does it correspond to when $h \neq 0$? Many thanks – Hunter Aug 07 '19 at 08:17
  • @Hunter, I have updated my answer. $h$ is the time lag of autocovariance of $y_t$. A lag of $h=0$ corresponds to covariance of $y_t$ with itself (i.e. variance), while a lag of $h \neq 0$ corresponds to covariance between $y_t$ and $y_{t-h}$. – Richard Hardy Aug 07 '19 at 10:19
  • @RichardHardy thanks, makes a lot of sense now. – Hunter Aug 07 '19 at 10:28
  • @RichardHardy Hi Richard, I am a big fan of your answers, including this old [one](https://stats.stackexchange.com/questions/198844/high-level-overview-of-auto-arima-with-xreg-predictors) about ARIMAX and regressions with time-series errors. I am dealing with a regression with a GARCH error term and searching for good sources because I have some doubts about the way I have to write the likelihood function. ARIMAX models are OK as an alternative to a regression with GARCH errors... do you know any? – Fr1 Aug 07 '19 at 10:43
  • @RichardHardy as you may see from my question from yesterday [here](https://stats.stackexchange.com/questions/420946/writing-the-likelihood-and-conditional-variance-in-a-armax-model-or-regression-w), I got a bit confused about how to write the conditional variance of the error term in the likelihood – Fr1 Aug 07 '19 at 10:45
  • @Fr1, thanks for your kind words. I am rather busy these days and have not had time to check out your question yet. If you ping me again on Sunday or early next week, I will see if I could help you then. – Richard Hardy Aug 07 '19 at 11:01
  • @RichardHardy ok many thanks in advance! – Fr1 Aug 07 '19 at 11:02
  • @Fr1, just don't take it as a promise I will necessarily solve your problem. Bye until then. – Richard Hardy Aug 07 '19 at 12:11