
I have two questions:

1) When one says an ARMA process is 'stationary,' do they mean strongly stationary or weakly stationary?

2) Is there a quick way to find the variance of a stationary AR(2) model $$y_t = \beta_1 y_{t-1} + \beta_2 y_{t-2} + \epsilon_t?$$ The only way I can think of doing this is by multiplying by $y_t$, $y_{t-1}$ and $y_{t-2}$, taking expectations, and solving the resulting Yule-Walker system of 3 equations in 3 unknowns. The trick for AR(1) models, where one takes the variance of both sides, doesn't work here because you get a $\mathrm{Cov}(y_{t-1}, y_{t-2})$ term.

user369210
  • Regarding (1), it must depend on the context. An answer claiming otherwise would be careless, IMHO. – Richard Hardy Jan 19 '17 at 17:39
  • Regarding (2), the variance of the AR(2) model will depend on the first and second order autocovariances. As you say, in order to get its value you will have to solve the Yule-Walker equations; by doing so, I think you will get $Var(y_t)=\frac{(1-\beta_2)\sigma^2_\varepsilon}{(1+\beta_2)[(1-\beta_2)^2-\beta_1^2]}$. – javlacalle Jan 19 '17 at 18:10
  • +1 @RichardHardy Please allow me to add a plug for precisely communicating one's methods, and results! – Alexis Jan 15 '20 at 18:36

3 Answers


Stationarity

Considering your AR(2) process with mean-zero i.i.d. noise $\epsilon_t$ of variance $\sigma_\epsilon^2$,

$$ y_t = \beta_1 y_{t-1} + \beta_2 y_{t-2} + \epsilon_t \,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,(*) $$

we can rewrite it in terms of the lag operator $L$

$$ (1 - \beta_1 L - \beta_2 L^2)y_t = \epsilon_t $$

so that we have a new operator

$$ 1 - \beta_1 L - \beta_2 L^2 $$

The above is a polynomial in $L$; we call it the characteristic polynomial, and we denote its roots by $z_1^{-1}, z_2^{-1}$, so that the polynomial factors as $(1 - z_1 L)(1 - z_2 L)$. (Note the roots and the inverse roots $z_1, z_2$ are reciprocals of each other, and may be complex.)

It can be shown that the AR(2) is stationary when $|z_1| < 1$ and $|z_2|<1$ (equivalently, when both roots of the characteristic polynomial lie outside the unit circle), i.e. when all of the following hold: $$ |\beta_2| < 1 \\ \beta_2 + \beta_1 < 1 \\ \beta_2 - \beta_1 < 1 $$

For details, see this answer. For the more general ARMA(p, q) process, see Brockwell and Davis, Introduction to Time Series and Forecasting (2016), p. 74.
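
To check the root condition numerically, here is a minimal Python sketch (my addition, not part of the original derivation; the parameter values are only for illustration):

```python
import numpy as np

def is_stationary_ar2(beta1, beta2):
    """Check AR(2) stationarity via the roots of 1 - beta1*z - beta2*z^2."""
    # np.roots expects coefficients from the highest degree down:
    # -beta2*z^2 - beta1*z + 1
    roots = np.roots([-beta2, -beta1, 1.0])
    # Stationarity: all roots lie strictly outside the unit circle,
    # i.e. all inverse roots z_1, z_2 lie strictly inside it.
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stationary_ar2(0.5, 0.3))  # True: inside the stationarity triangle
print(is_stationary_ar2(0.5, 0.6))  # False: beta1 + beta2 >= 1
```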

Variance

If the process is stationary, the covariance is a function of the lag alone, so define the autocovariance function $\gamma(k) \doteq E[y_t y_{t+k}]$ (no centering is needed, since the process has mean zero). We can find the variance $\gamma(0)$ by squaring both sides of equation $(*)$ and taking expectations, with the following result

$$ \gamma(0) = \beta_1^2 \gamma(0) + \beta_2^2 \gamma(0) + 2 \beta_1 \beta_2 \gamma(1) + \sigma_\epsilon^2 \,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,(**) $$

Starting again from equation $(*)$, this time multiply both sides by $y_{t-k}$ for $k \geq 1$ and take expectations; the noise term drops out because $\epsilon_t$ is independent of $y_{t-k}$:

$$ \gamma(k) = \beta_1 \gamma(k-1) + \beta_2 \gamma(k-2) $$

We now have a recursion for the autocovariances, but here we only need it to compute $\gamma(1)$. Since $\gamma(-1) = \gamma(1)$ (examine the definition and note that, by stationarity, the covariance does not depend on $t$), we can let $k=1$ in the above equation, so that $\gamma(1) = \beta_1 \gamma(0) + \beta_2 \gamma(1)$, and solving gives

$$ \gamma(1) = \frac{\beta_1 \gamma(0)}{1-\beta_2} $$

Substituting into $(**)$, we get the variance

$$ \text{Var}(y_t) = \gamma(0) = \frac{(1-\beta_2)\sigma_\epsilon^2}{(1+\beta_2)(1 - \beta_1 - \beta_2)(1 + \beta_1 - \beta_2)} $$

As a sanity check, our stationarity conditions on $\beta_1, \beta_2$ described earlier are precisely the conditions which make our expression for $\text{Var}(y_t)$ positive.
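
As a further numerical check of both $\gamma(0)$ and $\gamma(1)$ (my addition; the parameter values are assumptions for illustration), one can simulate a long AR(2) path and compare sample moments to the closed forms:

```python
import numpy as np

rng = np.random.default_rng(0)
beta1, beta2, sigma_eps = 0.5, 0.3, 1.0

# Closed forms derived above.
gamma0 = ((1 - beta2) * sigma_eps**2
          / ((1 + beta2) * (1 - beta1 - beta2) * (1 + beta1 - beta2)))
gamma1 = beta1 * gamma0 / (1 - beta2)

# Simulate a long path; drop a burn-in so the sample is
# approximately from the stationary distribution.
n, burn = 500_000, 5_000
y = np.zeros(n + burn)
eps = rng.normal(0.0, sigma_eps, n + burn)
for t in range(2, n + burn):
    y[t] = beta1 * y[t - 1] + beta2 * y[t - 2] + eps[t]
y = y[burn:]

print(gamma0, np.var(y))                # ~2.244 for both
print(gamma1, np.mean(y[1:] * y[:-1]))  # ~1.603 for both
```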

The above is essentially a summary and rearrangement of some parts of these notes.

Strong or Weak Stationarity

I have only seen AR process stationarity argued in terms of a fixed mean and autocovariance function, so weak stationarity is implied. However, if the stationary distribution can be characterized completely by those first and second moments, then we also have strong stationarity. See A unified view of linear AR(1) models. G.K. Grunwald. 1996.

As an example, if we have an AR(1) process and $\epsilon_t$ is i.i.d. Gaussian, then the process' stationary distribution is also Gaussian. Since the Gaussian is fully specified by its first two moments, we have strong stationarity in that case. I am unsure whether or not this also applies to more general AR(p) Gaussian processes, or to AR(p) processes with other kinds of i.i.d. noise.
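
Here is a minimal simulation sketch of the Gaussian AR(1) case (my addition; parameter values are assumptions). The stationary law should be $N(0, \sigma_\epsilon^2 / (1 - \beta^2))$, so the sample variance should match that value, and the sample skewness and excess kurtosis should be near zero:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
beta, sigma_eps = 0.7, 1.0
n, burn = 200_000, 5_000

# Simulate a Gaussian AR(1) and drop a burn-in.
y = np.zeros(n + burn)
eps = rng.normal(0.0, sigma_eps, n + burn)
for t in range(1, n + burn):
    y[t] = beta * y[t - 1] + eps[t]
sample = y[burn:]

print(sigma_eps**2 / (1 - beta**2))  # stationary variance, ~1.961
print(np.var(sample))                # should be close
print(stats.skew(sample))            # ~0 for a Gaussian marginal
print(stats.kurtosis(sample))        # excess kurtosis ~0 for a Gaussian marginal
```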

kdbanman
  • I have seen the notes you mention above. However, I have not been able to derive your last equation from (**) and your $\gamma(1)$ equation. Could you please help? Thanks – Herman Jaramillo Feb 09 '22 at 20:07
  • Can you include more detail @Herman? I don’t understand your question yet. – kdbanman Feb 10 '22 at 15:57
  • Unfortunately there is no room here to show you my computations. I wish I could include a file in this comment. I cannot get your equation for $\mathrm{Var}(y_t)$ by doing what you are saying, that is, plugging the expression for $\gamma(1)$ into equation (**). Thanks – Herman Jaramillo Feb 11 '22 at 13:14
  • I did it my own way. I will post my computations. – Herman Jaramillo Feb 11 '22 at 17:07

Stack it, i.e. write the model as a VAR(1) for the vector $x_t = (y_t, y_{t-1})'$:

$$ x_t = \begin{pmatrix} \beta_1 & \beta_2 \\ 1 & 0 \end{pmatrix} x_{t-1} + \begin{pmatrix} \epsilon_t \\ 0 \end{pmatrix} = A x_{t-1} + u_t $$

The stationary covariance $\Sigma = \mathrm{Var}(x_t)$ satisfies the matrix equation $\Sigma = A \Sigma A' + \mathrm{Var}(u_t)$. Vectorize this equation and exploit $\mathrm{vec}(ABC) = (C' \otimes A)\,\mathrm{vec}(B)$, so that $\mathrm{vec}(\Sigma) = (I - A \otimes A)^{-1}\,\mathrm{vec}(\mathrm{Var}(u_t))$.
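
A minimal numerical sketch of this vectorization approach (my addition; parameter values are assumptions matching the example above):

```python
import numpy as np

beta1, beta2, sigma_eps = 0.5, 0.3, 1.0

# Companion (VAR(1)) form: x_t = A x_{t-1} + u_t, with Var(u_t) = Q.
A = np.array([[beta1, beta2],
              [1.0,   0.0]])
Q = np.array([[sigma_eps**2, 0.0],
              [0.0,          0.0]])

# Sigma = A Sigma A' + Q. Using vec(ABC) = (C' kron A) vec(B):
# vec(Sigma) = (I - A kron A)^{-1} vec(Q).
vec_Q = Q.reshape(-1, order='F')  # column-major vec()
vec_Sigma = np.linalg.solve(np.eye(4) - np.kron(A, A), vec_Q)
Sigma = vec_Sigma.reshape(2, 2, order='F')

print(Sigma[0, 0])  # Var(y_t) ~ 2.244, matching the closed form above
print(Sigma[0, 1])  # gamma(1) ~ 1.603
```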

  • Welcome to CV, Harald Uhlig. Your answer would be made much more clear by editing it to use [MathJax](https://math.meta.stackexchange.com/questions/5020/mathjax-basic-tutorial-and-quick-reference) (a LaTeX-like mark-up language). You can get a feel for this by examining a CV question or answer by clicking its "edit" button (not that you need to actually suggest an edit when doing so! :). You can similarly edit your own answer using the "edit" link at lower left. – Alexis Jan 15 '20 at 18:40

From $X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + Z_t$, where $Z_t$ is white noise with variance $\sigma_Z^2$,

\begin{eqnarray*} E[ X_t X_{t-k}] &=& \phi_1 E[X_{t-1} X_{t-k}] + \phi_2 E[X_{t-2} X_{t-k}] + E[Z_t X_{t-k}] \\ \gamma_k &=& \phi_1 \gamma_{k-1} + \phi_2 \gamma_{k-2} + E[Z_t X_{t-k}] \quad (*) \end{eqnarray*} Now, \begin{eqnarray} E[Z_t X_{t-k}] = \left \{ \begin{array}{cc} \sigma_Z^2 & \text{if } k = 0 \\ 0 & \text{if } k=1,2, \cdots \end{array} \right . \quad (**) \end{eqnarray}

Combining (*) with (**) and using the symmetry property of the autocovariance function, $\gamma_k = \gamma_{-k}$, we obtain

\begin{eqnarray} \sigma^2 = \gamma_0 &=& \phi_1 \gamma_{1} + \phi_2 \gamma_{2} + \sigma_Z^2 \quad , \quad (****) \\ \gamma_k &=& \phi_1 \gamma_{k-1} + \phi_2 \gamma_{k-2} \quad , \quad k=1,2, \cdots. \quad , \quad (***) \end{eqnarray}

We need to find $\gamma_1$ and $\gamma_2$. From (***) with $k=1$, \begin{eqnarray} \gamma_1 = \phi_1 \gamma_0 + \phi_2 \gamma_{-1} = \phi_1 \gamma_0 + \phi_2 \gamma_1 \\ \gamma_1 = \frac{\phi_1 \gamma_0}{1 - \phi_2} = \frac{\phi_1 \sigma^2}{1 - \phi_2} \end{eqnarray} From (***) with $k=2$ we find that \begin{eqnarray} \gamma_2 = \phi_1 \gamma_1 + \phi_2 \gamma_0 = \phi_1 \frac{\phi_1 \sigma^2}{1-\phi_2} + \phi_2 \sigma^2 = \sigma^2 \left ( \frac{\phi_1^2 + \phi_2(1 - \phi_2) }{1 - \phi_2}\right ) \quad , \quad (*****) \end{eqnarray}

Then, plugging these into (****), $$ \sigma^2 = \phi_1 \left ( \frac{\phi_1 \sigma^2}{1 - \phi_2} \right ) + \phi_2 \sigma^2 \left ( \frac{\phi_1^2 + \phi_2(1 - \phi_2) }{1 - \phi_2}\right ) + \sigma_Z^2$$

That is,

\begin{eqnarray} \sigma^2 \left [1 - \left ( \frac{\phi_1^2}{1 - \phi_2} \right ) -\phi_2 \left ( \frac{\phi_1^2 + \phi_2(1 - \phi_2) }{1 - \phi_2}\right ) \right ] = \sigma_Z^2 \end{eqnarray}

We simplify the quantity in brackets:

\begin{eqnarray} 1 - \left ( \frac{\phi_1^2}{1 - \phi_2} \right ) -\phi_2 \left ( \frac{\phi_1^2 + \phi_2(1 - \phi_2) }{1 - \phi_2}\right ) &=&\frac{1 -\phi_2 - \phi_1^2 - \phi_2 \phi_1^2 - \phi_2^2 + \phi_2^3}{1 - \phi_2} \\ &=& \frac{(1 - \phi_2^2) - \phi_1^2(1 + \phi_2) + \phi_2(\phi_2^2 -1)}{1 - \phi_2} \\ &=& \frac{(1-\phi_2)(1 + \phi_2) - \phi_1^2(1 + \phi_2) - \phi_2(1 -\phi_2)(1 + \phi_2) }{1-\phi_2} \\ &=& \frac{(1 + \phi_2) (1 - \phi_2 - \phi_1^2 - \phi_2(1 - \phi_2))}{1 - \phi_2} \\ &=& \frac{(1 + \phi_2) (1 - 2 \phi_2 + \phi_2^2 - \phi_1^2)}{1 - \phi_2} \\ &=& \frac{(1 + \phi_2)[( 1 - \phi_2)^2 - \phi_1^2]}{1 - \phi_2} \\ &=& \frac{(1+ \phi_2)( 1 - \phi_2 - \phi_1)(1 - \phi_2 + \phi_1)}{1 - \phi_2} \end{eqnarray} We find then that \begin{eqnarray} \sigma^2 = \frac{(1 - \phi_2) \sigma_Z^2}{(1+ \phi_2)( 1 - \phi_2 - \phi_1) (1 - \phi_2 + \phi_1)} \end{eqnarray}
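
As a quick symbolic check of the bracket simplification (my addition), sympy confirms the factorization:

```python
import sympy as sp

phi1, phi2 = sp.symbols('phi1 phi2')

# Bracketed quantity before simplification.
bracket = 1 - phi1**2/(1 - phi2) - phi2*(phi1**2 + phi2*(1 - phi2))/(1 - phi2)

# Claimed factored form.
factored = (1 + phi2)*(1 - phi2 - phi1)*(1 - phi2 + phi1)/(1 - phi2)

print(sp.simplify(bracket - factored))  # 0, so the two expressions agree
```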