
Definition of an AR(1) process

An autoregressive process generates a time series from a stochastic difference equation:

\begin{align} X_t = c + \phi \, X_{t-1} + \epsilon_t \end{align}

Typically, the noise terms $\epsilon_t$ are chosen to be i.i.d. normally distributed and $0 < \phi < 1$.
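For concreteness, such a series can be simulated directly from the recursion. A minimal sketch in Python (the parameter values and the function name are arbitrary, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(c, phi, sigma, n, burn_in=1_000):
    """Simulate X_t = c + phi * X_{t-1} + eps_t with eps_t ~ N(0, sigma^2)."""
    eps = rng.normal(0.0, sigma, size=n + burn_in)
    x = np.empty(n + burn_in)
    x[0] = c / (1 - phi)               # start at the stationary mean
    for t in range(1, n + burn_in):
        x[t] = c + phi * x[t - 1] + eps[t]
    return x[burn_in:]                 # drop the burn-in so the series is stationary

x = simulate_ar1(c=1.0, phi=0.7, sigma=1.0, n=100_000)
```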

What I already know

As described on Wikipedia, as well as on Stack Exchange, one can derive the mean and the variance of the resulting time series if one knows the parameters $c$ and $\phi$, as well as the distribution of $\epsilon$.

\begin{align} Mean(X_t) & = Mean(c + \phi \, X_{t-1} + \epsilon) \\ & = Mean(c) + Mean(\phi \, X_{t-1}) + Mean(\epsilon) \\ & = c + \phi \, Mean(X_{t-1}) + 0 \\ \end{align}

Assuming that $Mean(X_{t}) = Mean(X_{t-1})$, it follows that:

\begin{align} Mean(X_t) = \frac{c}{1 - \phi} \end{align}

Similarly for the variance:

\begin{align} Var(X_t) & = Var(c + \phi \, X_{t-1} + \epsilon) \\ & = Var(c) + Var(\phi \, X_{t-1}) + Var(\epsilon) \\ & = 0 + \phi^2 \, Var(X_{t-1}) + \sigma_{\epsilon}^2 \\ \end{align}

Assuming that $Var(X_{t}) = Var(X_{t-1})$, it follows that:

\begin{align} Var(X_t) = \frac{\sigma_{\epsilon}^2}{1 - \phi^2} \end{align}
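Both stationarity results are easy to check by simulation. A minimal sketch (parameter values chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

c, phi, sigma = 0.5, 0.8, 2.0          # arbitrary illustrative parameters
n, burn_in = 200_000, 1_000

eps = rng.normal(0.0, sigma, size=n + burn_in)
x = np.empty(n + burn_in)
x[0] = c / (1 - phi)                    # start at the stationary mean
for t in range(1, n + burn_in):
    x[t] = c + phi * x[t - 1] + eps[t]
samples = x[burn_in:]

print(samples.mean(), c / (1 - phi))            # both near 2.5
print(samples.var(), sigma**2 / (1 - phi**2))   # both near 11.1
```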

My question

How can this be generalised to higher order moments, central moments and/or cumulants?

Following this post, is it correct to assume that:

\begin{align} \mu_k(X_t) = \frac{\mu_k(\epsilon)}{1 - \phi^k} \end{align}
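One way to probe this conjecture numerically is to drive the recursion with skewed noise and compare the sample third central moment (which equals the third cumulant) against $\mu_3(\epsilon)/(1-\phi^3)$. A sketch using centred exponential noise, whose $k$-th cumulant is $(k-1)!$:

```python
import numpy as np

rng = np.random.default_rng(1)

phi = 0.5
n, burn_in = 500_000, 1_000

# Centred exponential noise: third cumulant (= third central moment) is 2.
eps = rng.exponential(1.0, size=n + burn_in) - 1.0

x = np.empty(n + burn_in)
x[0] = 0.0
for t in range(1, n + burn_in):
    x[t] = phi * x[t - 1] + eps[t]
x = x[burn_in:]

mu3_sample = np.mean((x - x.mean())**3)
mu3_theory = 2.0 / (1 - phi**3)     # kappa_3(eps) / (1 - phi^3)
print(mu3_sample, mu3_theory)       # both around 2.29
```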

On Wikipedia, it is noted that:

...the central limit theorem indicates that $X_t$ will be approximately normally distributed when $\phi$ is close to one.

This makes me doubt whether the above generalisation is correct.

Secondly, $\mu_k$ refers to cumulants. Is there a similar expression for moments and/or central moments?

Context

I would like to end up with an AR(1) process which generates data with a particular non-normal distribution. I want to be able to specify the desired mean, standard deviation and skewness and subsequently work back what distribution I need for $\epsilon$ to give me the expected result.
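Assuming the cumulant relation above holds, the back-calculation for the first three cumulants is straightforward: $\mu_k(\epsilon) = \mu_k(X_t)\,(1 - \phi^k)$. A hypothetical helper (the name `noise_cumulants` and its signature are my own, purely illustrative):

```python
def noise_cumulants(mean_x, var_x, skew_x, phi):
    """Given the desired stationary mean, variance and skewness of X_t
    and a chosen phi, return the intercept c and the second and third
    cumulants the noise eps must have, via kappa_k(eps) = kappa_k(X) * (1 - phi^k)."""
    kappa3_x = skew_x * var_x**1.5        # third cumulant from skewness
    c = mean_x * (1 - phi)                # intercept, from Mean(X) = c / (1 - phi)
    kappa2_eps = var_x * (1 - phi**2)
    kappa3_eps = kappa3_x * (1 - phi**3)
    return c, kappa2_eps, kappa3_eps

print(noise_cumulants(3.0, 4.0, 1.0, 0.5))   # (1.5, 3.0, 7.0)
```

One would then still have to pick a concrete noise distribution matching those cumulants (e.g. a shifted gamma).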

LBogaardt
  • The AR(1) process with normal noise is a discrete-time version of the Ornstein-Uhlenbeck Markov process, which has as stationary distribution the Gaussian distribution. If you replace $X_{t-1}$ with the log-gradient of your non-normal distribution, you will approximately draw samples from that distribution. – Forgottenscience Sep 23 '19 at 14:31
  • @Forgottenscience What do you mean with 'log-gradient'? – LBogaardt Sep 23 '19 at 16:06
  • Say your distribution of interest is $p(x)$; then the derivative of $\log p(x)$ with respect to $x$ would be the log-derivative; if $x$ were a vector it would be the log-gradient. See for example https://en.wikipedia.org/wiki/Metropolis-adjusted_Langevin_algorithm – Forgottenscience Sep 23 '19 at 16:21

1 Answer


Your formulas for $\mbox{Var}X_t$ and the higher-order cumulants $\mu_k(X_t)$ are both correct, but note that the skewness of $X_t$ (the third standardised moment) is
$$ \mbox{Skew}X_t = \frac{\mu_3(X_t)}{\mu_2(X_t)^{3/2}}=\frac{\mu_3(\epsilon_t)(1-\phi^2)^{3/2}}{\mbox{Var}(\epsilon_t)^{3/2}(1-\phi^3)}=\mbox{Skew}(\epsilon_t)\frac{(1-\phi^2)^{3/2}}{1-\phi^3}. $$ In contrast to $\mu_3(X_t)$, this goes to zero as $\phi$ goes to 1, as expected from the central limit theorem, no matter how skewed the white noise is.
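This prediction can be checked by simulation. A sketch (arbitrary parameters) driving the recursion with centred exponential noise, whose skewness is 2:

```python
import numpy as np

rng = np.random.default_rng(7)

phi = 0.9
n, burn_in = 1_000_000, 1_000

# Centred exponential noise: Skew(eps) = 2.
eps = rng.exponential(1.0, size=n + burn_in) - 1.0

x = np.empty(n + burn_in)
x[0] = 0.0
for t in range(1, n + burn_in):
    x[t] = phi * x[t - 1] + eps[t]
x = x[burn_in:]

d = x - x.mean()
skew_sample = np.mean(d**3) / np.mean(d**2)**1.5
skew_theory = 2.0 * (1 - phi**2)**1.5 / (1 - phi**3)
print(skew_sample, skew_theory)   # both around 0.61
```

Note how the skewness of $X_t$ is far smaller than the skewness 2 of the noise, consistent with the damping factor above.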

Jarle Tufto
  • Interesting. So adjusting the distribution of $X_t$ is not possible by adjusting the distribution of $\epsilon$ as it will always tend to the normal distribution? What other strategies are there to obtain a non-normal distribution for $X_t$? – LBogaardt Sep 23 '19 at 14:48
  • No, as long as $\phi<1$ you are not in this limiting case, so you can get any skew you want in $X_t$ since there is no upper bound on the skew of $\epsilon_t$. – Jarle Tufto Sep 23 '19 at 14:53
  • True, but for $\phi \approx 1$, it would require quite extreme values of $\epsilon$ to achieve the desired distribution of $X_t$. Perhaps it is easier (more stable?) to use a transformation $f$ on $X$ such that $f(X_t)$ is normally distributed. (Or would these end up mathematically equivalent to each other? Thinking out loud here...) – LBogaardt Sep 23 '19 at 16:00
  • But for non-linear $f$ the autocovariance function of the transformed process $Y_t=f(X_t)$ would not have exponential decay, so $Y_t$ would not be AR(1). – Jarle Tufto Sep 23 '19 at 16:12
  • So you're saying that non-linearly transforming an $AR(1)$ process yields (in general) an $AR(\infty)$. That might be something I am willing to accept. Do you (or anyone else) know of alternative methods of getting a mean-reverting Markov time series with a non-normal distribution? – LBogaardt Sep 23 '19 at 20:18
  • This seems related: https://stats.stackexchange.com/questions/180109/generate-a-random-variable-which-follow-gamma-distribution-and-ar1-process-sim – Jarle Tufto Sep 24 '19 at 08:15
  • Also note that cumulants are identical to central moments only up to order 3 (see https://en.wikipedia.org/wiki/Cumulant#Cumulants_and_moments). The expression above generalises easily for cumulants, but requires additional terms for central moments. – LBogaardt Sep 24 '19 at 14:05
  • @JarleTufto - could these results be applied to find the cumulants of a general ARMA(p,q) process ? If so, how? – hydrologist Jan 11 '22 at 15:17