I would like to understand how GARCH models work but I'm having some problems. I have a highly persistent AR time series and I would like to model the conditional mean as well as its conditional variance. I thought of 2 possible ways:
- Estimate an AR(1) model, obtain the residuals, fit a GARCH(1,1) to the residuals. The first model models the mean, the second model the variance.
- Estimate an ARMA-GARCH model.
I assumed, that both ways should yield the same results in terms of coefficients but that's not the case. Why do the AR coefficients differ?
I have created a reprex in R. You only need the fGarch
and tseries
packages besides base.
library(fGarch)
library(tseries)
# Simulate time series with heteroscedasticity
set.seed(1)
y <- c(0)
for(i in 2:5000) {
y[i] <- 0.99*y[i-1] + rnorm(1, sd = (sin(i/300)+1)^2)
}
plot(y, type = "l")
# Model Estimation
# Estimate AR 1
model_arma <- arma(y, order = c(1,0,0))
model_arma$coef
#> ar1 intercept
#> 0.9820884 -0.0391143
resids <- residuals(model_arma)[-1]
# Estimate Garch(1,1)
model_garch <- garchFit(~garch(1,1), trace = FALSE)
model_garch@fit$matcoef
#> Estimate Std. Error t value Pr(>|t|)
#> mu -0.006190316 0.008461996 -0.7315433 4.644474e-01
#> omega 0.010761385 0.002837514 3.7925399 1.491143e-04
#> alpha1 0.153134061 0.026421658 5.7957780 6.800519e-09
#> beta1 0.805973743 0.033381268 24.1444916 0.000000e+00
# Estimate Arma-Garch
model_armagarch <- garchFit(~arma(1,0) + garch(1,1), trace = FALSE)
model_armagarch@fit$matcoef
#> Estimate Std. Error t value Pr(>|t|)
#> mu -0.006097132 0.008400905 -0.7257709 4.679792e-01
#> ar1 0.051377942 0.025641637 2.0036920 4.510307e-02
#> omega 0.011189102 0.002820163 3.9675370 7.261924e-05
#> alpha1 0.157402606 0.026259893 5.9940306 2.047028e-09
#> beta1 0.799952222 0.032890817 24.3214454 0.000000e+00
Created on 2019-12-20 by the reprex package (v0.3.0)