I'm interested to know a bit more about what the scale parameter of the dccfit fit.control option does.

Here is the code for the model I am estimating:

Specifications:

xspec = ugarchspec(mean.model = list(armaOrder = c(1, 0)),
    variance.model = list(garchOrder = c(1, 1), model = 'gjrGARCH'),
    distribution.model = 'norm')
uspec = multispec(replicate(2, xspec)) # 2 is the number of variables

dccspec = dccspec(uspec = uspec, dccOrder = c(1, 1), model='aDCC', 
   distribution = 'mvnorm')

Estimation: Option 1) NO SCALING:

dcc.fit.focast = dccfit(dccspec, data = tst, out.sample = 2600,
    fit.control = list(eval.se = TRUE))

Option 2) SCALING:

dcc.fit.focast = dccfit(dccspec, data = tst, out.sample = 2600,
    fit.control = list(eval.se = TRUE, scale = TRUE))

The difference between option 1 and option 2 is that the latter adds one extra argument to fit.control:

scale = TRUE

The parameter estimates are very different between these two options. I would like a reference I can read about this, just to make sure I am not missing anything when using this option. Scaling the data yields successful estimates that could not be achieved without it; when scaling is off, my GARCH parameters are stuck at fixed values.

Which is the correct procedure here?

Here is the comparison I got when switching the scaling option on and off (produced with something like the sketch below; the fit object names are illustrative). The model is a standard GJR-GARCH(1,1) with a proper ARFIMA model.
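A minimal sketch of how the comparison was produced, assuming dccspec and tst are defined as above; dcc.fit.1 and dcc.fit.2 are hypothetical names for the option 1 and option 2 fits:

library(rmgarch)

dcc.fit.1 = dccfit(dccspec, data = tst, out.sample = 2600,
    fit.control = list(eval.se = TRUE))                # option 1: no scaling
dcc.fit.2 = dccfit(dccspec, data = tst, out.sample = 2600,
    fit.control = list(eval.se = TRUE, scale = TRUE))  # option 2: scaling

# Put the two sets of parameter estimates side by side
round(cbind(no.scaling = coef(dcc.fit.1), scaling = coef(dcc.fit.2)), 6)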

Parameter estimates of option 1 (NO SCALING):

alpha1 0.050000, beta1  0.900000, gamma1 0.050000 

Parameter estimates of option 2 (SCALING):

alpha1 0.032907, beta1  0.794693, gamma1 0.056336 
  • I think you might not have a *proper ARFIMA model* there. Looking at the code, it seems that your conditional mean model is an AR(1), or ARMA(1,0); a sketch of an actual ARFIMA specification follows these comments. This need not be a problem, though. – Richard Hardy Dec 19 '17 at 13:58
  • I thought about this and tried running the model with 1) log returns and 2) residuals of a proper ARFIMA model. Of course, when I ran the model with the residuals, I adjusted the specification of "mean.model". In both cases the parameter estimates of the GARCH model were the same when scaling was turned off, meaning that the problem is still there. – Martin Dec 21 '17 at 00:31
  • I meant the problem of having AR(1) in place of ARFIMA, not the problem your post focuses on. I just wanted to point out that the specification of the conditional mean might not be the one you think it is. – Richard Hardy Dec 21 '17 at 08:24
  • Let me know if the answer is unclear or unhelpful. Otherwise, you may consider accepting it by clicking on the tick mark to the left of it. This is [how Cross Validated works](https://stats.stackexchange.com/tour). – Richard Hardy Dec 30 '17 at 11:19
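For reference, here is a sketch of how an ARFIMA conditional mean could be specified in rugarch, in contrast to the AR(1) specification in the question; the ARFIMA(1, d, 1) order is purely illustrative:

library(rugarch)

# arfima = TRUE switches on fractional differencing, so the fractional
# order d is estimated along with the ARMA terms.
xspec.arfima = ugarchspec(
    mean.model = list(armaOrder = c(1, 1), arfima = TRUE),
    variance.model = list(garchOrder = c(1, 1), model = 'gjrGARCH'),
    distribution.model = 'norm')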

1 Answer

Checking the help file for the dccfit function, you find that scale governs the univariate GARCH estimation. Checking the help file for the ugarchfit function, you see that

The scale parameter controls whether the data should be scaled before being submitted to the optimizer.

Getting different results with scale=TRUE and scale=FALSE indicates that the optimizer is sensitive to the scale of the data. It is difficult to tell which result (scaled or unscaled) is more correct, but I would expect the scaling to be carried out in a smart way (though I found no details on that in the help files), so that the optimizer performs better (converges closer to the global optimum and/or converges faster) with scaling than without. Hence, I would expect the result from scale=TRUE to be the more trustworthy one. However, for some reason the default in the dccfit function is scale=FALSE, for which I do not have a good explanation.
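To see this scale sensitivity directly, here is a small experiment of my own (not from the help files), assuming xspec and tst from the question: fit the same univariate spec to a raw series and to the same series multiplied by 100, i.e. decimal vs. percent returns. Under rescaling, only mu and omega should change, while alpha1, beta1 and gamma1 are scale-invariant, so any further differences come from the optimizer rather than the model.

library(rugarch)

fit.raw = ugarchfit(xspec, data = tst[, 1])        # raw (e.g. decimal) returns
fit.pct = ugarchfit(xspec, data = 100 * tst[, 1])  # the same series in percent

# alpha1, beta1 and gamma1 should agree if the optimizer copes with both
# scales; mu and omega differ by factors of 100 and 100^2 by construction.
round(cbind(raw = coef(fit.raw), pct = coef(fit.pct)), 6)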

  • Needless to say, DCC is a two-stage approach, and in the second stage, where the DCC parameters are estimated, the residuals are standardized by design. In the first stage, the residuals are not standardized. That is probably why the default in the `dccfit` function is `scale=FALSE`: there is no standardization in the first stage, where the GARCH model parameters are estimated. What I don't understand is why the model gives the same parameter estimates even for different time series when scaling is off. – Martin Dec 21 '17 at 00:38
  • I used to run the model with scaling off and did not have this problem; the parameter estimates all made sense. For some reason, it now produces this weird outcome. – Martin Dec 21 '17 at 00:43
  • The `scale` option handles a different form of standardization, not the one where the raw residuals are scaled by the estimated standard deviations from the univariate GARCH models. What I believe `scale` does is scale the original data before submitting it to the optimization routines and then scale the results back after the optimization has taken place; a sketch of that idea follows below. Now, regarding the results you get with scaling off, could you be more specific? What do you mean by *the model gives the same parameter estimates even for different time series*? – Richard Hardy Dec 21 '17 at 08:28
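A minimal sketch of that believed mechanism, assuming xspec and tst from the question; this is a guess at what the option does, not rmgarch's actual internals:

library(rugarch)

s = sd(tst[, 1])
z = tst[, 1] / s                    # standardize the series before optimization

fit.z = ugarchfit(xspec, data = z)  # fit on the scaled data
cf = coef(fit.z)

# Map the scale-dependent estimates back to the original units: for
# y = s * z we have mu_y = s * mu_z and omega_y = s^2 * omega_z, while
# ar1, alpha1, beta1 and gamma1 are scale-invariant.
cf["mu"]    = s * cf["mu"]
cf["omega"] = s^2 * cf["omega"]
cf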