
I need to estimate a univariate GARCH model with the following description.

My model (regression):

Y_t = mu + gamma * G_t + e_t
G_t is GIVEN (observed)

Where the crucial part is that: e_t = sqrt(G_t) * sqrt(h_t) * z_t
where z_t ~ N(0,1)

Variance equation:

h_t = omega + alpha * e_{t-1}^2 + beta * h_{t-1}   # GARCH(1,1)
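For clarity, here is a minimal simulation of this data-generating process. The parameter values and the simulated g are purely illustrative stand-ins for my real data (in my application G_t is observed, not simulated):

    set.seed(1)
    n     <- 1000
    g     <- rgamma(n, shape = 2, rate = 2)   # stand-in for the observed G_t
    mu    <- 0.1
    gamma <- 0.5
    omega <- 0.05
    alpha <- 0.1
    beta  <- 0.85

    h <- numeric(n)
    e <- numeric(n)
    h[1] <- omega / (1 - alpha - beta)        # start at the unconditional variance
    e[1] <- sqrt(g[1]) * sqrt(h[1]) * rnorm(1)
    for (t in 2:n) {
      h[t] <- omega + alpha * e[t - 1]^2 + beta * h[t - 1]   # GARCH(1,1) recursion
      e[t] <- sqrt(g[t]) * sqrt(h[t]) * rnorm(1)             # error scaled by sqrt(G_t)
    }
    Y <- mu + gamma * g + e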

I am using the rugarch package in R and know how to add a regressor to the mean equation:

spec_1 <- ugarchspec(
  variance.model = list(model = "sGARCH", garchOrder = c(1, 1),
                        submodel = NULL, external.regressors = NULL,
                        variance.targeting = FALSE),
  mean.model = list(armaOrder = c(0, 0), include.mean = TRUE,
                    external.regressors = matrix(g)),   # G_t as mean regressor
  distribution.model = "norm")

garch_1 <- ugarchfit(spec = spec_1, data = Y[, 1], solver.control = list(trace = 0))
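If it helps, the fitted coefficients can be inspected with coef(); if I recall the rugarch labelling correctly, the external mean regressor shows up as mxreg1:

    coef(garch_1)
    # mu, mxreg1 (coefficient on G_t), omega, alpha1, beta1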

My question now is: how can I multiply the error term by the square root of this observed g?

I tried to manipulate the equation by dividing both sides by sqrt(g), but this leads to misestimating the GARCH parameters.
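To make that attempt concrete: dividing by sqrt(G_t) gives Y_t/sqrt(G_t) = mu * (1/sqrt(G_t)) + gamma * sqrt(G_t) + sqrt(h_t) * z_t, so I fitted something roughly like the sketch below (variable names are only illustrative). As far as I can tell, the issue is that the GARCH recursion then runs on the rescaled residuals rather than on the original e_{t-1}^2:

    y      <- Y[, 1]          # the series used in ugarchfit above
    y_star <- y / sqrt(g)     # divide both sides by sqrt(G_t)

    spec_star <- ugarchspec(
      variance.model = list(model = "sGARCH", garchOrder = c(1, 1)),
      mean.model = list(armaOrder = c(0, 0), include.mean = FALSE,
                        external.regressors = cbind(1 / sqrt(g), sqrt(g))),
      distribution.model = "norm")

    garch_star <- ugarchfit(spec = spec_star, data = y_star)

    # Problem: the GARCH(1,1) recursion inside rugarch now uses the rescaled
    # residuals (e_t / sqrt(G_t))^2 instead of e_{t-1}^2, so the fitted
    # alpha/beta do not correspond to the original variance equation.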

(Info: this specification arose from the E-step of the EM algorithm for the generalized hyperbolic distribution.)
