Is it possible to fit the multivariate Gaussian models implied by mixed models through generalised least squares in R, using, for instance, the gls function?
For instance, the random intercept model via lme is

library(nlme)  # provides lme(), gls(), and the Machines data
mod.lme <- lme(score ~ Machine, random = ~ 1 | Worker, data = Machines)
s2.lme <- as.numeric(VarCorr(mod.lme)[2, 1])        # residual variance
s2.ranef.lme <- as.numeric(VarCorr(mod.lme)[1, 1])  # random-effect (Worker) variance
(tot.var.lme <- s2.lme + s2.ranef.lme)              # sum of the variance components
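To make concrete which multivariate Gaussian model this implies, the marginal covariance matrix for one Worker can be inspected (a quick check using nlme's getVarCov; the comment on its structure just restates the usual random-intercept algebra):

# marginal covariance implied by the random-intercept fit, for the first Worker:
# compound symmetric, with s2.ranef.lme + s2.lme on the diagonal and
# s2.ranef.lme off the diagonal
getVarCov(mod.lme, type = "marginal")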
The corresponding Gaussian model can be fitted by gls as follows:

mod.gls <- gls(score ~ Machine, correlation = corCompSymm(form = ~ 1 | Worker),
               data = Machines)
# second apVar parameter is the log residual SD; exp()^2 gives the marginal variance
(tot.var.gls <- as.numeric(exp(attributes(mod.gls$apVar)$Pars[2]))^2)
mSt <- mod.gls$modelStruct
cSt <- mSt$corStruct
(rho <- coef(cSt, unconstrained = FALSE))           # within-Worker correlation
all.equal(tot.var.gls, tot.var.lme)                 # total variances equal?
s2.ranef.gls <- rho * tot.var.gls                   # variance of the random effect from gls
all.equal(as.numeric(s2.ranef.gls), s2.ranef.lme)   # equal to the lme estimate?
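As a further check that the two fits describe the same multivariate Gaussian distribution, one can compare the implied marginal covariance matrices and the maximised (REML) log-likelihoods; this sketch assumes nlme's getVarCov methods for lme and gls objects:

# marginal covariance of one Worker, from each fit; the matrices should agree
getVarCov(mod.lme, type = "marginal")
getVarCov(mod.gls)
# the REML log-likelihoods should also agree
c(logLik(mod.lme), logLik(mod.gls))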
But what about this?

mod2.lme <- lme(score ~ Machine, random = ~ 1 | Worker/Machine, data = Machines)
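For reference, here is a sketch of the marginal structure a gls fit would have to reproduce (the VarCorr row indices below assume the usual layout of its output for this nested fit): every observation has the same variance, but there are now two distinct within-Worker correlations, so a single corCompSymm(form = ~ 1 | Worker) correlation would not be enough on its own.

vc2 <- VarCorr(mod2.lme)
s2.worker  <- as.numeric(vc2[2, 1])  # Worker variance
s2.machine <- as.numeric(vc2[4, 1])  # Machine-within-Worker variance
s2.resid   <- as.numeric(vc2[5, 1])  # residual variance
(tot.var2 <- s2.worker + s2.machine + s2.resid)
# correlation between observations on the same Worker and the same Machine
(rho.same.machine <- (s2.worker + s2.machine) / tot.var2)
# correlation between observations on the same Worker but different Machines
(rho.diff.machine <- s2.worker / tot.var2)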
How would you fit it with gls? Is it possible?