I am running what I believe to be the same mixed-effects model with a negative binomial distribution (log link) in both the lme4 and glmmTMB packages in R. The code is shown below:
library(lme4)
mNB <- glmer.nb(size ~ scale(Group1) + (1 | PairID), weights = w, verbose = TRUE, data = imp.NB)
which gives the warning
boundary (singular) fit: see ?isSingular
and the following summary:
summary(mNB)
Generalized linear mixed model fit by maximum likelihood (Laplace Approximation) ['glmerMod']
Family: Negative Binomial(0.0515) ( log )
Formula: size ~ scale(Group1) + (1 | PairID)
Data: imp.NB
Weights: w
AIC BIC logLik deviance df.resid
-451160.3 -451137.1 225584.1 -451168.3 2458
Scaled residuals:
Min 1Q Median 3Q Max
17.57 21.73 79.29 218.52 2655.68
Random effects:
Groups Name Variance Std.Dev.
PairID (Intercept) 0 0
Number of obs: 2462, groups: PairID, 183
Fixed effects:
Estimate Std. Error z value Pr(>|z|)
(Intercept) -4.491636 0.007533 -596.276 < 2e-16 ***
scale(Group1) -0.031555 0.007361 -4.287 1.81e-05 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Correlation of Fixed Effects:
(Intr)
scale(Grp1) -0.023
convergence code: 0
boundary (singular) fit: see ?isSingular
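For reference, the singular fit can be confirmed directly on the fitted object (a minimal check, assuming the mNB object from above):

isSingular(mNB, tol = 1e-4)  # TRUE: the random-intercept variance is estimated at exactly 0
VarCorr(mNB)                 # shows the zero PairID (Intercept) standard deviation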
Meanwhile, the glmmTMB model below gives no complaints, and its summary is:
library(glmmTMB)
t2NB <- glmmTMB(size ~ scale(Group1) + (1 | PairID), weights = w, family = nbinom2, ziformula = ~0, verbose = TRUE, data = imp.NB)
summary(t2NB)
Family: nbinom2 ( log )
Formula: size ~ scale(Group1) + (1 | PairID)
Data: imp.NB
Weights: w
AIC BIC logLik deviance df.resid
86472.4 86495.6 -43232.2 86464.4 2458
Random effects:
Conditional model:
Groups Name Variance Std.Dev.
PairID (Intercept) 0.3382 0.5816
Number of obs: 2462, groups: PairID, 183
Overdispersion parameter for nbinom2 family (): 1.02
Conditional model:
Estimate Std. Error z value Pr(>|z|)
(Intercept) 1.983829 0.046852 42.34 <2e-16 ***
scale(Group1) 0.007061 0.034005 0.21 0.836
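To see how far apart the two fits are, the estimates can be pulled out side by side (a quick sketch, assuming the mNB and t2NB objects from above):

cbind(lme4 = fixef(mNB), glmmTMB = fixef(t2NB)$cond)  # fixed effects on the log scale
VarCorr(mNB)        # PairID variance from gler.nb replaced below; from glmer.nb it is 0 here
VarCorr(t2NB)$cond  # PairID variance from glmmTMB (about 0.34 here)
sigma(t2NB)         # glmmTMB's nbinom2 dispersion parameter (about 1.02 here)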
I have reason to believe the weights argument causes the singularity, as discussed in this post. But I want to know why one model ends up with a singular fit while the other does not. Can I trust the results from glmmTMB?
EDIT
Note: running both models without the weights gives nearly identical results, while running the lme4 model with weights gives zero variance for the random effect.
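For completeness, this is the comparison described in the EDIT (a minimal sketch; the object names mNB.nw and t2NB.nw are just illustrative):

mNB.nw  <- glmer.nb(size ~ scale(Group1) + (1 | PairID), data = imp.NB)
t2NB.nw <- glmmTMB(size ~ scale(Group1) + (1 | PairID),
                   family = nbinom2, ziformula = ~0, data = imp.NB)

cbind(lme4 = fixef(mNB.nw), glmmTMB = fixef(t2NB.nw)$cond)  # nearly identical without weights
VarCorr(mNB.nw)  # the PairID variance is no longer at the zero boundary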