I'm using R's kappa2 function (irr package) to obtain Cohen's kappa and the cohen.kappa function (psych package) for the corresponding confidence intervals. When interrater agreement is significant, I usually obtain both a p-value ≤ 0.05 and a 95% CI that excludes 0. Now, for the first time, the output is a significant p-value, but the 95% CI spans 0. Is there an explanation for this discrepancy other than an error in my code?
> x <- matrix(c(y,z), nrow=49, ncol=2, byrow=FALSE)
> kappa2(x, weight="unweighted", sort.levels=FALSE)
Cohen's Kappa for 2 Raters (Weights: unweighted)
Subjects = 49
Raters = 2
Kappa = 0.222
z = 2.47
p-value = 0.0133
> cohen.kappa(x)
Call: cohen.kappa1(x = x, w = w, n.obs = n.obs, alpha = alpha, levels = levels)
Cohen Kappa and Weighted Kappa correlation coefficients and confidence boundaries
lower estimate upper
unweighted kappa -0.14 0.22 0.59
weighted kappa -0.14 0.22 0.59
Number of subjects = 49
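For reference, here is a self-contained base-R sketch of what I believe kappa2 computes: the kappa point estimate plus a large-sample z test of H0: kappa = 0 using the variance under the null (Fleiss, Cohen & Everitt, 1969). The ratings y and z below are simulated, since my real data aren't shown; all names are illustrative.

```r
# Simulated binary ratings for two raters (my real y and z are not shown)
set.seed(1)
n <- 49
y <- rbinom(n, 1, 0.5)                              # rater 1
z <- ifelse(runif(n) < 0.6, y, rbinom(n, 1, 0.5))   # rater 2, partly agreeing

tab <- table(y, z) / n                  # joint proportions
po  <- sum(diag(tab))                   # observed agreement
pe  <- sum(rowSums(tab) * colSums(tab)) # chance-expected agreement
kappa <- (po - pe) / (1 - pe)           # Cohen's kappa

# Large-sample SE of kappa under H0 (Fleiss, Cohen & Everitt, 1969),
# which is what I understand kappa2's z statistic to be based on
pr  <- rowSums(tab)
pc  <- colSums(tab)
se0 <- sqrt(pe + pe^2 - sum(pr * pc * (pr + pc))) / ((1 - pe) * sqrt(n))
z   <- kappa / se0
p   <- 2 * pnorm(-abs(z))               # two-sided p-value
```

If this is right, the z test rests on a standard error computed under the null, while cohen.kappa builds its CI from the standard error around the estimate, so the two could in principle disagree near the boundary. Is that the explanation, or should I suspect my code?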