I am currently working with a slightly imbalanced dataset (9% positive outcome) and am using XGBoost to train a predictive model.
from xgboost import XGBClassifier
XGB = XGBClassifier(scale_pos_weight=10)
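(The 10 roughly matches the usual heuristic of setting scale_pos_weight to the negative-to-positive ratio, which for a 9% positive rate is about 91/9 ≈ 10; a quick illustrative check with a synthetic stand-in for my outcome:)

```python
# Quick check of the scale_pos_weight heuristic: (# negatives) / (# positives).
# The y below is just a synthetic stand-in for my 0/1 outcome.
import numpy as np

rng = np.random.default_rng(0)
y = rng.binomial(1, 0.09, size=100_000)
print((y == 0).sum() / (y == 1).sum())   # about 0.91 / 0.09 ≈ 10.1
```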
Before calibration, my sensitivity and specificity are both around 80%, but the calibration curve has a slope of 0.5.
After calibration, the calibration curve looks great (slope = 0.995), but sensitivity and specificity decreased dramatically. Is this a side effect of the calibration? Any thoughts on how to maintain my classification performance?
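For reference, here is a minimal, self-contained sketch of the kind of pipeline I mean. The synthetic data, sklearn's CalibratedClassifierCV with isotonic regression, and the fixed 0.5 threshold are all illustrative assumptions; my real code differs in the details.

```python
# Sketch: train with scale_pos_weight, calibrate, then compare
# sensitivity/specificity (at a fixed 0.5 threshold) and the calibration
# slope before and after calibration.
import numpy as np
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=20000, n_features=20,
                           weights=[0.91, 0.09], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0)

def sens_spec(y_true, y_prob, threshold=0.5):
    # Sensitivity = TP / (TP + FN), specificity = TN / (TN + FP) at a hard cutoff.
    tn, fp, fn, tp = confusion_matrix(
        y_true, (y_prob >= threshold).astype(int)).ravel()
    return tp / (tp + fn), tn / (tn + fp)

def calibration_slope(y_true, y_prob):
    # Cox-style calibration slope: logistic regression of the outcome on the
    # logit of the predicted probability; a slope of 1.0 is ideal.
    p = np.clip(y_prob, 1e-6, 1 - 1e-6)
    logit = np.log(p / (1 - p)).reshape(-1, 1)
    return LogisticRegression(C=1e6).fit(logit, y_true).coef_[0, 0]

# Uncalibrated model: scale_pos_weight=10 pushes predicted probabilities
# upward, which is what distorts the calibration slope in the first place.
xgb = XGBClassifier(scale_pos_weight=10)
xgb.fit(X_train, y_train)
p_raw = xgb.predict_proba(X_test)[:, 1]

# Calibrated model: probabilities are mapped back toward the true event rate,
# so at the same 0.5 cutoff a different set of cases is classified as positive.
cal = CalibratedClassifierCV(XGBClassifier(scale_pos_weight=10),
                             method="isotonic", cv=5)
cal.fit(X_train, y_train)
p_cal = cal.predict_proba(X_test)[:, 1]

print("raw:        sens/spec =", sens_spec(y_test, p_raw),
      "slope =", calibration_slope(y_test, p_raw))
print("calibrated: sens/spec =", sens_spec(y_test, p_cal),
      "slope =", calibration_slope(y_test, p_cal))
```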
Thanks!