
In XGBoost, is there a way to programmatically get the training and evaluation error per iteration of training?

Will train until eval error hasn't decreased in 25 rounds.
[0] train-rmspe:0.996873    eval-rmspe:0.996881
[1] train-rmspe:0.981762    eval-rmspe:0.981795
[2] train-rmspe:0.939323    eval-rmspe:0.939396
[3] train-rmspe:0.859087    eval-rmspe:0.859009
[4] train-rmspe:0.747877    eval-rmspe:0.747098
[5] train-rmspe:0.622528    eval-rmspe:0.619955

I want to get these numbers and graph them. They really help visualize parameter tuning.

1 Answer


Yes, you can use xgb.cv like so:

import xgboost as xgb

# wrap the training data in a DMatrix
d = xgb.DMatrix(X_train, label=y_train.label)
p = {"learning_rate": 0.05}

# xgb.cv runs cross-validation and reports the metric for every boosting round
res = xgb.cv(params=p, dtrain=d, num_boost_round=300, verbose_eval=50, metrics="rmse")

You'll get the metrics you're after for every round, together with their cross-validated standard deviations.
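Since the goal is to graph these numbers, here is a minimal plotting sketch. It assumes pandas and matplotlib are available, in which case xgb.cv returns a DataFrame with one row per round and columns typically named like "train-rmse-mean", "train-rmse-std", "test-rmse-mean", "test-rmse-std":

import matplotlib.pyplot as plt

# res is the DataFrame returned by xgb.cv above
plt.plot(res["train-rmse-mean"], label="train")
plt.plot(res["test-rmse-mean"], label="eval (cv)")
plt.xlabel("boosting round")
plt.ylabel("rmse")
plt.legend()
plt.show()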

ihadanny