
I am trying to tune my model (an ensemble of xgboost, lightgbm, and catboost) with hyperopt, but I don't know how to find the optimal `n_estimators` value. When I train the model with the best parameters returned by `fmin`, I apparently still have to find, by trial and error, the `n_estimators` value that reproduces the smallest loss.

Is there any way to know at which iteration I get that lowest loss?

  • 0. Welcome to CV.SE. 1. To state the obvious: it is extremely unlikely that `xgboost`, `lightgbm`, and `catboost` share the same optimal number of estimators. 2. I would suggest circumventing this by using early stopping. – usεr11852 Jan 04 '22 at 01:07
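The early-stopping idea in the comment can be sketched without any boosting library: record the validation loss after every round, remember the round with the lowest loss, and stop once there has been no improvement for a fixed number of rounds (the "patience"). This is a hypothetical, library-free illustration; in xgboost the equivalent is passing `early_stopping_rounds` to `xgb.train` and reading the chosen round from `booster.best_iteration` afterwards, and lightgbm and catboost expose analogous options.

```python
def best_round(val_losses, patience=10):
    """Return (best_iteration, best_loss) under simple early stopping.

    val_losses: per-round validation losses, in training order.
    patience:   stop after this many rounds without improvement.
    """
    best_iter, best_loss = 0, float("inf")
    for i, loss in enumerate(val_losses):
        if loss < best_loss:
            best_iter, best_loss = i, loss          # new best round
        elif i - best_iter >= patience:
            break                                   # no improvement: stop
    return best_iter, best_loss

# Hypothetical loss curve: improves, then rises again as the model overfits.
losses = [0.9, 0.7, 0.55, 0.5, 0.48, 0.49, 0.51, 0.53, 0.56, 0.6]
print(best_round(losses, patience=3))  # -> (4, 0.48)
```

Inside a hyperopt objective you would train each booster with early stopping on a validation set, take its `best_iteration` as the effective `n_estimators`, and return the corresponding best loss to `fmin`, which removes `n_estimators` from the search space entirely.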
