
In the scikit-learn implementation of AdaBoost you can choose a learning rate. The documentation for AdaBoostClassifier says:

"Learning rate shrinks the contribution of each classifier by learning_rate".

This learning rate is not mentioned in the usual AdaBoost tutorials and explanations (for example the official one here: http://rob.schapire.net/papers/explaining-adaboost.pdf).

I don't understand this concept of shrinking each tree's contribution. Is the learning rate a shrinkage factor, e.g. does a learning_rate of 2 mean that each tree's importance is divided by 2? That would be strange, since the AdaBoost model learns the importance of each estimator and of each sample by itself.
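To make the question concrete, here is a small experiment I tried (my own toy data via make_classification, and arbitrarily chosen learning_rate values, not anything from the docs) that fits AdaBoostClassifier with different learning rates and prints estimator_weights_, the per-tree weights the model assigns:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

# Toy data, only used to inspect the fitted estimator weights.
X, y = make_classification(n_samples=500, random_state=0)

for lr in (0.5, 1.0, 2.0):
    clf = AdaBoostClassifier(n_estimators=5, learning_rate=lr, random_state=0)
    clf.fit(X, y)
    # estimator_weights_ holds the weight (importance) AdaBoost assigns
    # to each fitted tree; I expected these to be rescaled by learning_rate.
    print(lr, np.round(clf.estimator_weights_, 3))
```

The weights do change with learning_rate, but I don't see how this fits with the fact that AdaBoost is supposed to determine those weights from the data.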

Thanks for your help!

http://scikit-learn.org/stable/modules/generated/sklearn.ensemble.AdaBoostClassifier.html

Nicolas
