Perpetual: a hyperparameter-free gradient boosting machine

PerpetualBooster is a gradient boosting machine (GBM) with no hyperparameters to tune, so unlike other GBM algorithms you can use it without hyperparameter optimization packages. With plain GBM algorithms, hyperparameter optimization typically takes around 100 trials; PerpetualBooster reaches the same accuracy in a single run, which amounts to roughly a 100x speed-up at the same accuracy. Show your love and star the repo.
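To make the "no tuning loop" point concrete, here is a minimal sketch of what usage would look like, assuming a scikit-learn-style interface. The `PerpetualBooster` class name comes from this post; the `perpetual` import path and the `objective` argument and fit/predict signatures are assumptions, so check the repo's README for the actual API.

```python
# Sketch: a single fit call replaces the usual ~100-trial tuning search.
# Import path and constructor arguments are assumptions, not confirmed API.
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split

from perpetual import PerpetualBooster  # assumed package/module name

X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# No learning rate, tree depth, or number of trees to choose:
# the algorithm is run once, with no outer hyperparameter search.
model = PerpetualBooster(objective="SquaredLoss")  # assumed regression objective
model.fit(X_train, y_train)
preds = model.predict(X_test)
```

Contrast this with a typical GBM workflow, where the `fit` call above would sit inside an Optuna or grid-search loop over learning rate, depth, and tree count.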
