Tags: python, scikit-learn, adaboost

AdaBoosting with several different base estimators at once


I know you can AdaBoost with multiple instances of a single model (e.g., 600 Decision Trees, Bayesian Ridges, or Linear Models). Is it possible to AdaBoost with a gauntlet of models at the same time, and how?

AdaBoost([DecisionTree, BayesianRidge, LinearRegressor, ...])

Each standalone model has its pros and cons, and I was wondering if it was possible to mash them all together under an umbrella.


Solution

  • Since AdaBoost is decoupled from the weak classifier's implementation: yes, you can. These architectures are called hybrid ensembles. However, it's not always needed, because AdaBoost only requires the weak classifiers to be slightly better than random guessing, so a homogeneous ensemble of classifiers is often enough. In the end you'll have to test whether it gives you a performance boost in your own scenario; a sketch of how such a hybrid loop could look is given below.
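
To make this concrete, here is a minimal sketch of what a hybrid boosting loop could look like. Note that scikit-learn's AdaBoostClassifier accepts only a single estimator type, so the sketch implements a SAMME-style loop by hand that cycles through a pool of heterogeneous base estimators. The names hybrid_adaboost and hybrid_predict are illustrative, not part of any library, and the pool members are assumed to accept sample_weight in fit.

    # Hypothetical sketch: SAMME-style AdaBoost over a pool of different base
    # estimators (a "hybrid ensemble"). Not a library API.
    import numpy as np
    from sklearn.base import clone
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.linear_model import LogisticRegression

    def hybrid_adaboost(X, y, estimator_pool, n_rounds=50):
        """Fit a SAMME-style ensemble, drawing each round's weak learner
        from a pool of heterogeneous base estimators."""
        n_samples = X.shape[0]
        n_classes = len(np.unique(y))
        sample_weights = np.full(n_samples, 1.0 / n_samples)
        learners, alphas = [], []

        for t in range(n_rounds):
            # Cycle through the pool so every estimator type gets used.
            est = clone(estimator_pool[t % len(estimator_pool)])
            est.fit(X, y, sample_weight=sample_weights)
            pred = est.predict(X)

            # Weighted training error of this round's weak learner.
            err = np.sum(sample_weights * (pred != y)) / np.sum(sample_weights)
            if err >= 1.0 - 1.0 / n_classes:
                continue  # no better than random guessing -> skip this learner
            err = max(err, 1e-10)

            # SAMME estimator weight.
            alpha = np.log((1.0 - err) / err) + np.log(n_classes - 1.0)

            # Re-weight samples: misclassified points gain weight.
            sample_weights *= np.exp(alpha * (pred != y))
            sample_weights /= sample_weights.sum()

            learners.append(est)
            alphas.append(alpha)

        return learners, alphas

    def hybrid_predict(X, learners, alphas, classes):
        """Weighted majority vote over the hybrid ensemble."""
        votes = np.zeros((X.shape[0], len(classes)))
        for est, alpha in zip(learners, alphas):
            pred = est.predict(X)
            for k, c in enumerate(classes):
                votes[:, k] += alpha * (pred == c)
        return classes[np.argmax(votes, axis=1)]

    # Example usage with a heterogeneous pool (all of these accept sample_weight):
    # pool = [DecisionTreeClassifier(max_depth=1), GaussianNB(), LogisticRegression()]
    # learners, alphas = hybrid_adaboost(X_train, y_train, pool)
    # y_pred = hybrid_predict(X_test, learners, alphas, np.unique(y_train))

The only real constraint on the pool is that every estimator can be fit with per-sample weights (or that you resample according to the weights instead); beyond that, AdaBoost doesn't care what the weak learners are.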