machine-learning, classification, decision-tree, adaboost

What is an example of using the AdaBoost (Adaptive Boosting) approach with Decision Trees?


Is there any good tutorial that explains how the samples are weighted during successive iterations of constructing the decision trees for a sample training set? I specifically want to know how the weights are assigned after the first decision tree is constructed.

The decision tree is built using Information Gain as the splitting criterion, and I am wondering how this is affected when the samples misclassified in previous iterations are given higher weights.

Any good tutorial / example is highly appreciated.


Solution

  • A Short Introduction to Boosting by Freund and Schapire supplies an example of the AdaBoost algorithm using Quinlan's C4.5 decision tree model.
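
    To make the weight-update step concrete, here is a minimal sketch of the AdaBoost.M1 reweighting rule (not taken from the paper), using scikit-learn depth-1 decision trees as the weak learners in place of C4.5. The synthetic dataset, the number of rounds, and variable names are illustrative assumptions; the point is the line where misclassified samples have their weights increased before the next tree is fit.

        # Minimal AdaBoost sketch: scikit-learn decision stumps stand in for C4.5,
        # and the dataset/round count are illustrative assumptions.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.tree import DecisionTreeClassifier

        X, y = make_classification(n_samples=200, random_state=0)
        y = np.where(y == 0, -1, 1)              # AdaBoost uses labels in {-1, +1}

        n_samples = X.shape[0]
        w = np.full(n_samples, 1.0 / n_samples)  # start with uniform sample weights
        n_rounds = 5
        stumps, alphas = [], []

        for t in range(n_rounds):
            # Fit a depth-1 tree; the impurity/Information Gain computation is
            # weighted, so heavily weighted samples influence the split more.
            stump = DecisionTreeClassifier(max_depth=1, random_state=0)
            stump.fit(X, y, sample_weight=w)
            pred = stump.predict(X)

            # Weighted training error of this round's weak learner (w sums to 1).
            err = np.sum(w[pred != y])
            alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-10))

            # Weight update: misclassified samples (y * pred == -1) are multiplied
            # by exp(+alpha), correctly classified ones by exp(-alpha); then the
            # weights are renormalized into a distribution for the next iteration.
            w *= np.exp(-alpha * y * pred)
            w /= w.sum()

            stumps.append(stump)
            alphas.append(alpha)

        # Final prediction: sign of the alpha-weighted vote of all stumps.
        def predict(X_new):
            scores = sum(a * s.predict(X_new) for a, s in zip(alphas, stumps))
            return np.sign(scores)

        print("training accuracy:", np.mean(predict(X) == y))

    After the first tree is constructed, that `w *= np.exp(-alpha * y * pred)` line is what answers the question: samples the first tree got wrong carry larger weights, so the second tree's (weighted) Information Gain is pulled toward splits that separate those previously misclassified points.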