AdaBoost

  • AdaBoost is similar to Random Forest, but instead of a full Decision Tree, it uses a Stump (a tree with a single split)
    • So it can be called a Forest of Stumps
  • In contrast to Random Forest, some Stumps get more voting power than others
  • In Random Forest, trees are built independently,
    • But in AdaBoost, each tree is built depending on the errors made by the previous one, as in Boosting
  • The Weighted Gini Index is sometimes used to evaluate candidate stumps, since every data point carries a weight (sketched below)
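
A minimal sketch of what a weighted Gini index could look like; the function names and the NumPy-array interface are illustrative assumptions, not from any particular library:

```python
import numpy as np

def weighted_gini(labels, weights):
    """Gini impurity of one node, using sample weights instead of raw counts."""
    total = weights.sum()
    if total == 0:
        return 0.0
    gini = 1.0
    for c in np.unique(labels):
        p = weights[labels == c].sum() / total  # weighted class proportion
        gini -= p ** 2
    return gini

def weighted_gini_of_split(labels, weights, mask):
    """Weight-averaged impurity of the two children made by a boolean split mask."""
    total = weights.sum()
    left_w, right_w = weights[mask], weights[~mask]
    return (left_w.sum() / total) * weighted_gini(labels[mask], left_w) \
         + (right_w.sum() / total) * weighted_gini(labels[~mask], right_w)
```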

Steps:

  1. Give the same weight (importance) to all data points: 1/N each for N samples
  2. Create one stump with the lowest total error (total error = sum of the weights of the misclassified points)
  3. Calculate amount_of_say for this stump: amount_of_say = 0.5 * ln((1 - total_error) / total_error)
  4. Update the weights of the data
    1. so that misclassified points get more weight: weight = weight * e^(amount_of_say)
    2. and correctly classified points get less weight: weight = weight * e^(-amount_of_say)
    3. then normalize the weights so they sum to 1
  5. Go back to Step 2 and repeat with the updated weights
  6. Stop when the predetermined number_of_estimators has been reached (see the sketch below)
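
A from-scratch sketch of the loop above, assuming binary labels in {-1, +1} and using scikit-learn's DecisionTreeClassifier with max_depth=1 as the stump; the name adaboost_fit and the clipping constant are illustrative choices:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, number_of_estimators=50):
    """Train a forest of stumps following the steps above; y must be in {-1, +1}."""
    n = len(y)
    weights = np.full(n, 1 / n)                # Step 1: same weight for every point
    stumps, says = [], []
    for _ in range(number_of_estimators):      # Step 6: fixed number of rounds
        stump = DecisionTreeClassifier(max_depth=1)   # a stump = one-split tree
        stump.fit(X, y, sample_weight=weights)        # Step 2: best stump under current weights
        misclassified = stump.predict(X) != y
        total_error = weights[misclassified].sum()
        total_error = np.clip(total_error, 1e-10, 1 - 1e-10)  # avoid log(0)
        amount_of_say = 0.5 * np.log((1 - total_error) / total_error)  # Step 3
        # Step 4: raise weights of misclassified points, lower the rest, renormalize
        weights *= np.exp(amount_of_say * np.where(misclassified, 1.0, -1.0))
        weights /= weights.sum()
        stumps.append(stump)
        says.append(amount_of_say)
    return stumps, says                        # Step 5 happens implicitly each iteration
```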

Evaluation:

  1. During evaluation, for classification,
    1. Get the predicted class from each estimator (stump)
    2. Sum the amount_of_say of the stumps voting for each class and take the class with the highest total (see the sketch below)
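
Continuing the sketch above: with labels in {-1, +1}, summing each stump's amount_of_say behind its predicted class and picking the heavier side reduces to taking the sign of the weighted sum (adaboost_predict is again an illustrative name):

```python
import numpy as np

def adaboost_predict(X, stumps, says):
    """Weighted vote: each stump votes with weight amount_of_say; take the sign."""
    scores = sum(say * stump.predict(X) for stump, say in zip(stumps, says))
    return np.sign(scores)  # the class with the larger total amount_of_say wins
```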

![[Pasted image 20231106231919.png]]