Assuming a classification problem, the AdaBoost algorithm can be described at a high level by its basic steps; for regression, the steps are similar:
1. Initialize the weights of all training set instances equally, so that they sum to 1.
2. Generate a new set by sampling with replacement from the training set, according to the weights.
3. Train a weak learner on the sampled set.
4. Calculate its (weighted) error rate on the original training set.
5. Add the weak learner to the ensemble and store its error rate.
6. Adjust the weights: increase the weights of misclassified instances and decrease those of correctly classified instances.
7. Repeat from Step 2 until the desired number of weak learners has been trained.
8. Combine the weak learners by voting; each learner's vote is weighted according to its error ...
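The steps above can be sketched in Python. This is a minimal illustrative sketch of a resampling-based binary AdaBoost with hand-rolled decision stumps as the weak learners; the `DecisionStump`, `adaboost`, and `predict` names are made up for illustration, and a production implementation would differ (e.g. scikit-learn's `AdaBoostClassifier` trains on weighted data rather than resampling).

```python
import numpy as np

class DecisionStump:
    """Illustrative weak learner: a threshold test on a single feature."""
    def fit(self, X, y):
        best_err = np.inf
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = np.where(X[:, j] <= t, sign, -sign)
                    err = np.mean(pred != y)
                    if err < best_err:
                        best_err, self.j, self.t, self.sign = err, j, t, sign
        return self

    def predict(self, X):
        return np.where(X[:, self.j] <= self.t, self.sign, -self.sign)

def adaboost(X, y, n_rounds=10, seed=0):
    """Resampling-based AdaBoost sketch for labels y in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1 / n)                    # Step 1: equal weights summing to 1
    ensemble = []
    for _ in range(n_rounds):
        idx = rng.choice(n, size=n, replace=True, p=w)   # Step 2: resample by weight
        stump = DecisionStump().fit(X[idx], y[idx])      # Step 3: train weak learner
        pred = stump.predict(X)
        err = np.sum(w[pred != y])           # Step 4: weighted error on original set
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)            # vote weight from error
        ensemble.append((alpha, stump))      # Step 5: store learner and its weight
        w *= np.exp(-alpha * y * pred)       # Step 6: up-weight mistakes,
        w /= w.sum()                         #         down-weight correct instances
    return ensemble

def predict(ensemble, X):
    # Final step: weighted vote of all weak learners
    return np.sign(sum(alpha * s.predict(X) for alpha, s in ensemble))
```

Note that the vote weight `alpha` grows as a learner's error shrinks, which is what makes the final vote error-weighted rather than a simple majority.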