Machine Learning Model Performance: Boosting, Evaluation, and Validation
Posted by Anonymous and classified in Mathematics
AdaBoost: Adaptive Boosting Algorithm Explained
AdaBoost (Adaptive Boosting) is a classic and widely used boosting algorithm that focuses on correcting the errors of preceding weak learners (typically decision trees). It works by iteratively adjusting the weights of the training data points.
How AdaBoost Works
- Initial Weights: AdaBoost starts by assigning equal weights to all the training data points.
- Train a Weak Learner: A "weak" learner (a model that performs slightly better than random chance, like a decision stump) is trained on the dataset using the current weights.
- Calculate Error and Performance: The error rate of the weak learner is calculated as the weighted fraction of instances it misclassified. A measure of the weak learner's performance (often called alpha, or its "amount of say") is then computed from this error rate: the lower the error, the larger the alpha, and the more influence that learner has in the final ensemble.
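The steps above can be sketched in a single boosting round. This is a minimal illustration, not a production implementation: it assumes scikit-learn's `DecisionTreeClassifier` with `max_depth=1` as the decision stump, labels encoded as -1/+1, and the helper name `adaboost_round` is purely illustrative.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_round(X, y, weights, stump):
    """One AdaBoost round: fit a weak learner on weighted data,
    compute its performance (alpha), and reweight the samples.

    y is expected to hold labels in {-1, +1}."""
    stump.fit(X, y, sample_weight=weights)
    pred = stump.predict(X)
    miss = pred != y

    # Weighted error rate of the weak learner
    err = np.sum(weights[miss]) / np.sum(weights)

    # Amount of say (alpha); small epsilon guards against err == 0
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))

    # Increase weights on misclassified points, decrease on correct ones
    weights = weights * np.exp(alpha * np.where(miss, 1.0, -1.0))
    return weights / weights.sum(), alpha

# Toy usage: four points, equal initial weights
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1, -1, 1, 1])
w = np.full(len(y), 1.0 / len(y))
w, alpha = adaboost_round(X, y, w, DecisionTreeClassifier(max_depth=1))
```

Repeating this loop and combining the stumps' predictions weighted by their alphas yields the final AdaBoost classifier.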