Another variant of multiple decision trees is the popular technique of boosting, a family of algorithms that converts “weak learners” into “strong learners.” The underlying principle of boosting is to increase the weights of training observations that were misclassified in earlier rounds, so that each subsequent learner focuses more on the examples its predecessors got wrong.
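As a concrete illustration, the sketch below uses AdaBoost, one of the best-known boosting algorithms, via scikit-learn's `AdaBoostClassifier`; the synthetic dataset and parameter values are assumptions for demonstration only, not a prescribed setup.

```python
# A minimal AdaBoost sketch, assuming scikit-learn is available.
# The data here is synthetic and purely illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Generate a synthetic binary classification problem.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# By default, each weak learner is a depth-1 decision tree ("stump").
# Each boosting round up-weights the training samples that the previous
# round misclassified, so later stumps concentrate on the hard cases.
booster = AdaBoostClassifier(n_estimators=100, random_state=42)
booster.fit(X_train, y_train)

print("Test accuracy:", booster.score(X_test, y_test))
```

The final prediction is a weighted vote of all the weak learners, with more accurate rounds receiving larger say in the outcome.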