Ensemble Learning Techniques
Stacking
Stacking trains several different base models, then trains a meta-model on their predictions (typically out-of-fold predictions, to avoid leaking training labels) to produce the final prediction.
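A minimal sketch using scikit-learn's StackingClassifier; the toy dataset and the particular base models are illustrative assumptions, not part of the card:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)  # toy data (assumption)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Base models produce cross-validated predictions; a meta-model learns from them.
stack = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier()), ("svm", SVC())],
    final_estimator=LogisticRegression(),  # the meta-model
    cv=5,  # out-of-fold predictions guard against label leakage
)
stack.fit(X_tr, y_tr)
print("stacking accuracy:", stack.score(X_te, y_te))
```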
Voting
Voting combines predictions from multiple models: by majority vote (hard voting) or averaged class probabilities (soft voting) for classification, and by averaging for regression.
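A minimal sketch with scikit-learn's VotingClassifier; the dataset and the three model choices are assumptions for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)  # toy data (assumption)

# 'hard' voting takes the majority class; 'soft' would average probabilities.
vote = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier()),
        ("knn", KNeighborsClassifier()),
    ],
    voting="hard",
)
vote.fit(X, y)
print(vote.predict(X[:5]))
```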
Model Averaging
Model averaging improves predictive performance by averaging the predictions of multiple models, potentially reducing variance without increasing bias.
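A bare-bones sketch of averaging regressor outputs; the models and toy dataset are assumptions:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=300, noise=15.0, random_state=0)  # toy data (assumption)

models = [Ridge(), DecisionTreeRegressor(random_state=0), KNeighborsRegressor()]
for m in models:
    m.fit(X, y)

# Average the individual predictions: errors that are not perfectly
# correlated partially cancel, reducing the variance of the ensemble.
avg_pred = np.mean([m.predict(X) for m in models], axis=0)
```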
Gradient Boosting
Gradient Boosting builds an additive model in a forward stage-wise fashion and allows for the optimization of arbitrary differentiable loss functions.
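A minimal sketch with scikit-learn's GradientBoostingRegressor; the hyperparameter values are arbitrary assumptions:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=300, noise=10.0, random_state=0)  # toy data (assumption)

# Each stage fits a shallow tree to the negative gradient of the loss
# at the current predictions (a forward stage-wise additive model);
# the `loss` parameter selects which differentiable loss to optimize.
gbr = GradientBoostingRegressor(
    n_estimators=200,
    learning_rate=0.05,  # shrinks each stage's contribution
    max_depth=2,
)
gbr.fit(X, y)
```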
Homogeneous vs Heterogeneous Ensembles
Homogeneous ensembles use the same type of model multiple times, whereas heterogeneous ensembles combine different types of models to capture various hypotheses.
Boosting
Boosting combines weak learners sequentially with each model attempting to correct the errors of its predecessor, often resulting in a strong learner.
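A bare-bones illustration of the sequential idea, fitting each new weak learner to the current ensemble's residuals (squared-error boosting); the depth, learning rate, and round count are arbitrary assumptions:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, noise=10.0, random_state=0)  # toy data (assumption)

learning_rate = 0.1
ensemble_pred = np.zeros_like(y, dtype=float)
weak_learners = []
for _ in range(50):
    residual = y - ensemble_pred          # errors of the ensemble so far
    stump = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    ensemble_pred += learning_rate * stump.predict(X)  # correct those errors
    weak_learners.append(stump)
```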
Bagging (Bootstrap Aggregating)
Bagging reduces variance by training multiple models on different bootstrapped training sets and aggregating their outputs (averaging for regression, majority voting for classification).
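A minimal sketch with scikit-learn's BaggingClassifier; the dataset and parameter values are assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)  # toy data (assumption)

# Each tree sees a different bootstrap sample; predictions are aggregated.
bag = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=100,
    bootstrap=True,  # sample training instances with replacement
    random_state=0,
)
bag.fit(X, y)
```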
AdaBoost (Adaptive Boosting)
AdaBoost reweights the training instances after each round: misclassified (hard) instances receive larger weights, so subsequent models pay more attention to them, while easy instances are down-weighted.
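A minimal sketch with scikit-learn's AdaBoostClassifier; the parameter values are assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=500, random_state=0)  # toy data (assumption)

# The default weak learner is a decision stump; after each round,
# misclassified instances get higher weights so the next stump focuses on them.
ada = AdaBoostClassifier(n_estimators=100, learning_rate=0.5, random_state=0)
ada.fit(X, y)
```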
Random Forests
Random Forests are an ensemble of decision trees, generally trained with the bagging method plus a random subset of features considered at each split, that aim to reduce variance and prevent overfitting.
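A minimal sketch with scikit-learn's RandomForestClassifier; the parameter values are assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, random_state=0)  # toy data (assumption)

# Bagging over trees, plus a random feature subset at every split,
# which decorrelates the trees and further reduces variance.
rf = RandomForestClassifier(n_estimators=200, max_features="sqrt", random_state=0)
rf.fit(X, y)
```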
Feature Bagging
Feature Bagging (the random subspace method) trains each model on a different random subset of the features, which decorrelates the ensemble's members and can also yield more reliable estimates of feature importance.
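One way to sketch this in scikit-learn is a BaggingClassifier that varies features rather than samples; the 50% feature fraction and other values are assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)  # toy data

# Random subspace method: each model trains on a random 50% of the features.
subspace = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=50,
    max_features=0.5,   # feature subset drawn per estimator
    bootstrap=False,    # keep all samples; vary only the features
    random_state=0,
)
subspace.fit(X, y)
```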