What is ensemble learning?
Ensemble learning is a machine-learning approach where we train numerous simple models and aggregate their predictions.
These simple models are called weak learners.
What is an ensemble model?
As the definition above implies, an ensemble model is a model composed of several small models that work together during the prediction phase.
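To make this concrete, here is a minimal sketch of the idea on an assumed toy dataset: a few simple models are trained independently and their predictions are combined by majority vote. The particular models and dataset are illustrative choices, not part of any specific ensemble algorithm.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Toy dataset (an assumption for illustration only).
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three simple (weak) learners, each trained independently.
weak_learners = [
    LogisticRegression(max_iter=1000),
    DecisionTreeClassifier(max_depth=2),
    KNeighborsClassifier(n_neighbors=5),
]
for model in weak_learners:
    model.fit(X_train, y_train)

# The ensemble model aggregates the weak learners' predictions by majority vote.
votes = np.stack([model.predict(X_test) for model in weak_learners])
ensemble_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("Ensemble accuracy:", (ensemble_pred == y_test).mean())
```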
The types of ensemble learning
ML experts categorize ensemble models into two big groups:
- Parallel models. The weak learners are trained in parallel, independently of one another.
- Sequential models. The weak learners are trained sequentially.

Bagging
Bagging is a parallel ensemble learning technique.
In bagging (short for bootstrap aggregating), each weak learner is trained on a bootstrap sample of the training dataset, i.e., observations drawn at random with replacement. At prediction time, the final prediction is the average of the weak learners' individual guesses (or a majority vote for classification).
This technique is useful for reducing variance.
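As a sketch of how this looks in practice, scikit-learn provides a BaggingClassifier; the toy dataset, number of estimators, and other settings below are assumptions chosen only for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

# Toy dataset (an assumption for illustration only).
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each base learner (a decision tree by default) is fit on a bootstrap sample,
# i.e. observations drawn at random with replacement from the training set.
bagging = BaggingClassifier(
    n_estimators=50,   # number of bootstrap samples / weak learners
    bootstrap=True,    # sample observations with replacement
    random_state=0,
)
bagging.fit(X_train, y_train)

# The final prediction aggregates the individual learners' predictions.
print("Bagging accuracy:", bagging.score(X_test, y_test))
```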
Boosting
Boosting is a sequential ensemble learning technique in which each new model is trained to correct the errors of the previous weak learners.
There are two common types of boosting:
- In AdaBoost (adaptive boosting), each new weak learner is trained with more weight placed on the observations the previous learners got wrong. At prediction time, each model's contribution is scaled according to its accuracy.
- In gradient boosting, each new model is trained to predict the previous models' remaining error, gradually improving accuracy. The final prediction is the sum of all the models' contributions.
This technique is useful for reducing bias.
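The sketch below shows both flavors using scikit-learn's AdaBoostClassifier and GradientBoostingClassifier; again, the toy dataset and hyperparameters are illustrative assumptions rather than recommended settings.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Toy dataset (an assumption for illustration only).
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AdaBoost: each new weak learner puts more weight on the observations the
# previous learners misclassified; each learner's vote is scaled by its accuracy.
ada = AdaBoostClassifier(n_estimators=100, random_state=0)
ada.fit(X_train, y_train)

# Gradient boosting: each new learner fits the remaining error of the ensemble
# so far; the final prediction is the sum of all the learners' contributions.
gbm = GradientBoostingClassifier(n_estimators=100, random_state=0)
gbm.fit(X_train, y_train)

print("AdaBoost accuracy:", ada.score(X_test, y_test))
print("Gradient boosting accuracy:", gbm.score(X_test, y_test))
```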
Benefits of ensemble models
- Ensemble models are robust and reliable because the errors of a single model can be corrected or alleviated by the others.
- This robustness allows ensemble models to be more accurate and make better predictions than any single base model.
Drawbacks of ensemble models
- Ensemble models are computationally expensive and time-consuming because the algorithm builds and stores multiple base models.
- Since ensemble models are more complex than base models, their decision-making logic is less interpretable and explainable.