In this post, let us explore:
- Ensemble Models
Ensembling combines the predictions of several different models to get better performance than any single model. But how do we combine different predictions from different models into one final prediction? There are a few standard ways of doing it.
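As a minimal sketch of the two most common combination rules, the snippet below (using made-up predictions from three hypothetical models) averages predictions for a regression task and takes a majority vote (the mode) for a classification task:

```python
import numpy as np

# Hypothetical predictions from three regression models on four samples
reg_preds = np.array([
    [2.0, 3.1, 5.0, 0.9],
    [2.2, 2.9, 4.8, 1.1],
    [1.8, 3.0, 5.2, 1.0],
])
avg_pred = reg_preds.mean(axis=0)  # regression: average across models
# avg_pred -> [2.0, 3.0, 5.0, 1.0]

# Hypothetical class labels from three classifiers on four samples
clf_preds = np.array([
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [1, 1, 1, 0],
])
# classification: majority vote (mode down each column)
vote_pred = np.apply_along_axis(lambda col: np.bincount(col).argmax(),
                                0, clf_preds)
# vote_pred -> [0, 1, 1, 0]
```

scikit-learn wraps these rules for you (e.g. `VotingClassifier`, `VotingRegressor`), but the arithmetic is just this.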
The major types of ensembles are bagging, boosting, and stacking. To understand bagging and boosting very clearly, I recommend watching these two YouTube videos: bagging and boosting.
Let us understand these three types of ensembles, one by one.
- Bootstrap aggregating (Bagging)
The bagging method is sometimes also called the averaging method. In bagging, we build the models independently of one another (parallel estimation), each on a bootstrap sample of the training data. We then average the predictions for regression problems and take the mode (majority vote) for classification tasks.
The main intention here is to reduce over-fitting, i.e. to reduce variance. The base estimators are therefore strong, low-bias models such as fully grown decision trees.
Examples: BaggingClassifier, BaggingRegressor and Random Forests (RandomForestClassifier and RandomForestRegressor) in scikit-learn.
- Stacking (Stacked generalization)
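In stacking, the base models' predictions are not simply averaged or voted on; instead, a second-level meta-model is trained to combine them. A minimal sketch with scikit-learn's `StackingClassifier` on a synthetic dataset (the choice of base models and meta-model here is illustrative, not prescriptive):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification data for illustration
X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("svm", SVC(random_state=0)),
    ],
    # The meta-model learns how to weight the base models' predictions,
    # which are produced via internal cross-validation to avoid leakage
    final_estimator=LogisticRegression(),
    cv=5,
)
stack.fit(X_tr, y_tr)
acc = stack.score(X_te, y_te)
```

`StackingRegressor` is the regression counterpart.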
In this post, we have explored the major types of ensembles. If you have any questions or suggestions, please do share them.