In this post, let us explore:
- What is feature selection?
- Why do we need to perform feature selection?
- Methods
What is Feature Selection?
Feature selection means selecting and retaining only the most important features for the model. Feature selection is different from feature extraction: in feature selection we keep a subset of the existing features, whereas in feature extraction we create new features from the existing ones.
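To make the distinction concrete, here is a minimal pure-Python sketch; the toy dataset and its column meanings are made up for illustration:

```python
# Hypothetical toy dataset: rows of [age, height_cm, weight_kg, shoe_size]
rows = [[25, 180, 80, 43], [32, 165, 60, 38], [41, 172, 75, 41]]

# Feature selection: keep a SUBSET of the existing columns (here: age, weight_kg)
selected = [[r[0], r[2]] for r in rows]

# Feature extraction: derive a NEW feature from existing ones (here: BMI)
extracted = [[r[2] / (r[1] / 100) ** 2] for r in rows]
```

Selection never changes the values of the features it keeps; extraction produces values that did not exist as a column before.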
Why is Feature Selection Important?
- It simplifies the model: less data to store, easier interpretation (Occam’s razor) and better visualization
- Reduces training time
- Reduces the risk of over-fitting
- Can improve the accuracy of the model
- Mitigates the curse of dimensionality
Methods
Feature selection methods can be grouped into three categories: filter method, wrapper method and embedded method.

(Figure: Three methods of feature selection)
- Filter method
- Wrapper method
- Embedded method
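As a concrete illustration, a simple filter method can be sketched in pure Python: score each feature independently of any model (here by its variance) and drop the ones below a threshold. The toy columns and the threshold value are assumptions for illustration:

```python
def variance(xs):
    """Population variance of a list of numbers."""
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

# Toy feature columns (made up for illustration)
features = {
    "f1": [1, 1, 1, 1],   # constant -> zero variance -> uninformative
    "f2": [1, 2, 3, 4],
    "f3": [5, 5, 5, 6],
}

threshold = 0.1
kept = [name for name, col in features.items() if variance(col) > threshold]
# kept == ["f2", "f3"]
```

Because each feature is scored on its own, this runs in a single pass over the data, which is why filter methods scale well to many features.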

(Figure: Three feature selection methods in simple words)
The following graphic shows the popular examples for each of these three feature selection methods.

(Figure: Examples for the three methods of feature selection)
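One popular wrapper example, greedy forward selection, can be sketched in pure Python. The scoring function below stands in for retraining and evaluating a real model on each candidate subset, and the `useful` weights are made up for illustration:

```python
def forward_selection(candidates, score, max_features):
    """Greedy wrapper: repeatedly add the feature that most improves the score."""
    selected = []
    best = score(selected)
    while len(selected) < max_features:
        # Evaluate the "model" on every subset that adds one more feature
        gains = [(score(selected + [f]), f) for f in candidates if f not in selected]
        top_score, top_feature = max(gains)
        if top_score <= best:  # no candidate improves the score -> stop
            break
        selected.append(top_feature)
        best = top_score
    return selected

# Hypothetical standalone "usefulness" of each feature, standing in for model accuracy
useful = {"a": 0.9, "b": 0.5, "c": 0.0}
score = lambda subset: sum(useful[f] for f in subset)

chosen = forward_selection(list(useful), score, 2)
# chosen == ["a", "b"]
```

Each step requires evaluating the model on many candidate subsets, which is why wrapper methods tend to perform well but cost far more compute than filters.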
The following table compares these three methods of feature selection.

(Table: Comparison of the three methods of feature selection)
Summary
Filter methods are faster and useful when there are many features. Wrapper methods usually give better predictive performance, while embedded methods lie between the other two in both speed and performance.