Feature selection: permutation importance
Reducing the feature count is helpful for several reasons:
- better understanding of the core regressors
- less computation when training the model
- guidance for feature engineering
An easy way to check the importance of a feature is called permutation importance: shuffle one feature column at a time and measure how much an accuracy metric degrades relative to the unshuffled baseline, holding all other features fixed.
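The shuffle-and-score procedure above can be sketched with scikit-learn's `permutation_importance`; the dataset, model choice, and parameter values here are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic regression data: only 3 of the 6 features carry signal
X, y = make_regression(n_samples=500, n_features=6, n_informative=3,
                       noise=1.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# Shuffle each feature column in turn (n_repeats times) on held-out data
# and record the mean drop in the model's score (R^2 for a regressor)
result = permutation_importance(model, X_test, y_test,
                                n_repeats=10, random_state=0)

for i, imp in enumerate(result.importances_mean):
    print(f"feature {i}: mean score drop = {imp:.3f}")
```

Uninformative features should show a score drop near zero, while the informative ones stand out with a clearly positive drop; computing the drop on a held-out set avoids overstating the importance of features the model merely memorized.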