
Feature selection: permutation importance

Reducing the number of features is helpful for several reasons, including:

  1. better understanding of the core regressors
  2. less computation when training the model
  3. supporting feature engineering
  4. ...

An easy way to check the importance of a feature is permutation importance: shuffle one feature at a time and measure how much a chosen accuracy metric degrades relative to the unshuffled baseline. The bigger the drop, the more the model relies on that feature. A minimal sketch follows.
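Here is an illustrative sketch of the idea (this is not the attached permutation_importance.py; the diabetes dataset, the random forest, and R² as the metric are just stand-ins):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Fit any model on a train/validation split.
X, y = load_diabetes(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# Baseline score on untouched validation data.
baseline = r2_score(y_val, model.predict(X_val))

rng = np.random.default_rng(0)
importances = []
for j in range(X_val.shape[1]):
    X_perm = X_val.copy()
    rng.shuffle(X_perm[:, j])           # destroy feature j's relationship to y
    permuted = r2_score(y_val, model.predict(X_perm))
    importances.append(baseline - permuted)  # drop in score = importance

# Most important features first.
for j in np.argsort(importances)[::-1]:
    print(f"feature {j}: score drop {importances[j]:+.4f}")
```

scikit-learn ships the same idea as `sklearn.inspection.permutation_importance`, which additionally repeats the shuffling several times per feature to average out the noise of a single permutation.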

permutation_importance.py