
Feature Importance via attention weights

@jrzaurin released this 21 Jul 07:37 · 4af577a
  • Added functionality to access feature importances via attention weights for all deep learning models for tabular data, except the TabPerceiver. Feature importances are available through the trainer's `feature_importance` attribute (computed during training on a sample of observations) and, at predict time, via the `explain` method (see the first sketch below).
  • Fixed the restore-weights capability in all forms of training. This capability lives in two callbacks, EarlyStopping and ModelCheckpoint; prior to this release a bug prevented the best weights from being restored (see the second sketch below).
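
A minimal sketch of the new feature importance workflow, assuming a standard pytorch-widedeep pipeline with an attention-based tabular model (TabTransformer is used here purely for illustration). The synthetic data, column names and parameter choices such as `with_attention` are assumptions for the example, not part of these release notes.

```python
import numpy as np
import pandas as pd

from pytorch_widedeep import Trainer
from pytorch_widedeep.models import TabTransformer, WideDeep
from pytorch_widedeep.preprocessing import TabPreprocessor

# Toy, purely illustrative dataset
df = pd.DataFrame(
    {
        "color": np.random.choice(["r", "g", "b"], 1000),
        "size": np.random.choice(["s", "m", "l"], 1000),
        "target": np.random.randint(0, 2, 1000),
    }
)

tab_preprocessor = TabPreprocessor(
    cat_embed_cols=["color", "size"],
    with_attention=True,  # attention-based models need attention-ready inputs
)
X_tab = tab_preprocessor.fit_transform(df)

tab_transformer = TabTransformer(
    column_idx=tab_preprocessor.column_idx,
    cat_embed_input=tab_preprocessor.cat_embed_input,
)
model = WideDeep(deeptabular=tab_transformer)

trainer = Trainer(model, objective="binary")
trainer.fit(X_tab=X_tab, target=df["target"].values, n_epochs=2, batch_size=64)

# Global feature importances, computed during training on a sample of observations
print(trainer.feature_importance)

# Per-observation importances at predict time
print(trainer.explain(X_tab[:10]))
```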
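And a second sketch showing the two callbacks whose restore-weights behaviour is fixed in this release, reusing `model`, `df` and `X_tab` from the sketch above. Parameter names such as `restore_best_weights`, `save_best_only` and the checkpoint path are assumptions based on the usual callback API, not taken from these notes.

```python
from pytorch_widedeep import Trainer
from pytorch_widedeep.callbacks import EarlyStopping, ModelCheckpoint

callbacks = [
    # Restores the weights of the best epoch when training stops early
    EarlyStopping(monitor="val_loss", patience=5, restore_best_weights=True),
    # Keeps the best checkpoint on disk
    ModelCheckpoint(filepath="checkpoints/wd", monitor="val_loss", save_best_only=True),
]

trainer = Trainer(model, objective="binary", callbacks=callbacks)
trainer.fit(
    X_tab=X_tab,
    target=df["target"].values,
    val_split=0.2,
    n_epochs=50,
    batch_size=64,
)
# With this release, the best weights tracked by the callbacks are actually
# restored into the model at the end of training.
```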