
Feature Extractor Refactor #451

Merged
16 commits merged into openvinotoolkit:main on Aug 11, 2022

Conversation

ashishbdatta
Contributor

Description

  • This PR changes the default behavior of the FeatureExtractor class to leverage models from timm rather than torchvision. By using timm we can directly access the features we care about rather than relying on forward_hooks.
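To illustrate the design difference described above, here is a conceptual mock of the two extraction styles. It uses plain Python callables as stand-ins for network stages (no torch or timm required), and every name in it is made up for illustration:

```python
# A toy "backbone": an ordered list of named stages.
STAGES = [("layer1", lambda x: x + 1),
          ("layer2", lambda x: x * 2),
          ("layer3", lambda x: x - 3)]

def run_backbone(x, hooks=None):
    """torchvision-style: the model only returns its final output, so
    intermediate features must be captured by hooks fired per stage."""
    for name, stage in STAGES:
        x = stage(x)
        if hooks and name in hooks:
            hooks[name](x)  # the registered "forward hook" fires here
    return x

def run_features_only(x, wanted):
    """timm-style features_only=True: the model returns the selected
    intermediate features directly, with no hook bookkeeping."""
    feats = []
    for name, stage in STAGES:
        x = stage(x)
        if name in wanted:
            feats.append(x)
    return feats

# Hook-based capture needs mutable external state.
captured = {}
run_backbone(10, hooks={"layer1": lambda t: captured.setdefault("layer1", t),
                        "layer3": lambda t: captured.setdefault("layer3", t)})

# Direct capture returns the same features as a plain list.
direct = run_features_only(10, ["layer1", "layer3"])
```

Both paths yield the same intermediate features; the second makes feature extraction a first-class output of the model instead of a side effect, which is the simplification this PR is after.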

Changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist

  • My code follows the pre-commit style and check guidelines of this project.
  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing tests pass locally with my changes

@ashishbdatta
Contributor Author

ashishbdatta commented Jul 24, 2022

I made an assumption here that the layers will follow the format of layer<num_of_layer>, like here, since so many configs use this convention.

But I can see the DFKDE model uses the key avgpool. What is the preference here: support any name, or restrict the config to the layer<num_of_layer> convention?

@ashishbdatta
Contributor Author

Please let me know if this looks good @samet-akcay

@samet-akcay
Contributor

> Please let me know if this looks good @samet-akcay

@ashishbdatta, there will probably be merge conflicts due to #453. I'll review it anyway, and let you know

@ashwinvaidya17
Collaborator

> I made an assumption here that the layers will follow the format of layer<num_of_layer>, like here, since so many configs use this convention.
>
> But I can see the DFKDE model uses the key avgpool. What is the preference here: support any name, or restrict the config to the layer<num_of_layer> convention?

I'd be in favour of supporting any layer, but it might be tricky with this new design, as timm requires layer indices. Because the model isn't initialized, it is hard to know which layer id corresponds to the name. Here is an idea: instead of splitting based on layer, maybe we can update the config to reflect

 layers:
    - block 1: 1
    - block 2: 2
    - block 3: 3
 layers:
    - avgpool: 4

Since we only need the id, the layer key here might be unnecessary, but it would make the config more readable for users. As this will be a breaking change, we can club it with the CLI upgrade and make it part of the 0.4.x release.
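For concreteness, consuming the proposed config could look something like the sketch below. The `proposed_config` structure mirrors the YAML above, and the helper name `layer_indices` is hypothetical, not part of the PR:

```python
# Config as parsed from the proposed YAML: a list of single-key
# {name: id} mappings (names like "block 1" are illustrative).
proposed_config = {"layers": [{"block 1": 1}, {"block 2": 2}, {"block 3": 3}]}

def layer_indices(config):
    """Flatten the {name: id} entries into the list of integer indices
    that would be handed to timm for feature selection."""
    return [idx for entry in config["layers"] for idx in entry.values()]
```

As Ashwin notes, only the ids are strictly needed; keeping the names purely improves readability of the config.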

Contributor

@djdameln djdameln left a comment


I agree with Ashwin that it would be good to support any layer. Maybe we could write a function that maps the layer name to out_index used by timm.create_model.

@ashishbdatta
Contributor Author

We could do something like

def _map_layer_to_idx(self) -> List[int]:
    idx = []
    # Create a temporary model to search its child nodes
    features = timm.create_model(
        self.backbone,
        pretrained=self.pretrained,
        features_only=self.features_only,
    )
    for i in self.layers:
        # Search the child nodes and record the index each layer is found at
        idx.append(list(dict(features.named_children()).keys()).index(i))
    return idx

Only problem is that this would return indices of [4, 5, 6], whereas we need indices of [1, 2, 3], because timm for some reason ignores non-nn.Sequential blocks when assigning feature indices.

For example, in a ResNet-18 model it ignores conv1, bn1 and maxpool1 in its indexing.
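One way to reconcile the two numberings is to subtract a fixed stem offset from the raw child index. The sketch below hard-codes a child list standing in for `features.named_children()` on a ResNet-18 (the exact stem names and the offset of 3 are assumptions chosen to match the [4, 5, 6] → [1, 2, 3] observation above, not verified against timm here):

```python
# Stand-in for list(features.named_children()) on a ResNet-18-like model.
CHILD_NAMES = ["conv1", "bn1", "act1", "maxpool",
               "layer1", "layer2", "layer3", "layer4"]

# Number of stem modules that timm skips when numbering feature blocks
# (assumed from the offset between named_children and timm's indices).
STEM_OFFSET = 3

def map_layer_to_idx(layers):
    """Map layer names to timm-style feature indices by correcting the
    raw named_children position with the stem offset."""
    return [CHILD_NAMES.index(name) - STEM_OFFSET for name in layers]
```

With this correction, `["layer1", "layer2", "layer3"]` (raw positions [4, 5, 6]) maps to [1, 2, 3] as required.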

Collaborator

@ashwinvaidya17 ashwinvaidya17 left a comment


The tests should pass with these changes.

Review comments (outdated, resolved) on:
  • anomalib/models/dfkde/config.yaml
  • anomalib/models/dfkde/torch_model.py
  • configs/model/dfkde.yaml
Collaborator

@ashwinvaidya17 ashwinvaidya17 left a comment


Thanks for migrating the feature extractor.

@ashishbdatta changed the title from "WIP: Feature Extractor Refactor" to "Feature Extractor Refactor" on Aug 10, 2022
@samet-akcay
Contributor

Thanks @ashishbdatta. Looking good!

@samet-akcay samet-akcay merged commit 27a31cd into openvinotoolkit:main Aug 11, 2022
4 participants