
✨ Add Reverse Distillation #343

Merged
merged 23 commits into development on Jun 9, 2022

Conversation

Collaborator

@ashwinvaidya17 ashwinvaidya17 commented Jun 2, 2022

Description

TODO

  • Metrics in README
  • Description of model in README
  • Update model API docs
  • Update model details in docs
  • Add model to all tests
  • Add sample images

Contributor

@djdameln djdameln left a comment

Thanks, good to see this model being added to the repo. The torch model seems to contain some classes that were directly copied from torchvision with some minor changes. Do you think it would be possible to import the models from torchvision instead, and replace the specific parts? It would keep our code base a bit cleaner.

Resolved review threads:
  • anomalib/models/__init__.py
  • anomalib/models/reversedistillation/README.md
  • anomalib/models/reversedistillation/lightning_model.py
  • anomalib/models/reversedistillation/torch_model.py
  • tests/pre_merge/deploy/test_inferencer.py
  • anomalib/models/reversedistillation/config.yaml
  • anomalib/models/reversedistillation/anomaly_map.py
Comment on lines 56 to 57:

```python
inplanes (int): Number of input channels.
planes (int): Number of output channels.
```
Contributor

Do you know why these are called in/out planes? Is that how they are named in the paper?

Collaborator Author

I don't know why. But the code in RD is taken from ResNet in torchvision, so based on the conv parameters I've added this docstring: https://github.com/pytorch/vision/blob/main/torchvision/models/resnet.py#L64
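For context, torchvision's ResNet conv helper uses the same naming, where "planes" means channels. A simplified version (this mirrors but does not copy the torchvision signature, which additionally takes `groups` and `dilation`):

```python
# Simplified version of torchvision's conv3x3 helper, showing where the
# in_planes / out_planes (= input / output channels) naming comes from.
from torch import nn


def conv3x3(in_planes: int, out_planes: int, stride: int = 1) -> nn.Conv2d:
    """3x3 convolution with padding."""
    return nn.Conv2d(
        in_planes, out_planes, kernel_size=3, stride=stride, padding=1, bias=False
    )


layer = conv3x3(64, 128)
print(layer.in_channels, layer.out_channels)  # -> 64 128
```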

Resolved review threads:
  • anomalib/models/reversedistillation/lightning_model.py
  • anomalib/models/reversedistillation/torch_model.py
@ashwinvaidya17 ashwinvaidya17 marked this pull request as ready for review June 8, 2022 11:56
Contributor

@samet-akcay samet-akcay left a comment

only minor stuff...

Resolved review threads:
  • anomalib/models/__init__.py
  • anomalib/models/reverse_distillation/anomaly_map.py (2 threads)
Contributor

@djdameln djdameln left a comment

Thanks. Some minor comments

Resolved review threads:
  • anomalib/models/__init__.py (3 threads)
  • anomalib/models/reverse_distillation/LICENSE
  • anomalib/models/reverse_distillation/__init__.py
  • anomalib/models/reverse_distillation/anomaly_map.py
  • anomalib/models/reverse_distillation/torch_model.py
```diff
@@ -68,6 +68,8 @@ def forward(self, images: Tensor) -> Union[Tensor, Tuple[List[Tensor], List[Tensor]]]:
             Union[Tensor, Tuple[List[Tensor],List[Tensor]]]: Encoder and decoder features in training mode,
             else anomaly maps.
         """
+        self.encoder.eval()
```
Contributor

Are the benchmarking results still the same after this change?

Collaborator Author

I am running it right now. That was a good catch, though: I noticed that the encoder was in training mode during the training step.
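A minimal sketch of why train mode on the frozen teacher encoder matters: even with gradients disabled, BatchNorm layers in train mode normalize with batch statistics and keep updating their running statistics, so the teacher's features drift between training and inference. (Toy encoder below, not the actual anomalib model.)

```python
# Demonstration: a "frozen" encoder still mutates BatchNorm running stats
# in train mode, which is why calling .eval() on the teacher matters.
import torch
from torch import nn

encoder = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.BatchNorm2d(8))
for param in encoder.parameters():
    param.requires_grad = False  # weights frozen, but BN stats still update

x = torch.randn(4, 3, 16, 16)

encoder.train()
before = encoder[1].running_mean.clone()
encoder(x)  # forward in train mode mutates running_mean / running_var
assert not torch.equal(before, encoder[1].running_mean)

encoder.eval()
before = encoder[1].running_mean.clone()
encoder(x)  # eval mode: running stats untouched, features deterministic
assert torch.equal(before, encoder[1].running_mean)
```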

Collaborator Author

Interestingly, apart from the pixel-level scores, the image-level scores are lower.
[two benchmark result screenshots omitted]

Contributor

The average results also seem to be a bit lower than those reported in the original paper (98.5% image AUROC, 97.8% pixel AUROC). For now I would suggest merging this PR, but it would be good to investigate whether this difference can be explained.

Contributor

Does the official implementation give the exact same results reported in the paper?

@samet-akcay samet-akcay merged commit e8af956 into development Jun 9, 2022
@samet-akcay samet-akcay deleted the feature/ashwin/reverse_distillation branch June 9, 2022 14:01
Successfully merging this pull request may close these issues.

Add Reverse Distillation Model
3 participants