
PatchCore: Unable to reproduce the pixel-wise reported performance #214

Closed
mvidela31 opened this issue Apr 8, 2022 · 5 comments · Fixed by #222
Labels
Bug Something isn't working Model

Comments

@mvidela31

mvidela31 commented Apr 8, 2022

Hi everybody,

I tried to use the PatchCore model on the MVTec dataset (bottle class), but the pixel-wise performance was well below the reported values. I used the default config file (setting the seed to 42):

DATALOADER:0 TEST RESULTS
{'image_AUROC': 1.0,
 'image_F1': 1.0,
 'pixel_AUROC': 0.8852562308311462,
 'pixel_F1': 0.49169835448265076}

However, using the PaDiM model I was able to reproduce the reported benchmark performance:

DATALOADER:0 TEST RESULTS
{'image_AUROC': 0.9936507940292358,
 'image_F1': 0.9763779044151306,
 'pixel_AUROC': 0.9830061197280884,
 'pixel_F1': 0.7220250964164734}

Is this unexpected result due to a bug or am I missing something?

@mvidela31 mvidela31 changed the title PathCore: Unable to reproduce the pixel-wise reported performance PatchCore: Unable to reproduce the pixel-wise reported performance Apr 8, 2022
@samet-akcay
Contributor

Hi @mvidela31, this must be related to one of our recent PRs. We'll investigate this.

@samet-akcay samet-akcay added Model Bug Something isn't working labels Apr 8, 2022
@alexriedel1
Contributor

> Hi @mvidela31, this must be related to one of our recent PRs. We'll investigate this.

What I can say is that, for some reason, the feature map shape of wide_resnet50_2 is now 32x32 (it was 28x28 some time ago) on 224x224 images.

@alexriedel1
Contributor

2dfa0a7
This was when apply_tiling was set to True by default for PatchCore. It's the reason why the pixel-wise F1 drops significantly.

@samet-akcay
Contributor

Thanks @alexriedel1, nice find! It was probably set to True by mistake. Our nightly tests would have caught this, but unfortunately they're broken now.

@mvidela31, as suggested by @alexriedel1, you could set apply_tiling: false in the dataset section of the anomalib/models/patchcore/config.yaml file.
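Concretely, the workaround is a one-line edit in the config; the surrounding keys below are only illustrative of the default config layout:

```yaml
dataset:
  # ... other dataset fields unchanged ...
  apply_tiling: false  # revert the accidental default of true
```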

@mvidela31
Author

It worked, thank you all!
