
Confused logit callback #118

Merged: 24 commits merged into master on Jul 31, 2020

Conversation

williamFalcon
Contributor

    Takes a model's logit predictions; when the probabilities of two classes are very close, the model
    is not confident about which of the two classes it should pick.

    This callback shows how the input would have to change to swing the model's prediction from one
    label to the other.
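
Below is a minimal usage sketch of the idea, assuming the callback is exposed as ConfusedLogitCallback from the module added in this PR (pl_bolts/callbacks/vision/confused_logit.py). The class name, the top_k argument, and the classifier used here are illustrative assumptions, not a confirmed API.

# Usage sketch (assumed API, see note above): attach the callback to a Trainer so that
# batches whose top-two logits are nearly equal get their input-gradient maps logged.
import pytorch_lightning as pl
from pl_bolts.callbacks.vision.confused_logit import ConfusedLogitCallback

model = MyImageClassifier()  # hypothetical LightningModule that returns class logits

# top_k=2 is illustrative: inspect the two most "confused" classes per sample.
confused_logit = ConfusedLogitCallback(top_k=2)

trainer = pl.Trainer(callbacks=[confused_logit], max_epochs=1)
trainer.fit(model)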

@pep8speaks

pep8speaks commented Jul 15, 2020

Hello @williamFalcon! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-07-31 12:42:27 UTC

@mergify mergify bot requested a review from Borda July 15, 2020 03:04
@codecov

codecov bot commented Jul 15, 2020

Codecov Report

Merging #118 into master will decrease coverage by 0.83%.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #118      +/-   ##
==========================================
- Coverage   92.25%   91.42%   -0.84%     
==========================================
  Files          79       82       +3     
  Lines        4031     4057      +26     
==========================================
- Hits         3719     3709      -10     
- Misses        312      348      +36     
Flag Coverage Δ
#unittests 91.42% <100.00%> (-0.84%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
pl_bolts/callbacks/vision/__init__.py 100.00% <100.00%> (ø)
pl_bolts/callbacks/vision/confused_logit.py 100.00% <100.00%> (ø)
..._bolts/models/self_supervised/moco/moco2_module.py 85.71% <0.00%> (-0.73%) ⬇️
pl_bolts/models/self_supervised/cpc/cpc_module.py 87.50% <0.00%> (-0.53%) ⬇️
.../models/autoencoders/basic_vae/basic_vae_module.py 93.06% <0.00%> (-0.39%) ⬇️
...lts/models/self_supervised/simclr/simclr_module.py 91.26% <0.00%> (-0.27%) ⬇️
pl_bolts/models/gans/basic/basic_gan_module.py 93.02% <0.00%> (-0.16%) ⬇️
pl_bolts/models/vision/image_gpt/igpt_module.py 98.66% <0.00%> (-0.12%) ⬇️
pl_bolts/models/mnist_module.py 100.00% <0.00%> (ø)
pl_bolts/datamodules/__init__.py 100.00% <0.00%> (ø)
... and 7 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 9ef90b4...e2dd574.

@Borda Borda added the enhancement New feature or request label Jul 15, 2020

mask_idx = mask_idxs[img_i]

fig = plt.figure(figsize=(15, 10))
Member

Rather use fig, axarr = plt.subplots(nrows=2, ncols=3)
and afterwards just call axarr[1, 2].imshow(...)
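
A small sketch of this suggestion (not code from the PR), with the keyword arguments written out as matplotlib's nrows/ncols and random arrays standing in for the debug images:

import matplotlib.pyplot as plt
import numpy as np

images = [np.random.rand(28, 28) for _ in range(6)]  # placeholder images

# One explicit figure with a 2x3 grid of axes: each draw call targets a specific
# axis, so nothing depends on matplotlib's implicit "current figure" state.
fig, axarr = plt.subplots(nrows=2, ncols=3, figsize=(15, 10))
for ax, img in zip(axarr.flat, images):
    ax.imshow(img, cmap="gray")
    ax.axis("off")
fig.tight_layout()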

Contributor Author

I don't have a lot of time to mess with this. I don't know if this will work, but I know the current version works.
If you feel strongly about it, would you mind making the changes and posting a Colab that shows this works?

Member

OK, I'll check it, because calls through plt. go to the most recently created figure, and if you plot two figures in parallel everything gets written into one.
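
A tiny illustration of that pitfall (hypothetical code, not from this PR): with the implicit pyplot interface, drawing calls land on whichever figure was created last.

import matplotlib.pyplot as plt
import numpy as np

fig1 = plt.figure()  # the figure we meant to draw into
fig2 = plt.figure()  # created later, so it becomes the "current" figure

plt.imshow(np.random.rand(8, 8))  # lands on fig2, not fig1
print(plt.gcf() is fig2)          # True: implicit plt.* calls follow the last-created figure

# Drawing through explicit axes, as in the subplots sketch above, avoids this.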

@williamFalcon williamFalcon merged commit 637a532 into master Jul 31, 2020
@Borda Borda deleted the confused branch September 9, 2020 15:26
Labels: enhancement (New feature or request)
Projects: None yet

3 participants