
fix normalize mode at confusion matrix (replace nans with zeros) #3465

Merged: 9 commits merged into Lightning-AI:master on Sep 14, 2020

Conversation

@c00k1ez (Contributor) commented Sep 11, 2020

What does this PR do?

Make confusion_matrix behave the same as sklearn.metrics.confusion_matrix under normalization when all prediction and target values come from the same class (NaN -> 0). Discussed with @Borda on Slack.
Also fixes #2724
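For context, here is a minimal sketch of the behaviour described above, assuming row-wise normalization; the helper name is hypothetical and this is not the actual PR diff:

```python
import torch

def normalized_confusion_matrix(pred: torch.Tensor, target: torch.Tensor, num_classes: int) -> torch.Tensor:
    # Hypothetical helper illustrating the fix: build the confusion matrix,
    # normalize each row, then replace the NaNs produced by empty rows
    # (classes that never occur in `target`) with zeros, matching
    # sklearn.metrics.confusion_matrix.
    cm = torch.zeros(num_classes, num_classes)
    for t, p in zip(target.view(-1), pred.view(-1)):
        cm[t.long(), p.long()] += 1.0

    cm = cm / cm.sum(dim=-1, keepdim=True)  # rows with no samples become NaN here
    cm[torch.isnan(cm)] = 0.0               # NaN -> 0, as in sklearn
    return cm

# All predictions and targets from the same class: without the fix the
# second row would be [nan, nan]; with it, the row is all zeros.
pred = torch.tensor([0, 0, 0])
target = torch.tensor([0, 0, 0])
print(normalized_confusion_matrix(pred, target, num_classes=2))
# tensor([[1., 0.],
#         [0., 0.]])
```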

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together? Otherwise, we ask you to create a separate PR for every change.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@mergify mergify bot requested a review from a team September 11, 2020 18:23
@c00k1ez c00k1ez changed the title fix normalize mode at confusion matrix (replace fans with zeros) fix normalize mode at confusion matrix (replace nans with zeros) Sep 11, 2020
@c00k1ez (Contributor, Author) commented Sep 11, 2020

10 failing tests :(
What am I doing wrong?

@rohitgr7 (Contributor)
Nothing wrong; it's just the PyTorch version, I think. Just replace cm.isnan() with torch.isnan(cm).
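For illustration, a small sketch of the suggested change; the assumption here is that the Tensor.isnan() method is only available in newer PyTorch releases, while the torch.isnan() function works across the versions the CI tests against:

```python
import torch

cm = torch.tensor([[0.5, 0.5],
                   [float("nan"), float("nan")]])

# torch.isnan(cm) works on older PyTorch versions as well;
# the method form cm.isnan() requires a more recent release.
cm[torch.isnan(cm)] = 0.0
print(cm)
# tensor([[0.5000, 0.5000],
#         [0.0000, 0.0000]])
```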

@codecov codecov bot commented Sep 12, 2020

Codecov Report

Merging #3465 into master will decrease coverage by 3%.
The diff coverage is 100%.

@@           Coverage Diff            @@
##           master   #3465     +/-   ##
========================================
- Coverage      85%     82%     -3%     
========================================
  Files         105     116     +11     
  Lines        8157    9440   +1283     
========================================
+ Hits         6943    7766    +823     
- Misses       1214    1674    +460     

@c00k1ez (Contributor, Author) commented Sep 12, 2020

Well, it's a bit weird. All tests in tests/test_classification.py and tests/functional/test_classification.py pass successfully on my local machine.

@SkafteNicki (Member)
The checks are being cancelled because they take too long (cancelled after 35 min).
From the checks it seems that the last passed test is:
tests/loggers/test_all.py::test_logger_created_on_rank_zero_only[TestTubeLogger]
which means that the next test is the one hanging:
tests/loggers/test_base.py::test_logger_collection

@c00k1ez (Contributor, Author) commented Sep 12, 2020

As I understand it, this is not connected to my changes, right? So, what should I do now?

@SkafteNicki (Member)
Yes, it does not seem to be related to your PR (sometimes tests just fail).

@awaelchli (Contributor)
Check master, it comes from there :) Not your fault, @c00k1ez.

@c00k1ez (Contributor, Author) commented Sep 12, 2020

Ok 👌

@c00k1ez (Contributor, Author) commented Sep 12, 2020

Added a fix for Issue 1 from #2724.
@SkafteNicki, can you please add reviewers?

Review threads (outdated, resolved): tests/metrics/functional/test_classification.py, tests/metrics/test_classification.py
@mergify mergify bot requested a review from a team September 12, 2020 13:51
@pep8speaks pep8speaks commented Sep 12, 2020

Hello @c00k1ez! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-09-13 14:00:31 UTC

@c00k1ez c00k1ez requested review from SkafteNicki and removed request for a team September 13, 2020 12:27
@mergify mergify bot requested a review from a team September 13, 2020 12:27
@SkafteNicki (Member) left a comment


Overall LGTM, just a few minor comments

Review threads (resolved): pytorch_lightning/metrics/functional/classification.py (outdated), tests/metrics/test_classification.py
@mergify mergify bot requested a review from a team September 13, 2020 13:24
c00k1ez and others added 3 commits September 13, 2020 16:56
add comment to test

Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
Co-authored-by: Nicki Skafte <skaftenicki@gmail.com>
@c00k1ez (Contributor, Author) commented Sep 13, 2020

Thanks 👍
Resolved all the small issues.

@mergify mergify bot requested a review from a team September 13, 2020 15:57
@SkafteNicki SkafteNicki merged commit a552d4a into Lightning-AI:master Sep 14, 2020
@Borda Borda added this to the 0.9.x milestone Sep 15, 2020
Development

Successfully merging this pull request may close these issues:

Issues with Confusion Matrix normalization and DDP computation
7 participants