
[Feature] Support calculating the confusion matrix and plotting it. #1287

Merged (7 commits) Feb 14, 2023

Conversation

@mzr1996 (Member) commented Jan 3, 2023

Modification

Add the ConfusionMatrix metric and a command-line tool to calculate/plot the confusion matrix.

Use cases (Optional)

1. Basic Usage

>>> import torch
>>> from mmcls.evaluation import ConfusionMatrix
>>> y_pred = [0, 1, 1, 3]
>>> y_true = [0, 2, 1, 3]
>>> ConfusionMatrix.calculate(y_pred, y_true, num_classes=4)
tensor([[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1]])
>>> # plot the confusion matrix
>>> import matplotlib.pyplot as plt
>>> y_score = torch.rand((1000, 10))
>>> y_true = torch.randint(10, (1000, ))
>>> matrix = ConfusionMatrix.calculate(y_score, y_true)
>>> ConfusionMatrix().plot(matrix)
>>> plt.show()
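For reference, the same matrix can be reproduced with plain PyTorch. This is a minimal sketch of the usual convention (rows index ground-truth labels, columns index predictions), which matches the doctest output above; it is not the mmcls implementation itself, just an illustration of the counting rule.

```python
import torch

def confusion_matrix(y_pred, y_true, num_classes):
    """Count (true, pred) pairs; rows index ground truth, columns predictions."""
    y_pred = torch.as_tensor(y_pred)
    y_true = torch.as_tensor(y_true)
    # Encode each (true, pred) pair as a single index, then histogram.
    flat = y_true * num_classes + y_pred
    counts = torch.bincount(flat, minlength=num_classes ** 2)
    return counts.reshape(num_classes, num_classes)

matrix = confusion_matrix([0, 1, 1, 3], [0, 2, 1, 3], num_classes=4)
print(matrix)
```

The sample with prediction 1 and ground truth 2 lands in row 2, column 1, as in the example above.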

2. Use with Evaluator

>>> import torch
>>> from mmcls.evaluation import ConfusionMatrix
>>> from mmcls.structures import ClsDataSample
>>> from mmengine.evaluator import Evaluator
>>> data_samples = [
...     ClsDataSample().set_gt_label(i%5).set_pred_score(torch.rand(5))
...     for i in range(1000)
... ]
>>> evaluator = Evaluator(metrics=ConfusionMatrix())
>>> evaluator.process(data_samples)
>>> evaluator.evaluate(1000)
{'confusion_matrix/result': tensor([[37, 37, 48, 43, 35],
         [35, 51, 32, 46, 36],
         [45, 28, 39, 42, 46],
         [42, 40, 40, 35, 43],
         [40, 39, 41, 37, 43]])}
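Once the matrix is available, common per-class metrics fall out of it directly. A sketch, assuming as above that rows index ground-truth labels and columns index predictions (the numbers are the random-score matrix from the Evaluator example, so the scores hover around chance level):

```python
import torch

# Matrix from the Evaluator example above (rows: ground truth, columns: prediction).
matrix = torch.tensor([[37, 37, 48, 43, 35],
                       [35, 51, 32, 46, 36],
                       [45, 28, 39, 42, 46],
                       [42, 40, 40, 35, 43],
                       [40, 39, 41, 37, 43]]).float()

recall = matrix.diag() / matrix.sum(dim=1)     # per-class recall
precision = matrix.diag() / matrix.sum(dim=0)  # per-class precision
accuracy = matrix.diag().sum() / matrix.sum()  # overall accuracy
print(accuracy)  # 205 correct out of 1000 samples
```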

3. Use command-line tool

python tools/analysis_tools/confusion_matrix.py \
    configs/resnet/resnet50_8xb16_cifar10.py \
    https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_b16x8_cifar10_20210528-f54bfad9.pth \
    --show

[Screenshot: confusion matrix plotted by the command-line tool]

Checklist

Before PR:

  • Pre-commit or other linting tools are used to fix the potential lint issues.
  • Bug fixes are fully covered by unit tests; the case that causes the bug should be added to the unit tests.
  • The modification is covered by complete unit tests. If not, please add more unit tests to ensure correctness.
  • The documentation has been modified accordingly, like docstring or example tutorials.

After PR:

  • If the modification has potential influence on downstream or other related projects, this PR should be tested with those projects, like MMDet or MMSeg.
  • CLA has been signed and all committers have signed the CLA in this PR.

@codecov codecov bot commented Jan 3, 2023

Codecov Report

Base: 0.02% // Head: 86.88% // Increases project coverage by +86.86% 🎉

Coverage data is based on head (7e32977) compared to base (b8b31e9).
Patch has no changes to coverable lines.

❗ Current head 7e32977 differs from pull request most recent head 509250a. Consider uploading reports for the commit 509250a to get more accurate results

Additional details and impacted files
@@             Coverage Diff              @@
##           dev-1.x    #1287       +/-   ##
============================================
+ Coverage     0.02%   86.88%   +86.86%     
============================================
  Files          121      166       +45     
  Lines         8217    13580     +5363     
  Branches      1368     2158      +790     
============================================
+ Hits             2    11799    +11797     
+ Misses        8215     1422     -6793     
- Partials         0      359      +359     
Flag: unittests | Coverage: 86.88% <ø> (+86.86%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
mmcls/datasets/transforms/compose.py
mmcls/models/utils/norm.py 80.00% <0.00%> (ø)
mmcls/models/heads/efficientformer_head.py 93.10% <0.00%> (ø)
mmcls/models/backbones/beit.py 57.06% <0.00%> (ø)
mmcls/models/classifiers/timm.py 25.97% <0.00%> (ø)
mmcls/models/backbones/edgenext.py 95.20% <0.00%> (ø)
mmcls/models/retrievers/__init__.py 100.00% <0.00%> (ø)
mmcls/apis/model.py 87.09% <0.00%> (ø)
mmcls/datasets/inshop.py 100.00% <0.00%> (ø)
mmcls/models/backbones/vig.py 18.85% <0.00%> (ø)
... and 157 more

☔ View full report at Codecov.

Review comment on mmcls/evaluation/metrics/single_label.py (outdated, resolved):

    cmap: str = 'viridis',
    classes: Optional[List[str]] = None,
    colorbar: bool = True,
    show: bool = True):
Collaborator:

I would prefer to add a save-path parameter here, so that the result image can be saved when running from the command line on Linux.

@mzr1996 (Member, Author):

The method returns the figure, so users can save it manually if they want.
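For the headless-server case raised above, saving manually can look like the following generic matplotlib sketch (this is not the mmcls API; `ConfusionMatrix().plot` is replaced here by a plain `imshow` over a placeholder matrix): switch to the non-interactive Agg backend and call `savefig` instead of `plt.show()`.

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend: works without a display
import matplotlib.pyplot as plt
import torch

matrix = torch.randint(0, 50, (10, 10))  # placeholder confusion matrix

fig, ax = plt.subplots()
im = ax.imshow(matrix, cmap='viridis')
fig.colorbar(im)
ax.set_xlabel('Predicted label')
ax.set_ylabel('Ground-truth label')
fig.savefig('confusion_matrix.png')  # save to disk instead of plt.show()
```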

@lb-hit commented Feb 9, 2023

Hi, about the third method (the command-line tool): I can't find confusion_matrix.py in analysis_tools. @mzr1996

@mzr1996 (Member, Author) commented Feb 9, 2023

> Hi, about the third method (the command-line tool): I can't find confusion_matrix.py in analysis_tools. @mzr1996

I have moved it to the analysis_tools folder, you can find it now.

@Ezra-Yu (Collaborator) left a review:

LGTM.

@lb-hit commented Feb 10, 2023

Hi, the GitHub codebase is not updated. Another problem: I used the latest version of mmcls, and when I ran confusion_matrix.py, an error occurred: cannot import name 'ConfusionMatrix' from 'mmcls.evaluation'. I found that 'ConfusionMatrix' is not defined. Is there a script for 'ConfusionMatrix'? @mzr1996

@mzr1996 mzr1996 merged commit b4ee9d2 into open-mmlab:dev-1.x Feb 14, 2023
@mzr1996 (Member, Author) commented Feb 14, 2023

> Hi, the GitHub codebase is not updated. Another problem: I used the latest version of mmcls, and when I ran confusion_matrix.py, an error occurred: cannot import name 'ConfusionMatrix' from 'mmcls.evaluation'. I found that 'ConfusionMatrix' is not defined. Is there a script for 'ConfusionMatrix'? @mzr1996

I think you can pull the latest dev-1.x branch and use it now.

@lb-hit commented Feb 27, 2023

@mzr1996 Hi, when I used the third way to get the confusion matrix, the following error occurred:
[Screenshot: error traceback]
