Precision, Recall and mAP seem incoherent #80

Closed
yusiyoh opened this issue Aug 23, 2021 · 4 comments

Comments

@yusiyoh commented Aug 23, 2021

Hello,

I am training YOLOR-D6, and I obtain precision, recall and mAP results from test.py (paper branch) with the following command:
python3 test.py --weights ./runs/train/yolor-d6-1280size-multiGPU/weights/best.pt --img 1280 --verbose --data data/dtld_test.yaml --batch 32 --task test --conf 0.4 --iou 0.5

Here is the result:
[screenshot: test.py precision/recall/mAP output]
I think mAP@0.5 is very high given those recall values. Is there a mistake in metrics.py, or do I have to run test.py with different options?
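For what it's worth, my understanding is that the reported mAP comes from integrating the full per-class precision-recall curve (built from all detections ranked by confidence), not from the single P/R values printed in the table, so the two can look inconsistent at first glance. Below is a minimal, self-contained sketch of that style of AP computation with made-up data; it is only an illustration, not the repo's exact ap_per_class/compute_ap code.

import numpy as np

# Made-up, confidence-sorted detections for one class: 1 = matched a ground truth (TP), 0 = FP.
tp = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 0])
n_gt = 12  # assumed number of ground-truth boxes for this class

tp_cum = np.cumsum(tp)
fp_cum = np.cumsum(1 - tp)
recall = tp_cum / n_gt                  # recall after each detection
precision = tp_cum / (tp_cum + fp_cum)  # precision after each detection

# Append sentinel points, take the monotone precision envelope, then integrate the area under the curve.
mrec = np.concatenate(([0.0], recall, [1.0]))
mpre = np.concatenate(([1.0], precision, [0.0]))
mpre = np.flip(np.maximum.accumulate(np.flip(mpre)))
x = np.linspace(0, 1, 101)              # 101-point interpolation, as commonly used in these repos
ap = np.trapz(np.interp(x, mrec, mpre), x)
print(f"final recall: {recall[-1]:.2f}, AP: {ap:.3f}")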
Thank you.

@yusiyoh (Author) commented Aug 24, 2021

Yet another strange thing is that when I reduce conf-thres, the mAP decreases significantly instead of increasing (it is expected to increase as conf-thres is reduced).
The first run is with the default conf-thres (0.001):
[screenshot: test.py results at conf-thres 0.001]
The second run is with conf-thres 0.4:
[screenshot: test.py results at conf-thres 0.4]
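If I understand the evaluation correctly (this is an assumption on my part, not something I verified in the code), detections below --conf never reach the AP computation, so a high threshold such as 0.4 cuts off the high-recall tail of the precision-recall curve, while the 0.001 default keeps it; that is why the low threshold is normally expected to give the higher mAP. A toy illustration with made-up numbers:

import numpy as np

# Made-up, confidence-sorted detections for one class (1 = TP, 0 = FP) and an assumed ground-truth count.
conf = np.array([0.95, 0.80, 0.55, 0.35, 0.20, 0.05])
tp   = np.array([1,    1,    0,    1,    1,    1])
n_gt = 6

for thr in (0.001, 0.4):
    kept = tp[conf >= thr]
    max_recall = np.cumsum(kept)[-1] / n_gt
    print(f"conf-thres {thr}: {kept.size} detections kept, max recall {max_recall:.2f}")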

@yusiyoh (Author) commented Aug 24, 2021

I have changed metrics.py according to https://github.com/ultralytics/yolov5/blob/master/utils/metrics.py and the mAP values are very different.
Default YOLOR:
[screenshot: results with the default YOLOR metrics.py]
YOLOv5 metrics:
[screenshot: results with the YOLOv5 metrics.py]

Which one is wrong and why?

@yusiyoh (Author) commented Aug 27, 2021

I also opened an issue in the YOLOv5 repo: ultralytics/yolov5#4546
@glenn-jocher then changed metrics.py according to my suggestion: ultralytics/yolov5#4563
Could you please check them and consider changing the following lines:

yolor/utils/metrics.py

Lines 125 to 126 in 2fa3a31

mrec = recall # np.concatenate(([0.], recall, [recall[-1] + 1E-3]))
mpre = precision # np.concatenate(([0.], precision, [0.]))

to match the new metrics.py from YOLOv5: https://github.com/ultralytics/yolov5/blob/8b18b66304317276f4bfc7cc7741bd535dc5fa7a/utils/metrics.py#L94-L95
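For anyone reading this later, the replacement I am proposing is to restore the sentinel concatenation. At the linked commit the two YOLOv5 lines look roughly like the following (please check the linked file itself rather than trusting my copy):

# Append sentinel values to the beginning and end of the curve (as in the linked YOLOv5 metrics.py)
mrec = np.concatenate(([0.0], recall, [1.0]))
mpre = np.concatenate(([1.0], precision, [0.0]))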

Thank you

@WongKinYiu (Owner) commented

Thanks, I changed these two lines following your suggestion, but I think the two metrics.py files will still produce different results.

@yusiyoh closed this as completed Aug 28, 2021