train_batch.jpg labels missing #1623
Hello @mfruhner, thank you for your interest in 🚀 YOLOv5! Please visit our ⭐️ Tutorials to get started, where you can find quickstart guides for simple tasks like Custom Data Training all the way to advanced concepts like Hyperparameter Evolution.

If this is a 🐛 Bug Report, please provide screenshots and minimum viable code to reproduce your issue, otherwise we cannot help you. If this is a custom training ❓ Question, please provide as much information as possible, including dataset images, training logs, screenshots, and a public link to online W&B logging if available.

For business inquiries or professional support requests please visit https://www.ultralytics.com or email Glenn Jocher at glenn.jocher@ultralytics.com.

Requirements

Python 3.8 or later with all requirements.txt dependencies installed, including:

```
$ pip install -r requirements.txt
```

Environments

YOLOv5 may be run in any of the following up-to-date verified environments (with all dependencies including CUDA/CUDNN, Python and PyTorch preinstalled):
Status

If this badge is green, all YOLOv5 GitHub Actions Continuous Integration (CI) tests are currently passing. CI tests verify correct operation of YOLOv5 training (train.py), testing (test.py), inference (detect.py) and export (export.py) on MacOS, Windows, and Ubuntu every 24 hours and on every commit.
@mfruhner thanks for the bug report! I'd noticed the same behavior recently. The mosaic plotting function has undergone several recent updates, so it may be a bug there, perhaps related to de-normalization of the labels. I will add a TODO here for us to try to reproduce and find a fix. In any case, this is almost certainly a plotting-related issue, as it only appeared within the last several weeks, and the COCO128 labels have never changed.

TODO: reproduce and fix mosaic plotting bug.

@mfruhner I can reproduce this issue in a Colab notebook by running your command after running the setup cell. I can see in train_batch2.jpg specifically an image in which many objects with labels, like train and refrigerator, are not being plotted properly. I will try to figure out why this is happening.
Ok I think I see the problem. A check is made to determine if the labels are in normalized coordinates before scaling them up to pixels. The check sometimes fails, it seems due to numerical precision issues:

```
True 1.0
True 1.0
True 1.0
True 1.0
True 1.0
True 1.0
True 1.0
True 1.0
True 1.0
True 1.0
True 1.0
True 1.0
False 1.0000001
True 1.0
True 1.0
```

I think the solution is to introduce an …
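One way to make such a check robust is to allow a small tolerance when deciding whether labels are normalized. A minimal sketch of that idea (the function and variable names here are illustrative, not YOLOv5's actual code):

```python
def labels_normalized(xywh, eps=1e-3):
    # Treat labels as normalized (0-1) if no coordinate exceeds 1 by more
    # than eps. A strict `<= 1.0` check can fail on values like 1.0000001
    # that arise from floating-point round-off during augmentation.
    return max(max(row) for row in xywh) <= 1.0 + eps

# One coordinate overshoots 1.0 by floating-point error:
xywh = [[0.5, 0.5, 0.2, 0.3],
        [0.9, 0.9, 1.0000001, 0.1]]
print(max(max(row) for row in xywh) <= 1.0)  # strict check: False
print(labels_normalized(xywh))               # tolerant check: True
```

With the strict check, a single round-off overshoot flips the whole batch into the "already in pixels" branch; the tolerant check does not.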
Also, I should add that, comically, all the boxes are in fact plotted all of the time; the 'missing' boxes are simply drawn entirely inside the (0, 0) pixel.
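Concretely, when the check misfires, coordinates that are still normalized (all ≤ 1) get handed to the drawing code as if they were pixel values, so every box truncates into the top-left pixel. A tiny illustration with hypothetical values:

```python
# A normalized xywh box mistakenly treated as pixel coordinates:
box = [0.48, 0.63, 0.12, 0.20]      # center-x, center-y, width, height in 0-1
pixel_box = [int(v) for v in box]   # drawing code truncates to integer pixels
print(pixel_box)                    # [0, 0, 0, 0] -> collapses into pixel (0, 0)
```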
I have a dataset.yaml and observe the same effect: no labels are plotted on the mosaic while training. From this issue I assume that one of my 800 annotation .txt files has an error... but how do I check this? I thought a fix for checking wrong label coordinates was already merged into YOLOv5. My labels: labels.zip ...
I wanted to cross-check my yaml and dataset by training on the coco128.yaml defaults. No labels are plotted on coco128 either...

Edit: I did a fresh checkout of YOLOv5 from git; it did not help. Or is this simply my misunderstanding, and no bug at all, i.e. normal behavior? Is it expected that label names appear only later (after training) in the validation mosaic?
@ozett validation is plotting names and training is plotting class indices. |
Yes, thanks. YOLOv5 works perfectly and fast. As an ML baby I still have to take my first steps and gain more experience. Neil Armstrong in reverse, so to say: a giant leap for mankind, but a small step for an ML baby.
🐛 Bug
Hello, on my custom dataset I noticed some missing labels in random images in the train_batch*.jpg images. I then tried it with coco128 and noticed the same behaviour. Most labels are correct, but some are simply missing. It always appears to be a complete mosaic (or rect image when using --rect) within a batch that has no labels. See the following images with the cake and the dog/elephant:
To Reproduce (REQUIRED)
Download latest docker image.
Run:

```
python train.py --img 640 --batch 16 --epochs 1 --data coco128.yaml --weights yolov5s.pt --nosave --cache
```
Output:
Some images are missing their labels.
The question is also whether this is just a rendering bug, or whether the missing labels also affect the training process, resulting in inferior results.
Expected behavior
All images have their correct bounding boxes
Environment