
Yolo boxes format overflowing image and throwing an exception #848

Closed
Trud09 opened this issue Mar 3, 2021 · 6 comments
Labels
bug Something isn't working

Comments


Trud09 commented Mar 3, 2021

🐛 Bug

I have some bounding boxes in yolo format that are off the image.

To Reproduce

  1. Have boxes in yolo format that extend outside the image and try to augment

\lib\site-packages\albumentations\augmentations\bbox_utils.py", line 330, in check_bbox
"to be in the range [0.0, 1.0], got {value}.".format(bbox=bbox, name=name, value=value)
ValueError: Expected x_max for bbox (0.9967206790123457, 0.5379050925925926, 1.0001929012345678, 0.5529513888888888, 0) to be in the range [0.0, 1.0], got 1.0001929012345678.

Expected behavior

I would prefer it to clip the boxes, or at least give me the option to do so.

Environment

  • Albumentations version (e.g., 0.1.8): 0.5.2
  • Python version (e.g., 3.7): 3.6
  • OS (e.g., Linux): Windows
  • How you installed albumentations (conda, pip, source): pip
  • Any other relevant information:

ternaus commented Mar 3, 2021

It is not a bug. It is a feature.

It is much easier to pre-process labels once before model training than to catch challenging corner cases during augmentation.

Feel free to add one line that clips the bounding boxes to your dataloader :)
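Such a clipping step could look like this (a minimal sketch, not part of albumentations; `clip_yolo_bbox` is a hypothetical helper):

```python
import numpy as np

def clip_yolo_bbox(x, y, w, h):
    # Hypothetical helper: convert a normalized yolo box (center, size)
    # to corner form, clamp the corners to [0, 1], and convert back.
    x_min = np.clip(x - w / 2, 0.0, 1.0)
    y_min = np.clip(y - h / 2, 0.0, 1.0)
    x_max = np.clip(x + w / 2, 0.0, 1.0)
    y_max = np.clip(y + h / 2, 0.0, 1.0)
    return ((x_min + x_max) / 2, (y_min + y_max) / 2,
            x_max - x_min, y_max - y_min)
```

Boxes already inside the image pass through unchanged; boxes hanging over an edge are trimmed back to it.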

@ternaus ternaus closed this as completed Mar 3, 2021

Trud09 commented Mar 3, 2021

Actually, upon further investigation, there is a bug somewhere.
When I convert the boxes myself from 'yolo' to 'albumentations' format, x_max caps at 1.0 and no exception is thrown. So somewhere albumentations is adding a small amount, maybe through rounding, causing the value to exceed 1.
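The general phenomenon being described, a sum landing a hair above its mathematically exact bound, is easy to produce in floating point (a toy illustration only, not the actual albumentations code path):

```python
# 0.1 + 0.2 is not exactly 0.3 in binary floating point:
# the rounded sum ends up slightly above the exact value.
x = 0.1 + 0.2
print(x)        # 0.30000000000000004
print(x > 0.3)  # True
```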


ternaus commented Mar 3, 2021

Could you please provide an example that reproduces the behavior?

@ternaus ternaus reopened this Mar 3, 2021

Trud09 commented Mar 8, 2021

The image might be proprietary, but here is one box causing the issue:

Yolo Format:
[0.99662423, 0.7520255 , 0.00675154, 0.01446759]

My albumentations conversion:
[0.99324846, 0.7447917, 1.0, 0.7592593]

Exception for this box:
Exception has occurred: ValueError
Expected x_max for bbox (0.9934413580246914, 0.7450810185185185, 1.0001929012345678, 0.7592592592592593, 0) to be in the range [0.0, 1.0], got 1.0001929012345678.

My conversion code:

import numpy as np

def yolo_to_norm_voc(bboxes):
    # bboxes: (N, 4) array of normalized [x_center, y_center, w, h]
    x_mean, y_mean, w, h = bboxes[:, 0], bboxes[:, 1], bboxes[:, 2], bboxes[:, 3]

    x_min = x_mean - (w / 2)
    y_min = y_mean - (h / 2)
    x_max = x_min + w
    y_max = y_min + h
    return np.stack([x_min, y_min, x_max, y_max], 1)
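Notably, the overshoot in the error message is almost exactly 1/5184 above 1.0, which is far too large to be floating-point noise and looks more like an off-by-one in the pixel normalization. One way such a value could arise (purely a sketch with a hypothetical image width, not the actual albumentations internals):

```python
W = 5185  # hypothetical image width chosen to match the reported overshoot

x_max = 1.0                  # the correctly converted normalized coordinate
x_max_px = x_max * W         # denormalize to pixels with W ...
x_bad = x_max_px / (W - 1)   # ... but renormalize with W - 1 (off by one)
print(x_bad)  # slightly above 1, matching the reported 1.0001929012345678
```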

@Dipet Dipet added the bug Something isn't working label Mar 22, 2021
sang-yc commented Apr 1, 2021

Before data augmentation, shrink the width and height of each GT bbox by a fraction of a pixel. This has a negligible effect on the label bounding box, but it prevents the normalized values from overflowing and producing labels greater than 1.

labels format: yolo
image.shape[0]: height of image (rows)
image.shape[1]: width of image (cols)
bboxes: [[x, y, w, h, class], [x, y, w, h, class], ...]

for i in range(len(bboxes)):
    bboxes[i][2] = np.abs(bboxes[i][2] - 0.5 / image.shape[1])  # shrink w by half a pixel
    bboxes[i][3] = np.abs(bboxes[i][3] - 0.5 / image.shape[0])  # shrink h by half a pixel

To keep the impact of this preprocessing small, only 0.5 pixel is subtracted rather than a full pixel.
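Applied to the reported box, the half-pixel shrink does pull the right edge back inside the image (a sketch assuming a hypothetical image width of 5185):

```python
import numpy as np

W = 5185  # hypothetical image width
x, y, w, h = 0.99662423, 0.7520255, 0.00675154, 0.01446759  # reported yolo box

w_shrunk = np.abs(w - 0.5 / W)  # shrink width by half a pixel
x_max = x + w_shrunk / 2
print(x_max < 1.0)  # True: the right edge now stays strictly inside [0, 1]
```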


Dipet commented Jul 7, 2021

Should be fixed by #924

@Dipet Dipet closed this as completed Jul 7, 2021