Fix AMP check tolerance (ultralytics#7937)
Adjust the absolute tolerance to 5%. This fixes the failing Colab AMP check on a V100, where FP32 and AMP outputs differed by 1.5%, and leaves a 200% safety margin.
glenn-jocher authored and Clay Januhowski committed Sep 8, 2022
1 parent 97d4484 commit 003fc3c
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions utils/general.py
@@ -520,10 +520,10 @@ def check_amp(model):
         LOGGER.warning(emojis(f'{prefix}checks skipped ⚠️, not online.'))
         return True
     m = AutoShape(model, verbose=False)  # model
-    a = m(im).xyxy[0]  # FP32 inference
+    a = m(im).xywhn[0]  # FP32 inference
     m.amp = True
-    b = m(im).xyxy[0]  # AMP inference
-    if (a.shape == b.shape) and torch.allclose(a, b, atol=1.0):  # close to 1.0 pixel bounding box
+    b = m(im).xywhn[0]  # AMP inference
+    if (a.shape == b.shape) and torch.allclose(a, b, atol=0.05):  # close to 5% absolute tolerance
         LOGGER.info(emojis(f'{prefix}checks passed ✅'))
         return True
     else:
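The diff switches the comparison from `xyxy` (pixel coordinates) to `xywhn` (coordinates normalized to [0, 1]), so a fixed `atol=0.05` means "within 5% of the image size" regardless of resolution. A minimal sketch of the tolerance logic, using a standalone helper (`close_within_tol` is a hypothetical name, not part of YOLOv5) and simulated detections in place of real model output:

```python
import torch

def close_within_tol(a: torch.Tensor, b: torch.Tensor, atol: float = 0.05) -> bool:
    # Mirrors the patched check: shapes must match and every element of the
    # normalized boxes must agree within an absolute tolerance of 0.05 (5%).
    return (a.shape == b.shape) and torch.allclose(a, b, atol=atol)

# Simulated normalized xywhn detections; the "AMP" copy is perturbed by
# 1.5%, matching the V100 discrepancy cited in the commit message.
fp32 = torch.tensor([[0.50, 0.40, 0.20, 0.10]])
amp = fp32 + 0.015

print(close_within_tol(fp32, amp))         # 1.5% drift passes the 5% tolerance
print(close_within_tol(fp32, fp32 + 0.1))  # 10% drift fails it
```

A 1.5% observed difference against a 5% threshold gives the roughly 200% safety margin the commit message refers to.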
