zero-mAP fix return .detach() to EMA
Resolves ultralytics/hub#82

Signed-off-by: Glenn Jocher <glenn.jocher@ultralytics.com>
glenn-jocher committed Aug 21, 2022
1 parent 93f63ee commit af17e42
Showing 1 changed file with 1 addition and 1 deletion.
utils/torch_utils.py: 2 changes (1 addition & 1 deletion)

@@ -422,7 +422,7 @@ def update(self, model):
         for k, v in self.ema.state_dict().items():
             if v.dtype.is_floating_point:  # true for FP16 and FP32
                 v *= d
-                v += (1 - d) * msd[k]
+                v += (1 - d) * msd[k].detach()
                 assert v.dtype == msd[k].dtype == torch.float32, f'EMA {v.dtype} and model {msd[k].dtype} must be updated in FP32'
 
     def update_attr(self, model, include=(), exclude=('process_group', 'reducer')):
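For context, the changed line sits inside YOLOv5's ModelEMA.update(), which blends the current model weights into an exponential moving average; calling .detach() on the source tensors keeps the EMA buffers clear of any autograd graph reference. The sketch below is a minimal, self-contained illustration of the same pattern under assumptions made for brevity, not the upstream implementation: the SimpleEMA class name, the decay/tau defaults, and the omission of YOLOv5 details such as de_parallel() are all simplifications.

import math
from copy import deepcopy

import torch
import torch.nn as nn


class SimpleEMA:
    # Minimal exponential moving average of model weights (illustrative sketch).
    def __init__(self, model: nn.Module, decay: float = 0.9999, tau: float = 2000):
        self.ema = deepcopy(model).eval()  # frozen copy of the model, never trained directly
        self.updates = 0  # number of EMA updates performed so far
        self.decay = lambda x: decay * (1 - math.exp(-x / tau))  # ramp decay up from 0 early in training
        for p in self.ema.parameters():
            p.requires_grad_(False)

    @torch.no_grad()
    def update(self, model: nn.Module):
        self.updates += 1
        d = self.decay(self.updates)
        msd = model.state_dict()  # current model weights
        for k, v in self.ema.state_dict().items():
            if v.dtype.is_floating_point:  # true for FP16 and FP32
                v *= d
                v += (1 - d) * msd[k].detach()  # .detach() severs any graph link on the source tensor


# Usage: call update() once per optimizer step, then evaluate with ema.ema
model = nn.Linear(10, 2)
ema = SimpleEMA(model)
ema.update(model)

Whether the no_grad guard alone is sufficient depends on how the caller wraps the training step; keeping .detach() on the source tensors is the defensive choice, since it makes the in-place EMA update independent of the caller's autograd context.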

2 comments on commit af17e42

@rexayyy

[Screenshot attached: Screen Shot 2022-08-21 at 6 26 48 AM]
I am still getting the issue.

@glenn-jocher
Member Author

@rexayyy this is being worked on in #9059.
