TensorRT PyTorch Hub inference fix (ultralytics#7560)
Applies the solution proposed in ultralytics#7128 for CUDA illegal-memory errors when running TensorRT models through PyTorch Hub.
glenn-jocher authored and Clay Januhowski committed Sep 8, 2022
1 parent 42e3b6f commit 9c01c2d
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion models/common.py
@@ -531,7 +531,7 @@ def forward(self, imgs, size=640, augment=False, profile=False):
         #   multiple:       = [Image.open('image1.jpg'), Image.open('image2.jpg'), ...]  # list of images

         t = [time_sync()]
-        p = next(self.model.parameters()) if self.pt else torch.zeros(1)  # for device and type
+        p = next(self.model.parameters()) if self.pt else torch.zeros(1, device=self.model.device)  # for device, type
         autocast = self.amp and (p.device.type != 'cpu')  # Automatic Mixed Precision (AMP) inference
         if isinstance(imgs, torch.Tensor):  # torch
             with amp.autocast(autocast):
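Why the one-line change matters: `p` is used downstream as a reference for the device and dtype that input images are moved to. For non-PyTorch backends such as TensorRT there are no `model.parameters()` to draw from, so the old code fell back to `torch.zeros(1)`, a CPU tensor, and inputs ended up on the wrong device while the TensorRT engine expected CUDA memory. The sketch below illustrates the logic with lightweight stand-in classes (`FakeTensor`, `FakeModel` are hypothetical, not part of YOLOv5) so it runs without torch or TensorRT installed:

```python
# Hedged sketch of the device-placeholder fix. FakeTensor/FakeModel are
# stand-ins for torch.Tensor and the YOLOv5 backend wrapper; the real fix
# is the torch.zeros(1, device=self.model.device) line in the diff above.

class FakeTensor:
    """Minimal tensor stand-in that only tracks its device."""
    def __init__(self, device="cpu"):
        self.device = device

    def to(self, device):
        # Mimics torch.Tensor.to(): returns a copy on the target device.
        return FakeTensor(device)


class FakeModel:
    """Stand-in for a non-PyTorch backend (e.g. TensorRT) that knows
    which device its engine lives on but exposes no parameters."""
    def __init__(self, device):
        self.device = device


def reference_param(pt_params, model, fixed=True):
    """Pick the tensor used downstream for device/dtype decisions."""
    if pt_params:                       # PyTorch path: real parameters exist
        return pt_params[0]
    if fixed:                           # after the fix: inherit model.device
        return FakeTensor(device=model.device)
    return FakeTensor(device="cpu")     # before the fix: CPU placeholder


model = FakeModel(device="cuda:0")      # TensorRT engine on GPU 0

p_old = reference_param([], model, fixed=False)
p_new = reference_param([], model, fixed=True)

# Inputs are moved to p.device before inference:
img_old = FakeTensor("cpu").to(p_old.device)  # stays on CPU -> illegal access
img_new = FakeTensor("cpu").to(p_new.device)  # lands on cuda:0 as intended

assert img_old.device == "cpu"
assert img_new.device == "cuda:0"
```

With the placeholder created on `self.model.device`, the later `imgs.to(p.device)`-style moves send inputs to the same device the engine was built for, which is what resolves the illegal memory access reported in ultralytics#7128.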
