Fix a bug when batch_size = -1 is set to use AutoBatch.
Reproduce: run train.py with --batch-size set to -1.
youyuxiansen authored Dec 10, 2021
1 parent 4fb6dd4 commit 476c824
Showing 1 changed file with 1 addition and 1 deletion.
utils/torch_utils.py: 1 addition & 1 deletion
@@ -68,7 +68,7 @@ def select_device(device='', batch_size=None, newline=True):
     if cuda:
         devices = device.split(',') if device else '0'  # range(torch.cuda.device_count())  # i.e. 0,1,6,7
         n = len(devices)  # device count
-        if n > 1 and batch_size:  # check batch_size is divisible by device_count
+        if n > 1 and batch_size > 0:  # check batch_size is divisible by device_count
             assert batch_size % n == 0, f'batch-size {batch_size} not multiple of GPU count {n}'
         space = ' ' * (len(s) + 1)
         for i, d in enumerate(devices):
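For context: in Python any non-zero integer is truthy, so the old guard n > 1 and batch_size still ran the divisibility check when batch_size was the AutoBatch sentinel -1, and since -1 % n is non-zero for any n > 1, the assert failed on every multi-GPU run. A minimal sketch of the old and new guards (check_batch and check_batch_fixed are hypothetical names for illustration, not YOLOv5 API):

def check_batch(batch_size, n):
    # Old guard: -1 is truthy, so AutoBatch runs still enter the check.
    if n > 1 and batch_size:
        assert batch_size % n == 0, f'batch-size {batch_size} not multiple of GPU count {n}'

def check_batch_fixed(batch_size, n):
    # New guard: batch_size > 0 excludes the -1 AutoBatch sentinel.
    if n > 1 and batch_size > 0:
        assert batch_size % n == 0, f'batch-size {batch_size} not multiple of GPU count {n}'

check_batch(64, 2)        # passes: 64 % 2 == 0
check_batch_fixed(-1, 2)  # passes: check skipped, AutoBatch picks the size later
try:
    check_batch(-1, 2)    # -1 % 2 == 1 in Python, so the assert fires
except AssertionError as e:
    print(e)              # batch-size -1 not multiple of GPU count 2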

1 comment on commit 476c824

@youyuxiansen
Contributor, Author

Reproduce: run train.py with --batch-size set to -1.
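Since the assert only fires when more than one GPU is selected (n > 1), reproducing it presumably also needs a multi-GPU device flag, e.g. an invocation along these lines (dataset/weights arguments omitted):

python train.py --device 0,1 --batch-size -1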
