fix set_epoch on TPUs (Lightning-AI#2740)
* fix Lightning-AI#2622

* Update training_loop.py
ibeltagy committed Aug 7, 2020
1 parent f82d7fe commit 2cc60c6
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion pytorch_lightning/trainer/training_loop.py
@@ -369,7 +369,7 @@ def train(self):
             if self.reload_dataloaders_every_epoch:
                 self.reset_train_dataloader(model)
             # set seed for distributed sampler (enables shuffling for each epoch)
-            if (self.use_ddp or self.use_horovod) \
+            if (self.use_ddp or self.use_horovod or self.on_tpu) \
                     and hasattr(self.train_dataloader, 'sampler') \
                     and hasattr(self.train_dataloader.sampler, 'set_epoch'):
                 self.train_dataloader.sampler.set_epoch(epoch)
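For context on why the training loop makes this call: PyTorch's `DistributedSampler` derives its shuffle order from the epoch number, so unless `set_epoch(epoch)` is called at the start of each epoch, every epoch replays the same permutation. The fix above extends that call to TPU runs, which also use a distributed sampler. Below is a minimal pure-Python sketch of the behavior; `EpochSeededSampler` is a toy stand-in written for illustration, not PyTorch's actual class.

```python
import random

class EpochSeededSampler:
    """Toy stand-in for torch.utils.data.DistributedSampler: the shuffle
    order is a deterministic function of the epoch number, so all replicas
    agree on the permutation, and a new epoch yields a new shuffle."""

    def __init__(self, dataset_size, num_replicas=1, rank=0):
        self.dataset_size = dataset_size
        self.num_replicas = num_replicas
        self.rank = rank
        self.epoch = 0

    def set_epoch(self, epoch):
        # The training loop calls this once per epoch; without it,
        # every epoch would reuse the epoch-0 permutation.
        self.epoch = epoch

    def __iter__(self):
        indices = list(range(self.dataset_size))
        # Seed the shuffle with the epoch so it is reproducible
        # across replicas but changes between epochs.
        random.Random(self.epoch).shuffle(indices)
        # Each replica takes every num_replicas-th index from its rank.
        return iter(indices[self.rank::self.num_replicas])

sampler = EpochSeededSampler(8)
epoch0_order = list(sampler)
sampler.set_epoch(1)
epoch1_order = list(sampler)
```

This mirrors the guarded call in the diff: the trainer only invokes `set_epoch` when the dataloader's sampler actually exposes it, which is why the fix checks `hasattr` before calling.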
