
training batch clean up
williamFalcon committed Jun 8, 2020
1 parent 8368b03 commit 6fde56a
Showing 1 changed file with 6 additions and 0 deletions.
pytorch_lightning/trainer/distrib_data_parallel.py: 6 additions & 0 deletions
@@ -427,7 +427,13 @@ def ddp_train(self, process_idx, model, is_master=False, proc_offset=0):
         # try to init for 20 times at max in case ports are taken
         # where to store ip_table
         model.trainer = self
+        print('-'*100)
+        print('starting ddp')
+        print('-'*100)
         model.init_ddp_connection(self.proc_rank, self.world_size, self.is_slurm_managing_tasks)
+        print('-'*100)
+        print('ddp started')
+        print('-'*100)
 
         # CHOOSE OPTIMIZER
         # allow for lr schedulers as well
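For context, `init_ddp_connection` is the step most likely to stall when ports are taken or a rank never launches, which is presumably why this commit brackets it with banner prints. Below is a minimal sketch of what such a connection setup typically looks like, assuming the standard `torch.distributed` environment-variable rendezvous; the actual body of `init_ddp_connection` is not part of this diff, and the function signature and port default shown here are illustrative.

```python
import os

import torch.distributed as dist


def init_ddp_connection(proc_rank: int, world_size: int) -> None:
    """Hypothetical stand-in for LightningModule.init_ddp_connection."""
    # All ranks must agree on the rendezvous address. MASTER_ADDR and
    # MASTER_PORT are the standard torch.distributed env vars; the
    # localhost/port defaults here are illustrative only.
    os.environ.setdefault('MASTER_ADDR', '127.0.0.1')
    os.environ.setdefault('MASTER_PORT', '12910')

    # init_process_group blocks until all world_size ranks have joined,
    # so a missing or crashed rank makes the whole job hang at this call:
    # with the prints added in this commit, a run that shows 'starting ddp'
    # but never 'ddp started' points straight at the rendezvous.
    dist.init_process_group(backend='nccl', rank=proc_rank, world_size=world_size)
```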
