Fix global_step when gradient accumulation > 1 (#832)
Peter Izsak authored Feb 16, 2020
1 parent 4ae31cd commit 27bba1a
Showing 1 changed file with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion pytorch_lightning/trainer/training_loop.py
```diff
@@ -426,7 +426,9 @@ def run_training_epoch(self):
                 # logs user requested information to logger
                 self.log_metrics(batch_step_metrics, grad_norm_dic)

-                self.global_step += 1
+                # progress global step according to grads progress
+                if (self.batch_idx + 1) % self.accumulate_grad_batches == 0:
+                    self.global_step += 1
                 self.total_batch_idx += 1

                 # end epoch early
```
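The logic of the fix can be sketched outside the trainer: when gradients are accumulated over several batches, the optimizer only steps once per `accumulate_grad_batches` batches, so `global_step` should advance at the same rate rather than once per batch. A minimal standalone sketch (the `run_epoch` function is illustrative, not Lightning's actual API):

```python
# Minimal sketch (not Lightning's real training loop) showing why
# global_step must only advance on batches where the optimizer steps.
def run_epoch(num_batches, accumulate_grad_batches):
    global_step = 0
    total_batch_idx = 0
    for batch_idx in range(num_batches):
        # ... forward/backward would happen here; the optimizer only
        # steps once every `accumulate_grad_batches` batches ...
        if (batch_idx + 1) % accumulate_grad_batches == 0:
            global_step += 1  # one optimizer step == one global step
        total_batch_idx += 1  # counts every batch regardless
    return global_step, total_batch_idx

# Before this fix, global_step reached 8 here; with it, only 2:
print(run_epoch(num_batches=8, accumulate_grad_batches=4))  # (2, 8)
```

With `accumulate_grad_batches=1` the condition is true on every batch, so the previous unconditional `self.global_step += 1` behavior is preserved.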
