
Start accumulate gradients schedule at epoch 0 (continued) #2513

Merged (7 commits) on Jul 9, 2020
Conversation

HHousen
Contributor

@HHousen commented Jul 5, 2020

What does this PR do?

Continuation of #2490.

Fixes #2480. When the pl.Trainer option accumulate_grad_batches was passed as an integer, the first epoch (epoch 0) used accumulate_grad_batches=1, while all remaining epochs used accumulate_grad_batches=<user_value>. This regression was introduced in #2289.

Before: (screenshot showing the incorrect accumulation schedule)

After: (screenshot showing the corrected schedule)
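The effect of the fix can be illustrated with a minimal sketch. This is not Lightning's actual internals; build_schedule and factor_for_epoch are hypothetical helpers that mimic how an integer accumulate_grad_batches is turned into an {epoch: factor} schedule, and how the starting epoch key determines what epoch 0 does:

```python
def build_schedule(accumulate_grad_batches, start_epoch):
    """Convert an integer setting into an {epoch: factor} schedule
    that takes effect at start_epoch (hypothetical helper)."""
    return {start_epoch: accumulate_grad_batches}


def factor_for_epoch(schedule, epoch):
    """Return the accumulation factor in effect at `epoch`: the value
    for the largest scheduled epoch <= epoch, defaulting to 1."""
    factor = 1
    for start, value in sorted(schedule.items()):
        if epoch >= start:
            factor = value
    return factor


# Before the fix: the schedule started at epoch 1, so epoch 0
# silently fell back to accumulating every batch (factor 1).
buggy = build_schedule(4, start_epoch=1)
print([factor_for_epoch(buggy, e) for e in range(3)])  # [1, 4, 4]

# After the fix: the schedule starts at epoch 0, so every epoch
# uses the user-supplied value.
fixed = build_schedule(4, start_epoch=0)
print([factor_for_epoch(fixed, e) for e in range(3)])  # [4, 4, 4]
```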

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typo fixes and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together? Otherwise, we ask you to create a separate PR for every change.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@mergify bot requested a review from a team July 5, 2020 14:07

codecov bot commented Jul 5, 2020

Codecov Report

Merging #2513 into master will decrease coverage by 1%.
The diff coverage is 100%.

@@          Coverage Diff           @@
##           master   #2513   +/-   ##
======================================
- Coverage      88%     87%   -1%     
======================================
  Files          69      69           
  Lines        5540    5628   +88     
======================================
- Hits         4900    4899    -1     
- Misses        640     729   +89     

Contributor

@awaelchli left a comment


nice, this looks good!

tests/trainer/test_trainer.py (review comment, resolved)
@mergify bot requested a review from a team July 5, 2020 21:59
@pep8speaks

pep8speaks commented Jul 5, 2020

Hello @HHousen! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-07-06 00:24:32 UTC

@williamFalcon merged commit 992a7e2 into Lightning-AI:master on Jul 9, 2020
@Borda Borda added the bug Something isn't working label Jul 9, 2020
@Borda Borda added this to the 0.8.x milestone Jul 9, 2020
Successfully merging this pull request may close these issues.

For versions >0.8.2 learning rate is zero for last epoch (potentially a logging bug)