
update batch size in LightningModule.datamodule when auto scaling batch size #3266

Merged: 16 commits merged into master from bugfix/dm-batchfinder on Sep 3, 2020

Conversation

awaelchli (Member) commented Aug 30, 2020

What does this PR do?

Fixes #3233

The new test added in this PR fails on master.
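For illustration, a minimal sketch of the behaviour this PR targets, assuming a datamodule that owns the batch_size attribute. The RandomDataModule and TinyModel below are hypothetical, and the exact call that triggers the scaler differs between Lightning versions:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class RandomDataModule(pl.LightningDataModule):
    """Hypothetical datamodule; with this fix the tuner writes the scaled batch_size back here."""

    def __init__(self, batch_size: int = 2):
        super().__init__()
        self.batch_size = batch_size

    def train_dataloader(self):
        dataset = TensorDataset(torch.randn(64, 32), torch.randn(64, 2))
        return DataLoader(dataset, batch_size=self.batch_size)


class TinyModel(pl.LightningModule):
    """Hypothetical minimal model so the example is self-contained."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


if __name__ == "__main__":
    dm = RandomDataModule()
    trainer = pl.Trainer(auto_scale_batch_size=True, max_epochs=1)
    # the entry point that runs the batch size scaler depends on the Lightning
    # version (trainer.tune in 1.x releases, trainer.fit around the 0.9.x series)
    trainer.tune(TinyModel(), datamodule=dm)
    print(dm.batch_size)  # after this PR, reflects the batch size found by the scaler
```

Before this fix, the scaler only updated model.batch_size or model.hparams, so a batch size defined on the datamodule was silently left unchanged.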

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together? Otherwise, we ask you to create a separate PR for every change.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

awaelchli added the bug (Something isn't working) label Aug 30, 2020
awaelchli changed the title from "update batch size on datamodules when auto scaling batch size" to "update batch size in LightningModule.datamodule when auto scaling batch size" Aug 30, 2020
codecov bot commented Aug 30, 2020

Codecov Report

Merging #3266 into master will decrease coverage by 4%.
The diff coverage is 71%.

@@           Coverage Diff           @@
##           master   #3266    +/-   ##
=======================================
- Coverage      90%     86%    -4%     
=======================================
  Files          90      91     +1     
  Lines        8158    8700   +542     
=======================================
+ Hits         7362    7499   +137     
- Misses        796    1201   +405     

awaelchli marked this pull request as ready for review August 30, 2020 15:35
mergify bot (Contributor) commented Aug 31, 2020

This pull request is now in conflict... :(

Comment on lines 286 to 287
if trainer.datamodule is not None and hasattr(trainer.datamodule, batch_arg_name):
    setattr(trainer.datamodule, batch_arg_name, new_size)
Member:

Is this necessary? Shouldn't lightning_setattr take care of this?

Member Author (awaelchli):

You mean accessing it through model.trainer.datamodule? Ok, I'll try that.
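For reference, a simplified sketch of the direction discussed here, where the setter itself checks the datamodule reachable through model.trainer. This is hypothetical code, not necessarily the exact implementation that was merged:

```python
def lightning_setattr(model, attribute, value):
    """Simplified sketch: write `attribute` to wherever it is found on the model,
    its hparams, or the datamodule registered on the trainer."""
    if hasattr(model, attribute):
        setattr(model, attribute, value)
    elif hasattr(model, "hparams") and isinstance(model.hparams, dict) and attribute in model.hparams:
        model.hparams[attribute] = value
    elif hasattr(model, "hparams") and hasattr(model.hparams, attribute):
        setattr(model.hparams, attribute, value)
    # the attribute (e.g. batch_size) may instead live on the datamodule
    # reachable through model.trainer, as suggested above
    elif (
        getattr(model, "trainer", None) is not None
        and model.trainer.datamodule is not None
        and hasattr(model.trainer.datamodule, attribute)
    ):
        setattr(model.trainer.datamodule, attribute, value)
    else:
        raise ValueError(f"{attribute} was not found on the model, its hparams, or its datamodule")
```

Routing the datamodule case through lightning_setattr keeps the batch size scaler itself free of special cases, which is the point of the reviewer's question.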

@@ -186,15 +186,17 @@ def lightning_hasattr(model, attribute):
            attr = attribute in model.hparams
        else:
            attr = hasattr(model.hparams, attribute)
    elif hasattr(model.datamodule, attribute):
Member:

This is a bit confusing relative to the line above... mind adding a comment explaining what this case is about?
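For context, a simplified sketch of how the datamodule branch fits into lightning_hasattr, with the kind of explanatory comment the reviewer is asking for. It mirrors the structure of the diff rather than the exact merged code:

```python
def lightning_hasattr(model, attribute):
    """Simplified sketch: look for `attribute` on the model, its hparams, or its datamodule."""
    if hasattr(model, attribute):
        attr = True
    # hparams may be a plain dict or a namespace-like object
    elif hasattr(model, "hparams"):
        if isinstance(model.hparams, dict):
            attr = attribute in model.hparams
        else:
            attr = hasattr(model.hparams, attribute)
    # the attribute (e.g. batch_size) may be defined on the LightningDataModule
    # attached to the model rather than on the model or its hparams
    elif getattr(model, "datamodule", None) is not None and hasattr(model.datamodule, attribute):
        attr = True
    else:
        attr = False
    return attr
```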

awaelchli changed the title to "[WIP] update batch size in LightningModule.datamodule when auto scaling batch size" Sep 3, 2020
awaelchli changed the title back to "update batch size in LightningModule.datamodule when auto scaling batch size" Sep 3, 2020
rohitgr7 (Contributor) left a comment:

Nice!

Outdated review threads (resolved):
pytorch_lightning/trainer/training_tricks.py (2)
pytorch_lightning/utilities/parsing.py (3)
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
SkafteNicki (Member) left a comment:

LGTM

awaelchli merged commit 48c22c8 into master Sep 3, 2020
awaelchli deleted the bugfix/dm-batchfinder branch September 3, 2020 20:07
awaelchli mentioned this pull request Sep 7, 2020
Labels
bug Something isn't working
Development

Successfully merging this pull request may close these issues.

auto_scale_batch_size not working with datamodule
6 participants