
Cap batch size by number of training samples when using auto_scale_batch_size #3259

Closed · maxjeblick opened this issue Aug 29, 2020 · 1 comment · Fixed by #3271

@maxjeblick (Contributor) commented:

🐛 Bug

The batch size finder settles on an unrealistically high batch size when the entire training dataset fits into a single batch: every trial batch succeeds, so the size keeps doubling without bound.

...
Batch size 8388608 succeeded, trying batch size 16777216
Batch size 16777216 succeeded, trying batch size 33554432
Batch size 33554432 succeeded, trying batch size 67108864
Finished batch size finder, will continue with full run using batch size 67108864

To Reproduce

Steps to reproduce the behavior:

  1. Run the MNIST example with auto_scale_batch_size=True (one needs to remove the hardcoded batch size and set self.batch_size instead); a minimal reproduction sketch follows below.
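For reference, a minimal self-contained sketch that triggers the same runaway doubling. The module and tensor shapes are illustrative, not the actual MNIST example; trainer.tune is used here on Lightning versions that expose it, while older releases ran the search inside trainer.fit:

```python
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class TinyModel(pl.LightningModule):
    def __init__(self, batch_size=2):
        super().__init__()
        # The batch size finder reads and rewrites this attribute in place.
        self.batch_size = batch_size
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return F.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())

    def train_dataloader(self):
        # Only 64 samples: every trial batch fits into memory,
        # so doubling the batch size never fails.
        data = TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))
        return DataLoader(data, batch_size=self.batch_size)


model = TinyModel()
trainer = pl.Trainer(auto_scale_batch_size=True, max_epochs=1)
trainer.tune(model)  # runs the batch size finder before fitting
```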

Expected behavior

The batch size search space should be capped at the number of available training samples; a sketch of one possible cap follows below.
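One way to address this (a sketch only; the function name and integration point are hypothetical and not necessarily the actual fix in #3271) is to clamp every proposed size against the dataset length inside the scaling loop:

```python
def clamp_batch_size(proposed: int, dataset_len: int) -> int:
    """Never propose a batch size larger than the training dataset."""
    return min(proposed, dataset_len)

# Inside the power-scaling loop (illustrative):
#   new_size = clamp_batch_size(current_size * 2, len(train_dataloader.dataset))
#   if new_size == current_size:  # cannot grow any further, stop searching
#       break
```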

@maxjeblick added the bug (Something isn't working) and help wanted (Open to be worked on) labels on Aug 29, 2020
@github-actions (Contributor) commented:

Hi! Thanks for your contribution, great first issue!
