auto_scale_batch_size not working with datamodule #3233
Comments
I knew this bug report would eventually come :) The same will happen for auto_lr_find.
All of these should be considered by both `lr_find` and `auto_scale_batch_size`.

@awaelchli So currently there is no solution to this issue?

No, you need to set it manually, sorry.

@awaelchli I completely agree that it should be a simple fix, since all the attribute getting/setting is handled by our own
@carmocca As you may already know, we fixed this. Just a note in case you didn't see it: you now need to call `.tune()` instead of `.fit()`:

`trainer.tune(lightning_module, datamodule=lightning_data_module)`

This is to better distinguish the training step from the tuning step. However, it may be subject to change since there are some refactors happening right now.
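The tune-then-fit split described above can be sketched with stub classes. These are illustrative stand-ins, not the real `Trainer`, `LightningModule`, or `LightningDataModule` from pytorch-lightning; the point is only that tuning mutates `batch_size` in place on whichever object owns it, and training is a separate call:

```python
class StubDataModule:
    """Stand-in for a LightningDataModule that owns the batch size."""
    def __init__(self, batch_size=32):
        self.batch_size = batch_size

class StubTrainer:
    """Stand-in illustrating the tune-then-fit split."""
    def tune(self, model, datamodule=None):
        # Tuning resolves batch_size on the datamodule if one is given,
        # otherwise on the model, and overwrites it with the found value.
        target = datamodule if datamodule is not None else model
        target.batch_size = 64  # pretend the tuner found a larger workable size
    def fit(self, model, datamodule=None):
        pass  # actual training would happen here, using the tuned batch_size

dm = StubDataModule(batch_size=32)
trainer = StubTrainer()
trainer.tune(model=None, datamodule=dm)  # tune first ...
trainer.fit(model=None, datamodule=dm)   # ... then fit as a separate step
print(dm.batch_size)  # → 64
```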
@awaelchli If it stays, in v1.0?
I am not familiar with the … Right now, I can use … If I understand correctly, when …
I'm assuming `tune` doesn't run `fit` automatically when it is finished. What would happen if the code ran
I do not know how to answer these questions. @williamFalcon added the `tune` method, so it would be best to ask him how he sees it being used in the future.
@carmocca I think the process in the future will be:
It should still be easy for the user to use these features. The moving of the tuning from
@awaelchli, I have the same issue. My code is:
I tracked the issue, and I think there is a problem in the logic from parsing.py line 186 to line 193. The problem is that I have an `hparams` attribute in my model (I don't know where that came from), but it doesn't contain a `batch_size` attribute; `batch_size` is an attribute of the datamodule. If this if condition is executed, then I think there would not be a problem.
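The failure mode described above can be sketched as a resolution order over candidate owners. This is a simplified illustration, not the actual parsing.py code: the names `find_attr_owner`, `Hparams`, `Model`, and `DataModule` are all hypothetical. The key point is that an `hparams` object which exists but lacks the attribute must not short-circuit the search before the datamodule is checked:

```python
def find_attr_owner(model, name, datamodule=None):
    """Return the first object that actually defines `name`:
    the model itself, then model.hparams, then the datamodule."""
    if hasattr(model, name):
        return model
    hparams = getattr(model, "hparams", None)
    # an hparams object that merely exists must not win; it has to
    # actually define the attribute we are looking for
    if hparams is not None and hasattr(hparams, name):
        return hparams
    if datamodule is not None and hasattr(datamodule, name):
        return datamodule
    raise AttributeError(f"{name!r} not found on model, hparams, or datamodule")

class Hparams:            # hparams exists but has no batch_size
    pass

class Model:
    hparams = Hparams()

class DataModule:
    batch_size = 32

owner = find_attr_owner(Model(), "batch_size", DataModule())
print(type(owner).__name__)  # → DataModule
```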
🐛 Bug
The `Trainer` expects the `LightningModule` to have `self.batch_size` (see `scale_batch_size()` in training_tricks.py). However, if one is using the new `LightningDataModule`, that should be the class with `self.batch_size` defined.

To Reproduce
Expected behavior
`auto_scale_batch_size` should work using a `LightningDataModule`.
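For context on what the tuner is doing with that attribute, the power-mode search can be sketched roughly as: keep doubling the batch size until it no longer fits, then return the last size that worked. This is a simplified illustration, not the actual `scale_batch_size()` code; the real implementation probes memory by running training steps and catching OOM errors, which is modelled here by a hypothetical `fits` predicate:

```python
def scale_batch_size(fits, init=2, max_trials=25):
    """Power-mode search sketch: double the batch size while `fits`
    still succeeds, then return the last size that worked."""
    bs = init
    for _ in range(max_trials):
        if fits(bs * 2):
            bs *= 2
        else:
            break
    return bs

# pretend memory runs out above 256 samples per batch
print(scale_batch_size(lambda bs: bs <= 256))  # → 256
```

Whichever object defines `batch_size` (module or datamodule) is where the result of this search needs to be written back, which is exactly what this issue is about.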
Environment