Make auto_scale_batch_size error message more informative #15276

Closed
PaulLerner opened this issue Oct 24, 2022 · 3 comments · Fixed by #15351
Labels
feature Is an improvement or enhancement good first issue Good for newcomers tuner
Comments

@PaulLerner
Contributor

PaulLerner commented Oct 24, 2022

📚 Documentation

The issue

Hi,
From #3233, I understand that the current error message for auto_scale_batch_size misconfiguration message dates back from a time where LightningDataModule did not exist. The message is still the same today: "Field {self._batch_arg_name} not found in both model and model.hparams". Typically "Field batch_size..."

This message is misleading. It made me think I needed to add a batch_size attribute to my model, as if the tuning were independent of the data module.

How to fix

I’m new to Lightning, so I’m not sure how to phrase this (that’s why I’m opening an issue instead of a PR). How about: f"Field {self._batch_arg_name} not found in {model.__class__.__name__}, {datamodule.__class__.__name__}, nor their hparams attributes."?

@PaulLerner PaulLerner added the needs triage Waiting to be triaged by maintainers label Oct 24, 2022
@PaulLerner
Contributor Author

Update: looking at https://github.com/Lightning-AI/lightning/blob/56ca4aafee73403c82811b37d2b9f83e48e5d12e/src/pytorch_lightning/utilities/parsing.py#L306 it seems tricky to access the datamodule class name. Perhaps a more generic message like f"Field {self._batch_arg_name} not found in model, datamodule, nor their hparams attributes." is sufficient.

@awaelchli
Contributor

@PaulLerner Your proposal seems fine to me. Let's go for the simple solution first; it should already be an improvement, and we can think of ways to make it even better later.
Would you be interested in working on it?

@awaelchli awaelchli added feature Is an improvement or enhancement good first issue Good for newcomers tuner and removed needs triage Waiting to be triaged by maintainers labels Oct 24, 2022
@awaelchli awaelchli added this to the v1.9 milestone Oct 24, 2022
@PaulLerner
Contributor Author

Yes I can open a PR!
While I’m at it, maybe I should also update the Trainer docstring. For auto_scale_batch_size it states: "The result will be stored in self.batch_size in the LightningModule", but the attribute is actually set with lightning_setattr. So the doc should instead state: "The result will be stored in self.batch_size in the LightningModule or LightningDataModule, depending on your setup."?
