Raise LR schedule warnings only when necessary #3207
Conversation
whoops seems some other debugging stuff snuck in here. removing
Thanks!
if t_max.unit != max_dur.unit:
    # Units are not comparable, so we cannot check if t_max is valid. Log this and return.
    log.debug(
        f'Since max_duration {max_dur} with units {max_dur.unit} and t_max {t_max} with units {t_max.unit} are not '
        'comparable, make sure that your LR schedule is defined at all points in the training duration.',
    )
It seems like this log statement can be very repetitive as it emanates from __call__. Should this be part of __init__ instead?
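For context, a minimal sketch of why a log in __call__ repeats (the names here are illustrative, not Composer's actual scheduler API): the trainer invokes the scheduler once per optimization step, so anything logged there is re-emitted on every step.

import logging

log = logging.getLogger(__name__)

class SketchScheduler:
    """Illustrative only: schedulers are called once per training step."""

    def __init__(self, t_max: int):
        self.t_max = t_max

    def __call__(self, step: int) -> float:
        # This body runs on every step, so a log here fires thousands
        # of times over a run, hence the repetitiveness noted above.
        log.debug('checking schedule bounds at step %d', step)
        return max(0.0, 1.0 - step / self.t_max)

sched = SketchScheduler(t_max=10)
for step in range(3):
    sched(step)  # each call re-emits the debug message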
I agree with Antoine -- the problem described in the issue is less the warning level and more the frequency of logging
I think the log was still an error because the durations were equal through the dataloader length even though the units were different. The current refactoring solves this with the early returns.
Now that I give it a second thought, I'm not sure my proposal of handling it in __init__ makes sense, because max_duration can change if .fit() is called multiple times.
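One possible middle ground, sketched below with a hypothetical warn_incomparable_units helper (this is not what the PR implements): dedupe the message per (t_max, max_duration) pair, so a repeated .fit() call with a new max_duration still gets its own log while the per-step path stays quiet.

import logging

log = logging.getLogger(__name__)

_warned_pairs = set()  # hypothetical cache of pairs already logged

def warn_incomparable_units(t_max, max_dur):
    # Log once per distinct (t_max, max_duration) pair: a later .fit()
    # with a different max_duration produces a fresh message, but the
    # per-step __call__ path is silent after the first occurrence.
    key = (str(t_max), str(max_dur))
    if key in _warned_pairs:
        return
    _warned_pairs.add(key)
    log.debug(
        'max_duration %s and t_max %s are not comparable; make sure your '
        'LR schedule is defined at all points in the training duration.',
        max_dur, t_max,
    )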
* compare and exit
* compare and exit
* revert changes
* edge case ://
What does this PR do?
Simple bug fix -- we were raising warnings for LR schedulers when not intended because we were not exiting out of the function in valid cases. Fixes #3205
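As a rough illustration of the early-return shape of the fix (the Duration type and comparisons below are simplified stand-ins, not Composer's actual code):

from dataclasses import dataclass
import logging

log = logging.getLogger(__name__)

@dataclass
class Duration:
    value: int
    unit: str  # e.g. 'ep' for epochs or 'ba' for batches

def check_schedule(t_max: Duration, max_dur: Duration) -> None:
    if t_max.unit != max_dur.unit:
        # Units are not comparable, so validity cannot be checked:
        # note it and return instead of falling through to the warning.
        log.debug('t_max %s and max_duration %s are not comparable.', t_max, max_dur)
        return
    if t_max.value >= max_dur.value:
        # Schedule covers the full training duration: valid, exit early.
        # Previously a case like this fell through and warned anyway.
        return
    log.warning('LR schedule ends before max_duration is reached.')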
What issue(s) does this change relate to?
Before submitting
Did you run pre-commit on your change? (see the pre-commit section of prerequisites)