Allow smaller t_max in schedulers #3141

Open
priba opened this issue Mar 22, 2024 · 8 comments
Labels: bug (Something isn't working)

Comments

priba (Contributor) commented Mar 22, 2024

Problem

PR #3115 added some checks to the schedulers. More precisely, _raise_if_max_duration_exceeds_t_max raises the error:

't_max {t_max} must be greater than or equal to max_duration {max_dur}. Otherwise, the LR schedule will  not be defined for the entire training duration.'

when the duration of the scheduler is less than the max duration of the trainer.

Expected behavior

I don't think this should be an error; a warning might be more appropriate. Say we want to use a linear scheduler only for the first half of training: I should be able to set t_max to the expected length, which is less than max_dur, shouldn't I?
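
For concreteness, something along these lines currently raises (a rough sketch; the scheduler arguments, durations, and model setup are placeholders):

```python
from composer import Trainer
from composer.optim import LinearScheduler

# Decay the LR multiplier over only the first half of training,
# i.e. t_max < max_duration.
scheduler = LinearScheduler(alpha_i=1.0, alpha_f=0.1, t_max='0.5dur')

trainer = Trainer(
    model=model,  # placeholder: any ComposerModel
    schedulers=scheduler,
    max_duration='10ep',  # the scheduler only covers the first 5 epochs
)
# -> raises the ValueError quoted above, instead of holding the LR
#    constant after the halfway point
```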

Additional context

Tagging the reviewer and author of the PR for visibility and to get their insights: @b-chu @snarayan21

priba added the bug (Something isn't working) label on Mar 22, 2024
priba changed the title from "Allow smaller t_mux in schedulers" to "Allow smaller t_max in schedulers" on Mar 22, 2024
b-chu (Contributor) commented Mar 22, 2024

Thanks for opening an issue! Training behavior would be undefined if you're training on an undefined part of your scheduler, so we don't allow that.

priba (Contributor, Author) commented Mar 22, 2024

> Training behavior would be undefined if you're training on an undefined part of your scheduler so we don't allow that

Well, according to the documentation, t_max is "the duration of this scheduler"; there is no restriction on what that duration should be. This seems to imply simply that the scheduler will no longer modify the learning rate past that point.

b-chu (Contributor) commented Mar 22, 2024

Yep, t_max is the duration of the scheduler, so it won't necessarily be defined past t_max. It depends on the exact scheduler you're talking about, but the ones that are defined past t_max don't raise an error. You can also define your own scheduler that skips the check.
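
For instance, any callable matching the ComposerScheduler signature is accepted by the Trainer and never runs this validation. A rough sketch (the cutoff and multiplier values are arbitrary):

```python
from composer.core import State

def linear_then_hold(state: State, ssr: float = 1.0) -> float:
    """Decay the LR multiplier linearly from 1.0 to 0.1 over the first
    half of training, then hold it constant (ssr is ignored here)."""
    elapsed = state.get_elapsed_duration()  # fraction of max_duration
    assert elapsed is not None, 'requires max_duration to be set'
    frac = min(elapsed.value / 0.5, 1.0)  # progress through the first half
    return 1.0 + (0.1 - 1.0) * frac
```

Passing schedulers=linear_then_hold to the Trainer should then behave like the half-duration linear schedule described above.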

priba (Contributor, Author) commented Mar 27, 2024

For example, the LinearWithWarmupScheduler, whose docstring reads: "Linearly adjusts the learning rate multiplier from ``alpha_i`` to ``alpha_f`` over ``t_{max}`` time."

b-chu (Contributor) commented Mar 27, 2024

Unfortunately, this error is still valid when the scheduler isn't explicitly defined beyond t_max. Feel free to modify specific schedulers, though.

antoinebrl (Contributor) commented

I am well aligned with @priba here. The scheduler is meant to adjust the learning rate with a specific strategy. If no scheduler is defined, or if the scheduler is not configured past a certain point, it's pretty clear that the learning rate remains constant.

antoinebrl (Contributor) commented

@mvpatel2000 I would be interested in having your perspective on this matter.

mvpatel2000 (Contributor) commented

@antoinebrl could you detail your use case a bit more? Having a concrete scenario for what you are doing that makes it difficult to define a proper schedule would be helpful. We've frequently seen users shoot themselves in the foot, which led to this validation.

I'm not quite sure why defining a full schedule is hard -- maybe we can make that easier? Or, if it's truly unavoidable, we can consider reverting. In either case, providing a use case that's a real issue would be very helpful to motivate the change.
