
Logging the learning rate #1205

Closed
maxime-louis opened this issue Mar 21, 2020 · 7 comments · Fixed by #1498
Labels
discussion (In a discussion stage) · feature (Is an improvement or enhancement) · help wanted (Open to be worked on)

Comments

@maxime-louis

Hey,

I think it would be a cool feature to add a flag enabling logging of the learning rate(s).

Thanks for your amazing work!

maxime-louis added the feature and help wanted labels on Mar 21, 2020
@github-actions
Contributor

Hi! Thanks for your contribution! Great first issue!

@awaelchli
Member

I think that's a great idea. Maybe it doesn't have to be a flag; it could be done by default, like the other metrics that are already plotted automatically.

Some things to consider:

  • How would it work for optimizers like Adam?
  • Optimizers may have different learning rates for different param groups (see the sketch after this list)
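
For illustration, here is where those values live in plain PyTorch (a toy sketch; the model and the two groups are made up):

```python
import torch

# Toy model with two param groups at different base learning rates.
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.Adam(
    [
        {"params": [model.weight], "lr": 1e-3},
        {"params": [model.bias], "lr": 1e-2},
    ]
)

# Each param group carries its own "lr", so there is no single value to log.
# For Adam, group["lr"] is only the base step size; the effective
# per-parameter update also depends on the running moment estimates.
for i, group in enumerate(optimizer.param_groups):
    print(f"param_group {i}: lr={group['lr']}")
```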

@maxime-louis
Author

For Adam it's tricky. Maybe log the scheduler information instead, i.e. its scaling of the initial learning rate. That would solve the param-group problem as well, I guess.
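
Something along these lines, as a sketch (assuming a plain PyTorch scheduler; get_last_lr() already reports one value per param group):

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

for epoch in range(30):
    # ... train for one epoch ...
    optimizer.step()
    scheduler.step()
    # get_last_lr() returns one entry per param group, so logging it
    # would cover multiple groups automatically.
    print(epoch, scheduler.get_last_lr())
```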

@Borda
Member

Borda commented Apr 9, 2020

@PyTorchLightning/core-contributors do we want to add dedicated logging for the LR, or just stick with logging these extra parameters as metrics?

Borda added the discussion label on Apr 9, 2020
@Ir1d
Contributor

Ir1d commented Apr 11, 2020

I proposed adding a Trainer.lr attribute in #1003, but we decided to use callbacks instead.

@justusschock
Member

I'd also stick with callbacks.
The simplest way to train a network doesn't involve any LR changes at all, and it makes no sense to log something that doesn't change by design.

However, for convenience we could provide an implementation of such a callback.
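
A minimal sketch of what that could look like (hypothetical; the hook name, trainer.optimizers, and logger.log_metrics are assumptions, and exact signatures vary across Lightning versions):

```python
import pytorch_lightning as pl


class LearningRateLoggerCallback(pl.Callback):
    """Hypothetical callback: log every optimizer's param-group LRs
    at the start of each training epoch."""

    def on_train_epoch_start(self, trainer, pl_module):
        metrics = {}
        for opt_idx, optimizer in enumerate(trainer.optimizers):
            for group_idx, group in enumerate(optimizer.param_groups):
                metrics[f"lr-optimizer{opt_idx}/group{group_idx}"] = group["lr"]
        trainer.logger.log_metrics(metrics, step=trainer.global_step)
```

Usage would just be trainer = pl.Trainer(callbacks=[LearningRateLoggerCallback()]).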

@SkafteNicki
Member

I implemented a callback that logs the learning rate for some experiments a while back. I can bring it up to date and open a PR if wanted.
