
Learning rate log callback #1498

Merged: 33 commits merged into Lightning-AI:master on Apr 30, 2020

Conversation

@SkafteNicki (Member) commented Apr 15, 2020

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

What does this PR do?

Fixes #1205
Implements a simple callback that logs learning rates during training.
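
For orientation, a minimal usage sketch (hedged: the import path and the placeholder LitModel below are assumptions based on this PR, not a verified snippet from the docs):

# Sketch: attach the callback so learning rates are picked up during training.
# `LitModel` is a placeholder LightningModule that configures an LR scheduler.
from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import LearningRateLogger

lr_logger = LearningRateLogger()
trainer = Trainer(max_epochs=10, callbacks=[lr_logger])
trainer.fit(LitModel())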

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@Borda added the feature (Is an improvement or enhancement) label Apr 15, 2020
@Borda added this to the 0.7.4 milestone Apr 15, 2020
@Borda (Member) left a comment:

  • add test case
  • changelog

mergify bot commented Apr 15, 2020

This pull request is now in conflict... :(

codecov bot commented Apr 15, 2020

Codecov Report

Merging #1498 into master will not change coverage.
The diff coverage is n/a.

@@          Coverage Diff           @@
##           master   #1498   +/-   ##
======================================
  Coverage      89%     89%           
======================================
  Files          72      72           
  Lines        4229    4229           
======================================
  Hits         3744    3744           
  Misses        485     485           

@awaelchli (Member) left a comment:

Love it!
I have minor comments :)

Review comments on pytorch_lightning/callbacks/lr_logger.py and CHANGELOG.md (outdated, resolved).
@Borda (Member) left a comment:

LGTM 🤖

Review comments on pytorch_lightning/callbacks/lr_logger.py and tests/trainer/test_callbacks.py (outdated, resolved).
mergify bot commented Apr 19, 2020

This pull request is now in conflict... :(

@Borda added the ready (PRs ready to be merged) label Apr 29, 2020
@williamFalcon merged commit 142bc02 into Lightning-AI:master Apr 30, 2020
@SkafteNicki deleted the feature/lr_log_callback branch May 25, 2020 08:18
@ivannz (Contributor) commented May 25, 2020

Thank you for this extremely useful callback, @SkafteNicki !

I've noticed that you use d = dict.fromkeys in lr_logger.py#L62 and later call d[key].append(...) at lr_logger.py#L84-L87. I think L62 is incorrect: because the dictionary created at L62 maps every key to the same list object, the appends at L84-L87 append to all lists at once. The following example demonstrates the issue:

names = ['a', 'b', 'c']

d1 = dict.fromkeys(names, [])  # every key references the SAME list object
d2 = {k: [] for k in names}    # every key gets its own independent list

d1['a'].append(1)
d1['c'].append(2)

d2['a'].append(1)
d2['c'].append(2)

print(f'd1 = {d1}')
print(f'd2 = {d2}')

prints

d1 = {'a': [1, 2], 'b': [1, 2], 'c': [1, 2]}
d2 = {'a': [1], 'b': [], 'c': [2]}

Contents of d1 do not seem to be the behaviour desired by the logic of ._extract_lr().

@SkafteNicki (Member, Author):

@ivannz good catch, I was not aware that fromkeys should not be used in combination with mutable objects. Could you do a PR to correct this?
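
The likely correction, sketched with the names from the comment above (the actual identifiers in lr_logger.py may differ):

# Give each key its own list instead of one list shared across all keys:
# d = dict.fromkeys(names, [])      # buggy: a single shared list object
d = {name: [] for name in names}    # fixed: independent list per key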

@ivannz (Contributor) commented May 25, 2020

@SkafteNicki Thank you, done.

As a side note: in my experiments I found it quite useful to disable the misconfiguration check, since it requires inconvenient branching (to remove LearningRateLogger) before calling the trainer when no scheduler is present.

What was the rationale behind raising a MisconfigurationException rather than issuing a warning, which would still allow the trainer to continue?

@SkafteNicki (Member, Author):

@ivannz there is not really any rationale behind it; I just assumed that everyone who uses this has a scheduler in their model. However, I can clearly see that it is smarter to issue a warning instead of an exception. Could you also add this to the PR?

@ivannz (Contributor) commented May 25, 2020

@SkafteNicki done. I am going to patch both in a single PR.
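
For context, a rough sketch of a warning-based check (hedged: the trainer attribute and the message are assumptions, not the code from the follow-up PR):

import warnings

# Sketch: instead of raising MisconfigurationException when no LR scheduler is
# configured, emit a warning and let training continue without LR logging.
# `trainer.lr_schedulers` is assumed to hold the configured schedulers.
if not trainer.lr_schedulers:
    warnings.warn(
        'LearningRateLogger found no learning rate schedulers; '
        'no learning rates will be logged.',
        RuntimeWarning,
    )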

@yitang commented Aug 15, 2020

How do I access the LR logged by LearningRateLogger? I couldn't find an example in the docs.

@SkafteNicki (Member, Author):

@yitang the learning rate(s) are stored in the lrs attribute of the LearningRateLogger.
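
A short sketch of reading the values after training (hedged: that lrs is a dict of per-scheduler lists is inferred from this thread, so the exact keys are an assumption):

lr_logger = LearningRateLogger()
trainer = Trainer(callbacks=[lr_logger], max_epochs=3)
trainer.fit(model)

# lrs maps each scheduler name to the learning-rate values recorded during training
for name, values in lr_logger.lrs.items():
    print(name, values)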

@williamFalcon (Contributor):

@SkafteNicki is this written in the docs?

@SkafteNicki (Member, Author):

@williamFalcon No. The original intention of the LearningRateLogger is that the lrs are logged like any other scalar to trainer.logger, so the user can interact with the values that way. However, it could be made clear in the docs that they are also stored in the lrs attribute.

Labels: feature (Is an improvement or enhancement), ready (PRs ready to be merged)
Projects: None yet
Development: Successfully merging this pull request may close these issues: Logging the learning rate
6 participants