set logger level for package #1718
Conversation
tests/test_profiler.py
Outdated
@@ -165,6 +167,8 @@ def test_advanced_profiler_describe(tmpdir, advanced_profiler):
    """
    ensure the profiler won't fail when reporting the summary
    """
    trainer = Trainer()  # logging is set up when a Trainer object is initialized
What about the model, or any other tool besides the Trainer?
@jeremyjordan will logging before the Trainer init still work, say when we set up callbacks, loggers, etc.? We would just need to make sure to import Trainer before all of that, right?
The other option, which I presented here, was to only add a stream handler for the lightning logger. I don't have a strong opinion on which way we handle it and am happy to switch this out for the alternative.
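For reference, the stream-handler alternative mentioned above could look roughly like this. This is a minimal sketch, not the PR's actual code; the logger name "pytorch_lightning" and the formatter string are assumptions.

```python
import logging

# Sketch: instead of calling logging.basicConfig() globally, attach a
# StreamHandler only to the package's own logger, leaving the user's
# root-logger configuration untouched. Logger name is an assumption.
logger = logging.getLogger("pytorch_lightning")
logger.setLevel(logging.INFO)

if not logger.handlers:  # avoid stacking duplicate handlers on re-import
    handler = logging.StreamHandler()
    handler.setFormatter(logging.Formatter("%(levelname)s:%(name)s:%(message)s"))
    logger.addHandler(handler)
```

Because only the package logger gets a handler, a user's own `logging.basicConfig()` call elsewhere is not overridden.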
This pull request is now in conflict... :(
Force-pushed from 41676ef to c09f1f8
Codecov Report
@@ Coverage Diff @@
##           master   #1718   +/-   ##
======================================
  Coverage      88%      88%
======================================
  Files          69       69
  Lines        4304     4306     +2
======================================
+ Hits         3794     3796     +2
  Misses        510      510
@tgaddair is this also just a caching/pip issue?
No, in this case it looks like something (possibly a race condition) is causing rank 1 to fail.
My suspicion is that it's hitting this code path when loading the model: https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/core/lightning.py#L1545 That suggests the hparams haven't been updated in the checkpoint yet, which may be a race condition (rank 1 reaches this point before rank 0 has had a chance to write out the hparams). I'll look into this further and hopefully put together a quick PR.
Cool, thanks. The sooner you figure it out, the better, since it's causing a difficult time for all PRs :]
I haven't been able to repro the issue locally, but my best guess at this point is still that there's a race condition, in which case #1786 should be able to address it.
Neither I nor CircleCI suffers from this issue at all, so I guess it is somehow connected with GH Actions...
Before submitting
What does this PR do?
Fixes #1503
If a user hasn't set up any logging config of their own, we still do it for them.
If they want to set up their own logging config, this approach is less obtrusive.
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃