Add support for logging hydra's hparams to TensorBoard. #1417
Conversation
Codecov Report

@@           Coverage Diff           @@
##           master   #1417    +/-  ##
=======================================
- Coverage      92%      92%    -<1%
=======================================
  Files          63       63
  Lines        3384     3389      +5
=======================================
+ Hits         3104     3107      +3
- Misses        280      282      +2
Hello @lkhphuc! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2020-04-22 13:49:19 UTC
The changelog needs to be rebased on the new release #1419
Will do when it's merged.
This pull request is now in conflict... :(
I recommend against converting the OmegaConf object to a dict unless there is absolutely no choice; you lose, for example, interpolation and mutation validation (the latter will become a much bigger thing in the upcoming OmegaConf 2.0 and Hydra 1.0, with the introduction of structured configs).
If there were a way to use Hydra without the ugly `**` keyword-argument expansion I would definitely use it, but everything I have tried so far suggests this is the only way that works.
As these are only the Trainer's flags and are handled by Lightning, I don't see any lost benefit for the user at all. Users can still use all the features Hydra provides when passing the config to their own code.
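The `**` expansion being debated above can be sketched as follows. This is purely illustrative: `make_trainer` is a stand-in for `pytorch_lightning.Trainer`, and the flag values are invented for the example.

```python
# Illustrative sketch of expanding a config mapping into keyword arguments.
# The keys mirror a couple of real Trainer flags, but make_trainer below is a
# stand-in, not pytorch_lightning.Trainer.
trainer_cfg = {"max_epochs": 10, "gradient_clip_val": 0.5}

def make_trainer(max_epochs=1000, gradient_clip_val=0.0, **extra):
    # A real Trainer would configure itself from these flags; here we just
    # echo them back so the effect of the expansion is visible.
    return {"max_epochs": max_epochs, "gradient_clip_val": gradient_clip_val}

settings = make_trainer(**trainer_cfg)
print(settings)
```

The downside omry points out is that once the config is flattened into plain keyword arguments this way, OmegaConf features such as interpolation and mutation validation no longer apply to those values.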
Have you tried this?
This is only partially true: you will get inferior error reporting for code or configuration errors.
requirements-extra.txt
Outdated

    matplotlib>=3.1.1
    hydra_core>=0.11.3
Please pin to 0.11.3 for now.
The next version (1.0.0) will break some compatibility, and it would be good to test it before you allow it.
One of the most important changes is that the supported Python versions are changing from >=2.7 to >=3.6.
I will have a release candidate for Hydra 1.0 in a few weeks; at that point, after a test, you can relax the restriction.
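In requirements-extra.txt, the pin being asked for could be spelled like this (the exact form is my assumption, not the PR's final diff; note the PyPI package name is `hydra-core`):

```
# cap below the compatibility-breaking 1.0.0 release
hydra-core>=0.11.3,<1.0
```

Once the 1.0 release candidate is tested, the upper bound can be relaxed.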
Hi @Borda, I rebased on master and pinned the hydra version for now as suggested. This is a very simple PR: instead of passing a Namespace as the Module's hparams, you can pass a DictConfig and everything in Lightning still works fine (this PR fixes the last remaining problem, logging hyper-parameters to TensorBoard). This is mostly for my own use case, so I'm not sure whether it's aligned with the Lightning roadmap; I just think this patch might help.
@williamFalcon what do you say about adding this dependency?
This pull request is now in conflict... :(
@williamFalcon can you give your opinion on adding this extra dependency?
I don't think it makes sense.
As I said earlier, I would be happy to chat with the PytorchLightning leads about what a proper integration might look like.
hi @omry ping me on lightning slack or pt slack?
sent you a direct message in the lightning slack.
yeah, no hydra dep. I need to think about it this week, but we should make it easy to use hydra.
Before submitting
What does this PR do?
As discussed in #807, I can already integrate Hydra with PyTorch-Lightning easily; the only thing missing is hparams logging to TensorBoard.
Usage becomes:
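(The original snippet did not survive extraction; the sketch below is my reconstruction of the typical pattern under the Hydra 0.11 API, assuming a `config.yaml` with `trainer` and `model` sections. `MyLightningModule` and the config layout are illustrative, not the PR's actual code.)

```python
import hydra
import pytorch_lightning as pl
from omegaconf import DictConfig

@hydra.main(config_path="config.yaml")
def main(cfg: DictConfig) -> None:
    # cfg is a DictConfig; with this PR it can be passed straight to the
    # LightningModule as hparams and gets logged to TensorBoard.
    model = MyLightningModule(hparams=cfg.model)  # MyLightningModule is a placeholder
    trainer = pl.Trainer(**cfg.trainer)           # ** expansion of the trainer flags
    trainer.fit(model)

if __name__ == "__main__":
    main()
```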
For this PR I don't think documentation is needed, but if wanted I can write up a minimal example of using Hydra with Lightning.