Lightning no longer works with non-primitive types in hparams #1095
Comments
First of all, I am not sure it's good practice to pass in objects like you described and call them "hyperparameters", but that's a different discussion.
I agree the base logger should not cast to string, for the reasons you mentioned.
Interesting point. @monney could you please describe a use-case where you need to log a sequence of parameters? Moreover, `range` as a function won't be serializable...

```python
param = "[1, 3, 5, 9]"
try:
    param = eval(param)
except Exception:
    pass
```
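As an aside, `ast.literal_eval` is a safer stand-in for `eval` in this kind of decoding, since it only accepts Python literals and cannot execute arbitrary code from a logged string; a minimal sketch:

```python
import ast

# Safer variant of the eval-based decoding: ast.literal_eval only parses
# Python literals (lists, numbers, strings, ...), so arbitrary code in a
# logged string cannot run.
param = "[1, 3, 5, 9]"
try:
    param = ast.literal_eval(param)
except (ValueError, SyntaxError):
    pass  # leave the value as a string if it is not a valid literal

print(param)  # [1, 3, 5, 9]
```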
@Borda
- Sequences: current alternative is passing in a string to be decoded in the model, something like `'[3,5,3,5,3,5,5]'`.
- Namespaces: better organization of parameters. Current alternative: a prefix on names, like `"backbone_init"`.
- Functions: swapping out compatible layers such as BatchNorm and InstanceNorm, or changing activation functions. BigGAN's repo is a good example of doing this kind of thing: https://github.com/ajbrock/BigGAN-PyTorch. Current alternative: a mapping within the model, like `"gn"` to use GroupNorm or `"bn"` to use BatchNorm.

I try to make hparams as complete a description of the experiment as possible, so non-primitives are helpful for this; the alternatives are messier.

Decoding: I think it's reasonable to use `eval` and assume an appropriate `repr` for what is passed in, with a warning if something fails. Currently I manually reconstruct hparams and construct the model from there. Pickling the whole hparams, or each non-primitive, also seems reasonable to me for full reproducibility.
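The pickling idea can be sketched as follows; `freeze_hparams` and `thaw_hparams` are hypothetical names for illustration, not Lightning API:

```python
import pickle

# Hypothetical sketch: pickle each non-primitive hparam for full
# reproducibility, keeping primitives as-is. Nothing here is Lightning API.
PRIMITIVES = (int, float, str, bool)

def freeze_hparams(hparams: dict) -> dict:
    return {
        k: v if isinstance(v, PRIMITIVES) else pickle.dumps(v)
        for k, v in hparams.items()
    }

def thaw_hparams(frozen: dict) -> dict:
    return {
        k: pickle.loads(v) if isinstance(v, bytes) else v
        for k, v in frozen.items()
    }

hparams = {"lr": 1e-3, "widths": [3, 5, 3, 5]}
assert thaw_hparams(freeze_hparams(hparams)) == hparams
```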
Sounds good, would you mind sending a PR with these suggestions?
🐛 Bug
I will often pass in things like a layer definition, e.g. to change from batch norm to group norm or to use a custom layer. Now that Lightning uses the hparams feature in TensorBoard, it errors out when these are present in hparams.
Code sample
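A minimal sketch of the situation described, using a stub class in place of a real layer type such as `torch.nn.GroupNorm` (all names here are illustrative, not the original repro):

```python
import argparse

# Hypothetical stand-in for a layer class stored in hparams.
class GroupNormStub:
    pass

hparams = argparse.Namespace(
    learning_rate=1e-3,        # primitive: accepted by the hparams plugin
    norm_layer=GroupNormStub,  # non-primitive: rejected by the summary writer
)

# TensorBoard's hparams support only handles primitives (int, float, str,
# bool), so values like `norm_layer` trigger the error reported here.
PRIMITIVES = (int, float, str, bool)
offending = {k for k, v in vars(hparams).items() if not isinstance(v, PRIMITIVES)}
print(offending)  # {'norm_layer'}
```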
Expected behavior
Lightning should convert non-primitive types to strings before passing them to the summary writer. This worked fine when hparams were logged as a text object.
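A minimal sketch of that conversion, as a hypothetical helper rather than actual Lightning code:

```python
# Hypothetical sketch of the proposed fix: stringify anything the TensorBoard
# summary writer cannot handle, and pass primitives through unchanged.
PRIMITIVES = (int, float, str, bool)

def sanitize_hparams(hparams: dict) -> dict:
    return {
        k: v if isinstance(v, PRIMITIVES) else str(v)
        for k, v in hparams.items()
    }

print(sanitize_hparams({"lr": 0.001, "widths": [3, 5, 3, 5]}))
# {'lr': 0.001, 'widths': '[3, 5, 3, 5]'}
```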
Environment
- PyTorch 1.4
- Ubuntu 16
- installed via `pip install pytorch_lightning`
- Python 3.7