
[hparams] save_hyperparameters doesn't save kwargs #2188

Closed
xiadingZ opened this issue Jun 15, 2020 · 5 comments · Fixed by #2253
Assignees: Borda
Labels: bug (Something isn't working), help wanted (Open to be worked on), waiting on author (Waiting on user action, correction, or update)
Milestone: 0.8.x

Comments

@xiadingZ

❓ Questions and Help

When I use hyperparameters as shown in the docs:

class LitMNIST(LightningModule):

    def __init__(self, layer_1_dim=128, learning_rate=1e-2, **kwargs):
        super().__init__()
        # call this to save (layer_1_dim=128, learning_rate=1e-2) to the checkpoint
        self.save_hyperparameters()

the model checkpoint doesn't save the args passed through kwargs. But those kwargs are important: args such as num_frames, img_size, img_std, ... are needed to create the dataloader, and writing them all explicitly in __init__ would be tedious. Hiding them in kwargs keeps the code clean.

Before, when I used hparams, this was fine. But now using hparams is no longer recommended, so is there a good way to deal with this problem?
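To make the problem concrete, here is roughly what happens (the checkpoint path and img_size value below are only placeholders):

model = LitMNIST(layer_1_dim=128, learning_rate=1e-2, img_size=224)  # img_size lands in **kwargs
print(model.hparams)  # layer_1_dim and learning_rate are saved, img_size is not

# after training, the checkpoint also lacks img_size, so the restored model
# cannot rebuild its dataloaders (placeholder path):
model = LitMNIST.load_from_checkpoint("lightning_logs/version_0/checkpoints/epoch=0.ckpt")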

@xiadingZ xiadingZ added the question Further information is requested label Jun 15, 2020

xiadingZ commented Jun 15, 2020

If I don't use hparams, I have to put all the args for the model, dataset, dataloader, etc. in a LightningModule's __init__ method, and save_hyperparameters doesn't save the args in kwargs. Is this really a good idea?

@Borda Borda added bug Something isn't working help wanted Open to be worked on and removed question Further information is requested labels Jun 15, 2020

Borda commented Jun 15, 2020

Could you please share your model example?

@xiadingZ (Author)

> Could you please share your model example?

class LitMNIST(LightningModule):

    def __init__(self, layer_1_dim=128, learning_rate=1e-2, **kwargs):
        super().__init__()
        # call this to save (layer_1_dim=128, learning_rate=1e-2) to the checkpoint
        self.save_hyperparameters()
        self.kwargs = kwargs
        ...
    
    def train_dataloader(self):
        img_size = self.kwargs['img_size']
        ...

I can train this model, but when I load it from a checkpoint, it says kwargs doesn't have img_size.
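A possible workaround might be to pass the missing kwargs again when restoring, since load_from_checkpoint appears to forward extra keyword arguments to __init__, but that still means the values are not stored in the checkpoint (placeholder path below):

# img_size has to be supplied again by hand because it was never saved
model = LitMNIST.load_from_checkpoint(
    "lightning_logs/version_0/checkpoints/last.ckpt",
    img_size=224,
)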


Borda commented Jun 16, 2020

> I can train this model, but when I load it from a checkpoint, it says kwargs doesn't have img_size.

I see, we need to ignore kwargs from the model hparams saving...
@xiadingZ mind adding a PR with a test for this case, and I'll finish it with a patch?
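Something like this rough sketch, assuming the expected behaviour is that kwargs end up in hparams (module and arg names here are only illustrative):

from pytorch_lightning import LightningModule


class KwargsModel(LightningModule):
    def __init__(self, layer_1_dim=128, **kwargs):
        super().__init__()
        self.save_hyperparameters()


def test_save_hyperparameters_includes_kwargs():
    model = KwargsModel(layer_1_dim=64, img_size=224)
    # explicit init args are captured today; the expectation is that kwargs are too
    assert model.hparams.layer_1_dim == 64
    assert model.hparams.img_size == 224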

@Borda Borda added the priority: 0 High priority task label Jun 16, 2020
@Borda Borda self-assigned this Jun 16, 2020
@Borda Borda added this to the 0.8.0 milestone Jun 16, 2020
@edenlightning edenlightning changed the title save_hyperparameters doesn't save kwargs [hparams] save_hyperparameters doesn't save kwargs Jun 17, 2020
@Borda Borda added waiting on author Waiting on user action, correction, or update and removed priority: 0 High priority task labels Jun 18, 2020
@Borda Borda modified the milestones: 0.8.0, 0.8.x Jun 18, 2020

s-rog commented Jun 19, 2020

Traceback (most recent call last):
  File "./training.py", line 101, in <module>
    main(hparam_trial)
  File "./training.py", line 86, in main
    model = module(hparams, fold_train, fold_val, data_dir+img_dir)
  File "../main/module.py", line 18, in __init__
    self.hparams = hparams
  File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/module.py", line 638, in __setattr__
    object.__setattr__(self, name, value)
  File "/opt/conda/lib/python3.6/site-packages/pytorch_lightning/core/lightning.py", line 1695, in hparams
    self.save_hyperparameters(hp, frame=inspect.currentframe().f_back.f_back)
  File "/opt/conda/lib/python3.6/site-packages/pytorch_lightning/core/lightning.py", line 1662, in save_hyperparameters
    cand_names = [k for k, v in init_args.items() if v == hp]
  File "/opt/conda/lib/python3.6/site-packages/pytorch_lightning/core/lightning.py", line 1662, in <listcomp>
    cand_names = [k for k, v in init_args.items() if v == hp]
  File "/opt/conda/lib/python3.6/site-packages/pandas/core/generic.py", line 1479, in __nonzero__
    f"The truth value of a {type(self).__name__} is ambiguous. "
ValueError: The truth value of a DataFrame is ambiguous. Use a.empty, a.bool(), a.item(), a.any() or a.all().

I'm guessing this is why 0.8.0 raises this error: it's trying to save all init args (including DataFrames, in my case), not just the ones in hparams?

Edit: #2250
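A minimal standalone repro of the comparison that blows up, as far as I can tell (nothing Lightning-specific, just pandas semantics; the names are made up):

from argparse import Namespace

import pandas as pd

hp = Namespace(lr=1e-3)
init_args = {"hparams": hp, "fold_train": pd.DataFrame({"a": [1, 2]})}

# mirrors the line from save_hyperparameters in the traceback above:
# comparing a DataFrame with == returns an element-wise DataFrame, and using
# that as the `if` condition calls bool() on it, which raises
# "The truth value of a DataFrame is ambiguous"
cand_names = [k for k, v in init_args.items() if v == hp]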
