
added override for hparams in load_from_ckpt #1797

Merged (8 commits, May 13, 2020)

Conversation

@williamFalcon (Contributor) commented May 12, 2020

Fixes #1474

Enables the following (I ran into a bunch of issues with transfer learning when moving across clusters)... ugh.

I think @tullie has suggested something like this?

# override some of the params with new values
MyLightningModule.load_from_checkpoint(
    PATH,
    hparam_overrides={'num_layers': 128, 'pretrained_ckpt_path': NEW_PATH}
)
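
As a rough illustration of what this override does, here is a minimal sketch of the merge step, assuming the initial behavior is a shallow top-level update of the checkpoint's hparams (the function name `apply_overrides` is illustrative, not the actual PR code):

```python
# Hypothetical sketch of the override step inside load_from_checkpoint:
# the checkpoint's saved hparams are updated with user-supplied values
# before the model is instantiated.
def apply_overrides(ckpt_hparams, hparam_overrides=None):
    hparams = dict(ckpt_hparams)          # copy, so the checkpoint dict is untouched
    if hparam_overrides:
        hparams.update(hparam_overrides)  # shallow: top-level keys are replaced whole
    return hparams

ckpt = {'num_layers': 64, 'lr': 1e-3, 'pretrained_ckpt_path': '/old/path'}
merged = apply_overrides(ckpt, {'num_layers': 128})
# merged['num_layers'] == 128, other keys unchanged
```

The shallow `update` is what the nested-dict discussion below is about: a top-level key holding a sub-dict gets replaced wholesale rather than merged.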

@justusschock (Member) commented May 12, 2020

This doesn't work if hparams is a nested dict, right?

Meaning if I have original hparams like

{'a': {'b': ..., 'c': ...}, 'd': ...}

I cannot replace only a.b; I'd have to specify a completely, right?

IMO we could think about flattening the hparams dict on save and reversing this on load to support that, but this would require some more changes...

@williamFalcon (Contributor, author) commented

good point. I guess we can support replacing nested hparams by passing nested updates here: match key by key within the tree.
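
A key-by-key merge like the one described here could be sketched as follows (a hypothetical helper, not code from the PR):

```python
def nested_update(original, updates):
    """Recursively merge `updates` into `original`, replacing only the
    leaves that are actually specified instead of whole subtrees."""
    for key, value in updates.items():
        if isinstance(value, dict) and isinstance(original.get(key), dict):
            nested_update(original[key], value)  # descend into matching subtrees
        else:
            original[key] = value                # leaf (or type mismatch): replace
    return original

hparams = {'a': {'b': 1, 'c': 2}, 'd': 3}
nested_update(hparams, {'a': {'b': 10}})
# only a.b changes; a.c and d survive
```

This is the alternative to flattening: the override dict mirrors the shape of the original, and unmentioned siblings are preserved.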

@justusschock (Member) commented

Should work, and it's probably the same effort as flattening the dict.
While we're at it: can TensorBoard log nested hparams? If not, we may want to flatten the dict anyway after the network is initialized.

@williamFalcon (Contributor, author) commented

you mean in the setter for hparams in the pl module?

@justusschock (Member) commented May 12, 2020

Yes, maybe, and also save it flattened (then you could access the entries with something like hparams['a.b'] in the example above) and unflatten it right before model init when loading from checkpoint...
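
The flatten/unflatten round trip sketched in this comment could look like the following (hypothetical helpers, assuming keys themselves never contain dots):

```python
def flatten(d, parent=''):
    """Flatten a nested dict into dot-notation keys:
    {'a': {'b': 1}} -> {'a.b': 1}."""
    out = {}
    for k, v in d.items():
        key = f'{parent}.{k}' if parent else k
        if isinstance(v, dict):
            out.update(flatten(v, key))
        else:
            out[key] = v
    return out

def unflatten(d):
    """Invert flatten(): {'a.b': 1} -> {'a': {'b': 1}}."""
    out = {}
    for k, v in d.items():
        node = out
        *parents, leaf = k.split('.')
        for p in parents:
            node = node.setdefault(p, {})
        node[leaf] = v
    return out

nested = {'a': {'b': 1, 'c': 2}, 'd': 3}
flat = flatten(nested)  # {'a.b': 1, 'a.c': 2, 'd': 3}
```

Flattening on save would let both loggers and hparam overrides address a single leaf by its dotted path, at the cost of a transform on every load.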

@williamFalcon (Contributor, author) commented

@Borda is this how we get doc tests?

@williamFalcon (Contributor, author) commented

@justusschock but if we flatten for the user it might be unexpected, no?

  1. maybe a better behavior: in the hparams setter in the module it becomes a Namespace, so that users don't have to use ['notation'].

  2. for loggers we flatten all of them with the dot notation?
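
Point 1 (converting the dict to a Namespace in the setter) could be sketched like this, assuming a recursive conversion so nested groups are also attribute-accessible (helper name is illustrative):

```python
from argparse import Namespace

def dict_to_namespace(d):
    """Recursively convert a dict to a Namespace so nested hparams can be
    read as attributes (hparams.a.b) instead of hparams['a']['b']."""
    return Namespace(**{
        k: dict_to_namespace(v) if isinstance(v, dict) else v
        for k, v in d.items()
    })

hparams = dict_to_namespace({'a': {'b': 1}, 'd': 3})
# hparams.a.b and hparams.d now work as attribute access
```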

@Borda (Member) commented May 12, 2020

@Borda is this how we get doc tests?

What do you mean?

@williamFalcon (Contributor, author) commented

I added the >>> on the docs.
Do I need something special to run as doc tests?

@Borda (Member) commented May 12, 2020

I added the >>> on the docs.
Do I need something special to run as doc tests?

It depends: if it is a Python file, you do not need anything extra.
If it is an rST file, you do not use >>> but put it in a testcode directive:

.. testcode::

    a = 1

then all of these are detected automatically and executed along with the other standard tests
cc: @awaelchli
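
For the Python-file case Borda mentions (where >>> needs nothing extra), a minimal doctest looks like this; `add_one` is a made-up example function, not anything from the PR:

```python
def add_one(x):
    """Example of a `>>>` doctest that runs with no extra setup.

    >>> add_one(41)
    42
    """
    return x + 1

if __name__ == '__main__':
    import doctest
    doctest.testmod()  # collects and runs the >>> examples in this module
```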

@Borda Borda changed the title added override for hparams in load_from_ckpt [blocked by #1805] added override for hparams in load_from_ckpt May 12, 2020
@Borda Borda added the feature Is an improvement or enhancement label May 12, 2020
@Borda Borda added this to the 0.7.6 milestone May 12, 2020
@Borda Borda changed the title [blocked by #1805] added override for hparams in load_from_ckpt added override for hparams in load_from_ckpt May 12, 2020
@mergify (bot) commented May 13, 2020

This pull request is now in conflict... :(

@codecov (bot) commented May 13, 2020

Codecov Report

Merging #1797 into master will increase coverage by 0%.
The diff coverage is 91%.

@@          Coverage Diff           @@
##           master   #1797   +/-   ##
======================================
  Coverage      88%     88%           
======================================
  Files          71      71           
  Lines        4394    4404   +10     
======================================
+ Hits         3862    3871    +9     
- Misses        532     533    +1     

@tullie (Contributor) left a review

LGTM

@Borda Borda requested a review from awaelchli May 13, 2020 07:50
@Borda Borda merged commit 35fe2ef into master May 13, 2020
@Borda Borda deleted the hp branch May 13, 2020 08:27
Labels: feature (Is an improvement or enhancement)

Successfully merging this pull request may close these issues:

Customizing hparams after loading checkpoint

5 participants