Backward compatibility for checkpoint loading (Lightning-AI#1132)
* check if hparams_type exists in checkpoint dictionary for backward compatibility

* concisely maintain backward compatibility for hparams type

* Bug fix in checkpoint loading (Lightning-AI#1132)
amoudgl authored and tullie committed Apr 3, 2020
1 parent c35a6bf commit 5651a25
Showing 2 changed files with 2 additions and 1 deletion.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -29,6 +29,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 ### Fixed
 
 - Fixed bug related to type checking of `ReduceLROnPlateau` lr schedulers ([#1114](https://github.com/PyTorchLightning/pytorch-lightning/issues/1114))
+- Fixed a bug to ensure lightning checkpoints are backward compatible ([#1132](https://github.com/PyTorchLightning/pytorch-lightning/pull/1132))
 
 ## [0.7.1] - 2020-03-07
 
2 changes: 1 addition & 1 deletion pytorch_lightning/core/lightning.py
@@ -1396,7 +1396,7 @@ def _load_model_state(cls, checkpoint: Dict[str, Any]) -> 'LightningModule':

         if cls_takes_hparams:
             if ckpt_hparams is not None:
-                is_namespace = checkpoint.get('hparams_type') == 'namespace'
+                is_namespace = checkpoint.get('hparams_type', 'namespace') == 'namespace'
                 hparams = Namespace(**ckpt_hparams) if is_namespace else ckpt_hparams
             else:
                 warnings.warn(
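
Why the one-argument change matters: checkpoints saved before `hparams_type` was recorded do not contain that key, so `checkpoint.get('hparams_type')` returned `None`, `is_namespace` evaluated to `False`, and legacy hparams dicts were no longer wrapped in a `Namespace` on load. Passing `'namespace'` as the default restores the old behavior for those files. Below is a minimal sketch of the effect; the `resolve_hparams` helper is hypothetical, for illustration only, and not part of the commit:

from argparse import Namespace

def resolve_hparams(checkpoint):
    # Hypothetical helper mirroring the patched logic in _load_model_state.
    ckpt_hparams = checkpoint.get('hparams', {})
    # Before the fix, .get('hparams_type') returned None for legacy
    # checkpoints, so is_namespace was False and a plain dict leaked through.
    is_namespace = checkpoint.get('hparams_type', 'namespace') == 'namespace'
    return Namespace(**ckpt_hparams) if is_namespace else ckpt_hparams

# Legacy checkpoint written before 'hparams_type' existed: loads as a Namespace.
print(resolve_hparams({'hparams': {'learning_rate': 0.02}}))
# -> Namespace(learning_rate=0.02)

# Newer checkpoint that explicitly records dict-style hparams: stays a dict.
print(resolve_hparams({'hparams': {'learning_rate': 0.02}, 'hparams_type': 'dict'}))
# -> {'learning_rate': 0.02}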
