
Fix Mixing hparams and arguments in LightningModule #1505

Merged
merged 9 commits on Apr 19, 2020
7 changes: 3 additions & 4 deletions pytorch_lightning/core/lightning.py
@@ -1427,6 +1427,8 @@ def load_from_checkpoint(
it stores the hyperparameters in the checkpoint if you initialized your :class:`LightningModule`
with an argument called ``hparams`` which is a :class:`~argparse.Namespace`
(output of :meth:`~argparse.ArgumentParser.parse_args` when parsing command line arguments).
Any additional arguments not included in ``hparams`` can be passed to the model
through \*args and \*\*kwargs.

Example:
.. code-block:: python
@@ -1537,10 +1539,7 @@ def _load_model_state(cls, checkpoint: Dict[str, Any], *args, **kwargs) -> 'Ligh

# load the state_dict on the model automatically
model_args = [hparams] if hparams else []
-if len(model_args) > 0:
-    model = cls(*model_args)
-else:
-    model = cls(*args, **kwargs)
+model = cls(*model_args, *args, **kwargs)
model.load_state_dict(checkpoint['state_dict'])

# give model a chance to load something
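The effect of the change can be sketched in isolation: when an ``hparams`` namespace is present it is prepended to the positional arguments, and any remaining ``*args``/``**kwargs`` are forwarded in either case, instead of being dropped whenever ``hparams`` exists. A minimal sketch, assuming a toy stand-in class (``ToyModel`` and ``load_model_state`` below are illustrative names, not the library's API):

```python
from argparse import Namespace


class ToyModel:
    """Stand-in for a LightningModule whose __init__ mixes hparams and extras."""

    def __init__(self, hparams, extra_dim=10):
        self.hparams = hparams
        self.extra_dim = extra_dim


def load_model_state(cls, hparams, *args, **kwargs):
    # Mirrors the fixed logic: prepend hparams (if any) to the positional
    # arguments, then forward the remaining args/kwargs unchanged.
    model_args = [hparams] if hparams else []
    return cls(*model_args, *args, **kwargs)


# With hparams plus an extra keyword argument: both reach the constructor.
model = load_model_state(ToyModel, Namespace(learning_rate=0.02), extra_dim=32)
assert model.hparams.learning_rate == 0.02
assert model.extra_dim == 32

# Without hparams: the plain *args/**kwargs path still works.
plain = load_model_state(ToyModel, None, Namespace(x=1))
assert plain.hparams.x == 1
```

Under the old ``if``/``else``, the first call would have invoked ``cls(hparams)`` alone and silently discarded ``extra_dim=32``; collapsing both branches into one call is what lets the two argument sources mix.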