
added warnings to unimplemented methods (Lightning-AI#1317)
* added warnings and removed default optimizer

* opt

* Apply suggestions from code review

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
2 people authored and akarnachev committed Apr 4, 2020
1 parent 321d9af commit b4a0413
Showing 3 changed files with 17 additions and 2 deletions.
4 changes: 3 additions & 1 deletion CHANGELOG.md
@@ -39,8 +39,10 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Added model configuration checking ([#1199](https://github.com/PyTorchLightning/pytorch-lightning/pull/1199))
- On DP and DDP2 unsqueeze is automated now ([#1319](https://github.com/PyTorchLightning/pytorch-lightning/pull/1319))
- Does not interfere with a default sampler ([#1318](https://github.com/PyTorchLightning/pytorch-lightning/pull/1318))
- Remove default Adam optimizer ([#1317](https://github.com/PyTorchLightning/pytorch-lightning/pull/1317))
- Give warnings for unimplemented required lightning methods ([#1317](https://github.com/PyTorchLightning/pytorch-lightning/pull/1317))
- Enhanced load_from_checkpoint to also forward params to the model ([#1307](https://github.com/PyTorchLightning/pytorch-lightning/pull/1307))
- Made `evalaute` method private >> `Trainer._evaluate(...)`. ([#1260](https://github.com/PyTorchLightning/pytorch-lightning/pull/1260))
- Made `evaluate` method private >> `Trainer._evaluate(...)`. ([#1260](https://github.com/PyTorchLightning/pytorch-lightning/pull/1260))

### Deprecated

12 changes: 11 additions & 1 deletion docs/source/introduction_guide.rst
@@ -269,7 +269,6 @@ In PyTorch we do it as follows:
In Lightning we do the same but organize it under the configure_optimizers method.
If you don't define this, Lightning will automatically use `Adam(self.parameters(), lr=1e-3)`.

.. code-block:: python

@@ -278,6 +277,17 @@

    def configure_optimizers(self):
        return Adam(self.parameters(), lr=1e-3)

.. note:: The LightningModule itself has the parameters, so pass in self.parameters()

However, if you have multiple optimizers, use the matching parameters:

.. code-block:: python

    class LitMNIST(pl.LightningModule):

        def configure_optimizers(self):
            return Adam(self.generator.parameters(), lr=1e-3), Adam(self.discriminator.parameters(), lr=1e-3)
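For context beyond this diff: when ``configure_optimizers`` returns several optimizers, Lightning passes an additional ``optimizer_idx`` argument to ``training_step`` so each update can use the matching sub-network. A minimal sketch, assuming hypothetical ``generator_loss``/``discriminator_loss`` helpers and ``generator``/``discriminator`` sub-modules:

.. code-block:: python

    class LitGAN(pl.LightningModule):

        def training_step(self, batch, batch_idx, optimizer_idx):
            # optimizer_idx tells us which of the returned optimizers is active
            if optimizer_idx == 0:
                loss = self.generator_loss(batch)       # hypothetical helper
            else:
                loss = self.discriminator_loss(batch)   # hypothetical helper
            return {'loss': loss}

        def configure_optimizers(self):
            return Adam(self.generator.parameters(), lr=1e-3), Adam(self.discriminator.parameters(), lr=1e-3)
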
Training step
^^^^^^^^^^^^^

Expand Down
3 changes: 3 additions & 0 deletions pytorch_lightning/core/lightning.py
@@ -224,6 +224,7 @@ def training_step(self, batch, batch_idx, hiddens):
The loss value shown in the progress bar is smoothed (averaged) over the last values,
so it differs from the values set in the train/validation step.
"""
warnings.warn('`training_step` must be implemented to be used with the Lightning Trainer')

def training_end(self, *args, **kwargs):
"""
@@ -1079,6 +1080,7 @@ def configure_optimizers(self):
}
"""
warnings.warn('`configure_optimizers` must be implemented to be used with the Lightning Trainer')

def optimizer_step(
self,
@@ -1280,6 +1282,7 @@ def train_dataloader(self):
return loader
"""
warnings.warn('`train_dataloader` must be implemented to be used with the Lightning Trainer')

def tng_dataloader(self): # todo: remove in v1.0.0
"""Implement a PyTorch DataLoader.
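For reference (not part of this diff): with these warnings in place, a LightningModule is expected to implement ``training_step``, ``train_dataloader``, and ``configure_optimizers`` itself; there is no longer a default Adam optimizer. A minimal sketch of a module that satisfies all three (the linear network, MNIST dataset, and hyperparameters are illustrative assumptions, not part of this commit):

.. code-block:: python

    import torch.nn.functional as F
    from torch import nn
    from torch.optim import Adam
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms
    import pytorch_lightning as pl


    class LitMNIST(pl.LightningModule):

        def __init__(self):
            super().__init__()
            self.layer = nn.Linear(28 * 28, 10)

        def forward(self, x):
            return self.layer(x.view(x.size(0), -1))

        def training_step(self, batch, batch_idx):
            x, y = batch
            loss = F.cross_entropy(self(x), y)
            return {'loss': loss}

        def train_dataloader(self):
            dataset = datasets.MNIST('.', train=True, download=True,
                                     transform=transforms.ToTensor())
            return DataLoader(dataset, batch_size=32)

        def configure_optimizers(self):
            # explicit optimizer -- Lightning no longer falls back to Adam
            return Adam(self.parameters(), lr=1e-3)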
