Update README.md
williamFalcon authored and tullie committed May 6, 2020
1 parent 2e3586d commit 3ecda02
Showing 1 changed file with 6 additions and 5 deletions.
README.md: 11 changes (6 additions, 5 deletions)
@@ -49,7 +49,7 @@ pip install pytorch-lightning
- [0.5.3.2](https://pytorch-lightning.readthedocs.io/en/0.5.3.2/)

## Demo
[MNIST, GAN, BERT, DQN on COLAB!](https://colab.research.google.com/drive/1F_RNcHzTfFuQf-LeKvSlud6x7jXYkG31#scrollTo=HOk9c4_35FKg)
[MNIST on TPUs](https://colab.research.google.com/drive/1-_LKx4HwAxl5M6xPJmqAAu444LTDQoa3)

## What is it?
@@ -83,10 +83,11 @@ And for the stuff that the Trainer abstracts out you can [override any part](htt
For example, here you could do your own backward pass

```diff
-def optimizer_step(self, current_epoch, batch_idx, optimizer, optimizer_idx,
-                   second_order_closure=None):
-    optimizer.step()
-    optimizer.zero_grad()
+class LitModel(LightningModule):
+    def optimizer_step(self, current_epoch, batch_idx, optimizer, optimizer_idx,
+                       second_order_closure=None):
+        optimizer.step()
+        optimizer.zero_grad()
```
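
The change above wraps the `optimizer_step` override in a `LightningModule` subclass. For context (not part of this commit), here is a minimal sketch of how such an override might slot into a complete module and training run. The module internals, the random dataset, and the Trainer arguments are illustrative assumptions, and the hook signature shown mirrors the 0.7-era API used in the snippet; it differs in later Lightning releases.

```python
# Illustrative sketch only: module internals, data, and Trainer arguments
# are assumptions, not part of this commit.
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.cross_entropy(self(x), y)
        return {"loss": loss}

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.02)

    # The override from the README snippet: take manual control of the update.
    def optimizer_step(self, current_epoch, batch_idx, optimizer, optimizer_idx,
                       second_order_closure=None):
        optimizer.step()
        optimizer.zero_grad()


# Train on random data just to exercise the hooks.
dataset = TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))
trainer = pl.Trainer(max_epochs=1)
trainer.fit(LitModel(), DataLoader(dataset, batch_size=8))
```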

For anything else you might need, we have an extensive [callback system](https://pytorch-lightning.readthedocs.io/en/latest/introduction_guide.html#callbacks) you can use to add arbitrary functionality not implemented by our team in the Trainer.
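
As an illustration (not part of this diff), a minimal sketch of a custom callback in the spirit of that guide; the `PrintingCallback` name and the specific hook names are assumptions, and the `callbacks` Trainer argument may differ between releases.

```python
import pytorch_lightning as pl
from pytorch_lightning.callbacks import Callback


class PrintingCallback(Callback):
    """Illustrative callback: prints a message at the start and end of training."""

    def on_train_start(self, trainer, pl_module):
        print("Training is starting")

    def on_train_end(self, trainer, pl_module):
        print("Training has ended")


# The Trainer invokes the callback hooks at the matching points in its loop.
trainer = pl.Trainer(callbacks=[PrintingCallback()], max_epochs=1)
```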
