I believe these docs are outdated, thanks for reporting.
The order was recently changed: #6147
zero_grad now comes before backward, and therefore I would say that on_before_zero_grad being called before on_after_backward is also correct.
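Based on that change, the effective per-batch order would roughly be the following (a sketch inferred from the comment above, not the actual Lightning loop code; model, optimizer, and train_dataloader are placeholders):

# Rough sketch of the loop order implied by the comment above
# (illustration only, not the actual Lightning source):
for batch in train_dataloader:
    model.on_before_zero_grad(optimizer)  # fires first ...
    optimizer.zero_grad()                 # ... since zero_grad now precedes backward
    loss = model.training_step(batch, 0)
    loss.backward()
    model.on_after_backward()             # hence fires after on_before_zero_grad
    optimizer.step()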
I would just need a method that is called right after optimizer_step(), so if there is any alternative, please let me know.
Thanks in advance.
Maybe in LightningModule:
def optimizer_step(self, *args, **kwargs):
    super().optimizer_step(*args, **kwargs)
    # do something after the optimizer step
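Spelled out inside a module (a minimal sketch; MyModel and the print are stand-ins for your own module and post-step logic):

import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    def optimizer_step(self, *args, **kwargs):
        super().optimizer_step(*args, **kwargs)  # Lightning performs the actual step here
        # code placed here runs right after the optimizer step;
        # the print is just a stand-in for whatever you need to do
        print("optimizer step finished")

This works because Lightning routes the update through LightningModule.optimizer_step, so anything after the super() call executes immediately after the optimizer has stepped.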
🐛 Hook orders are different from what is documented
The documentation says the hooks in the training loop are called in the following order:
The description of on_after_backward furthermore says:
For on_before_zero_grad it says:
Both of these match the training loop defined above.
However, if I use these methods as in the code below, on_before_zero_grad is always called before on_after_backward.
Reproduction
I've attached the code from the README.md of the GitHub repo, slightly modified by adding the methods mentioned above.
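For reference, a reproduction along these lines (a sketch with a stand-in model and random data, not the author's exact attachment):

import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class HookOrderModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)

    # the two hooks in question: print when each one fires
    def on_after_backward(self):
        print("on_after_backward")

    def on_before_zero_grad(self, optimizer):
        print("on_before_zero_grad")

data = TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))
trainer = pl.Trainer(max_epochs=1)
trainer.fit(HookOrderModel(), DataLoader(data, batch_size=16))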
Expected behavior
The methods are called in the order described in the documentation.
Environment
environment.yml file:
How you installed PyTorch (conda, pip, source): conda env create -f environment.yml
Additional context
I would just need a method that is called right after optimizer_step(), so if there is any alternative, please let me know.
Thanks in advance.