Support for retain_graph=True #356
Labels: feature (Is an improvement or enhancement), help wanted (Open to be worked on); added by momeara on Oct 11, 2019
Comments
William Falcon: why don’t we put a hook for the backward pass and you can override it if you need retain_graph? That keeps it flexible for anything else someone wants to do later.
momeara (replying to William Falcon, Oct 12, 2019): perfect! That keeps with the overall philosophy, which has been very useful btw.
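The hook approach discussed above might look like the following sketch. This is an illustration only: a user-overridable `backward` hook on the module is the proposal under discussion here, not an API that existed at the time of this issue, and the class below is a plain stand-in rather than a real `pl.LightningModule`.

```python
import torch

class MyModule:
    """Stand-in for a LightningModule with a hypothetical backward() hook."""

    def backward(self, loss):
        # The trainer's default would just call loss.backward(); overriding
        # the hook lets a user keep the autograd graph alive instead.
        loss.backward(retain_graph=True)

x = torch.ones(3, requires_grad=True)
loss = (2 * x).sum()

m = MyModule()
m.backward(loss)
m.backward(loss)  # succeeds only because the graph was retained
print(x.grad)     # gradients accumulate across the two passes
```

Because the override lives on the module, each model can choose its own backward behaviour without the trainer growing a new flag for every use case.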
Is your feature request related to a problem? Please describe.
Some models require retain_graph=True, but it's not possible to set it in the .backward() call inside of Trainer.__run_training_batch(...)
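For context (background, not part of the issue text): PyTorch frees the autograd graph after the first `.backward()` call, so a model that needs a second backward pass over the same graph must pass `retain_graph=True`. A minimal reproduction:

```python
import torch

x = torch.ones(2, requires_grad=True)
loss = (x ** 2).sum()

loss.backward(retain_graph=True)  # keep the graph for a second pass
loss.backward()                   # fine: the graph is still alive

y = torch.ones(2, requires_grad=True)
loss2 = (y ** 2).sum()
loss2.backward()                  # default: graph is freed here
try:
    loss2.backward()              # fails: graph already freed
except RuntimeError as err:
    print("second backward failed as expected:", err)
```

Since the trainer owns the `.backward()` call, a user has no way to reach this argument without the feature requested here.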
Describe the solution you'd like
Add a retain_graph member function (or property) to the LightningModule, have the trainer read this option, and then pass it into the .backward() call.
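A sketch of this requested design, with hypothetical names (the `retain_graph` attribute and the `run_training_batch` helper stand in for the real Lightning internals; none of this is the shipped API):

```python
import torch

class MyLightningModule:
    # Hypothetical opt-in flag the trainer would read; not an actual API.
    retain_graph = True

def run_training_batch(model, loss):
    # Trainer side: read the module's option and forward it to backward().
    loss.backward(retain_graph=getattr(model, "retain_graph", False))

w = torch.ones(2, requires_grad=True)
loss = (3 * w).sum()
model = MyLightningModule()
run_training_batch(model, loss)
run_training_batch(model, loss)  # second pass works because the flag is set
```

Defaulting the flag to False via `getattr` would keep existing models unchanged while letting models that need it opt in.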
Describe alternatives you've considered
Deriving a subclass of Trainer to support retain_graph=True is tough because __run_training_batch and other such methods are name-mangled.