Migrate to register_full_backward_hook #837
Conversation
LGTM! Approving, made minor doc suggestions.
@vivekmig has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Does this warning mean the output for GuidedBackprop will not be correct? (I am currently using the following pretrained torchvision models: AlexNet, VGG16, ResNet50, DenseNet121)
This PR modifies all module backward hooks to use the new register_full_backward_hook API documented here. The new API resolves many issues we previously encountered with backward module hooks.
Since this API is available only in torch 1.8 and later, we allow falling back to the original backward hook approach on older versions.
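The version-guarded fallback described above can be sketched roughly as follows (the helper name `register_backward_hook_compat` is hypothetical, not part of this PR's code):

```python
import torch
import torch.nn as nn


def register_backward_hook_compat(module, hook):
    """Register a backward hook, preferring the full-backward-hook API
    available in torch >= 1.8 and falling back otherwise (sketch)."""
    if hasattr(module, "register_full_backward_hook"):
        return module.register_full_backward_hook(hook)
    # Older torch: fall back to the original (now deprecated) module hook.
    return module.register_backward_hook(hook)


# Usage: record the gradient flowing into a ReLU's output.
grads = []


def hook(module, grad_input, grad_output):
    grads.append(grad_output[0].detach().clone())


relu = nn.ReLU()
handle = register_backward_hook_compat(relu, hook)
out = relu(torch.tensor([-1.0, 2.0], requires_grad=True)).sum()
out.backward()
handle.remove()
```

Both code paths return a removable handle, so callers do not need to know which API was used.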
Due to issues described here, we are also deprecating attribution with respect to neuron outputs for NeuronDeepLift, NeuronGuidedBackprop, and NeuronDeconvolution; these methods require attributing with respect to neuron input (which is typically equivalent to attributing with respect to the previous layer output).
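The equivalence mentioned above — attributing with respect to a layer's input versus the previous layer's output — can be illustrated with plain full backward hooks on two stacked linear layers (an illustrative sketch, not code from this PR):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
lin1 = nn.Linear(3, 3)
lin2 = nn.Linear(3, 1)

grads = {}


def out_hook(module, grad_input, grad_output):
    # Gradient with respect to lin1's output.
    grads["lin1_out"] = grad_output[0].detach().clone()


def in_hook(module, grad_input, grad_output):
    # Gradient with respect to lin2's input.
    grads["lin2_in"] = grad_input[0].detach().clone()


lin1.register_full_backward_hook(out_hook)
lin2.register_full_backward_hook(in_hook)

x = torch.randn(1, 3)
y = lin2(lin1(x)).sum()
y.backward()
# When lin2 directly consumes lin1's output, the two gradients coincide,
# which is why attributing w.r.t. neuron input is typically equivalent to
# attributing w.r.t. the previous layer's output.
```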
Additionally, full backward hooks do not support modules that modify their inputs in place, so in-place modules are no longer supported for DeepLift, LRP, GuidedBackprop / Deconvolution, and their corresponding variants. Documentation has been updated accordingly.
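For users whose models contain in-place activations (e.g. `nn.ReLU(inplace=True)` in many torchvision models), one workaround is to disable the in-place flag before attribution. A minimal sketch, assuming the hypothetical helper name `disable_inplace`:

```python
import torch.nn as nn


def disable_inplace(model: nn.Module) -> None:
    """Turn off in-place operation on common activation modules so that
    full backward hooks see unmodified tensors (sketch; covers modules
    exposing an `inplace` attribute, such as ReLU and LeakyReLU)."""
    for module in model.modules():
        if hasattr(module, "inplace"):
            module.inplace = False


# Usage: a model built with an in-place ReLU.
model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(inplace=True))
disable_inplace(model)
```

This keeps the model's behavior identical in the forward pass while making it compatible with the full-backward-hook requirement described above.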