
Hook Removal #340

Closed · wants to merge 32 commits

Conversation

@vivekmig (Contributor) commented Apr 1, 2020

This PR ensures that all hooks are removed, even if attribution raises an error while hooks are in place. Corresponding generated tests are also added; they simulate runtime errors and verify that no hooks remain on the model after execution.

Note that this PR depends on the generated tests, so that code is included here as well; this will be resolved once the updated tests PR is merged.
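As a hedged illustration of what those generated tests verify (the helper and model below are made up for this sketch, not Captum's actual test utilities): force an error while a hook is attached, then assert that no hooks remain on any module.

```python
import torch
import torch.nn as nn

def assert_no_hooks(model: nn.Module) -> None:
    # nn.Module keeps registered hooks in these OrderedDicts, so empty
    # dicts on every submodule mean no hooks were leaked.
    for module in model.modules():
        assert len(module._forward_hooks) == 0
        assert len(module._forward_pre_hooks) == 0
        assert len(module._backward_hooks) == 0

model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 2))

def failing_hook(module, inputs, output):
    raise RuntimeError("simulated error during attribution")

handle = model[0].register_forward_hook(failing_hook)
try:
    model(torch.randn(1, 4))  # raises while the hook is attached
except RuntimeError:
    pass
finally:
    handle.remove()  # the cleanup this PR guarantees

assert_no_hooks(model)  # passes: the model is hook-free after the error
```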

@vivekmig requested a review from NarineK on Apr 1, 2020 at 20:39
@vivekmig (Contributor, Author) commented Apr 8, 2020

@NarineK Updated this PR, it should be ready for review now, thanks!

@NarineK (Contributor) left a comment

Looks great! Thank you! A couple of suggestions:

  1. We could use try-finally and remove the hooks once in the finally block, avoiding the duplicate removal currently needed in both the normal path and the except-raise blocks (a minimal sketch follows this list).
  2. It looks like we might also need a try-finally block here?
     https://github.com/pytorch/captum/blob/master/captum/attr/_utils/gradient.py#L240
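A minimal sketch of suggestion 1, assuming a plain PyTorch hook handle (illustrative only, not Captum's actual code):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)

def forward_hook(module, inputs, output):
    return output  # placeholder for whatever attribution records

handle = model.register_forward_hook(forward_hook)
try:
    out = model(torch.randn(1, 4))  # may raise while the hook is attached
finally:
    handle.remove()  # single removal, runs on both the normal and error paths
```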

captum/attr/_core/deep_lift.py (review thread resolved)
captum/attr/_core/guided_backprop_deconvnet.py (outdated; review thread resolved)

@NarineK (Contributor) commented:

We probably also need a try-finally for this one?
https://github.com/pytorch/captum/blob/master/captum/attr/_utils/gradient.py#L240

@vivekmig (Contributor, Author) replied:

I think this line is already updated above? But I will switch from except to finally.

@facebook-github-bot left a comment

@vivekmig has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@vivekmig (Contributor, Author) commented:
Thanks for the review!

  1. Agreed, that's more concise; switched to try-finally, thanks!
  2. I think that line was already handled in the changes to gradient.py? But I changed it from except to finally (see the contrast sketched below).
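For contrast, a sketch of the two cleanup shapes discussed in point 2 (function and variable names are illustrative, not the actual Captum diff):

```python
def attribute_with_except_cleanup(run, handle):
    # previous shape: remove the hook in the except block and again on
    # success, i.e. two removal sites
    try:
        result = run()
    except Exception:
        handle.remove()
        raise
    handle.remove()
    return result

def attribute_with_finally_cleanup(run, handle):
    # new shape: a single removal that runs on every path
    try:
        return run()
    finally:
        handle.remove()
```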

@facebook-github-bot commented:

@vivekmig merged this pull request in 4cde54f.

NarineK pushed a commit to NarineK/captum-1 that referenced this pull request Nov 19, 2020
Summary:
This PR ensures that all hooks are removed, even if attribution raises an error while hooks are in place. Corresponding generated tests are also added; they simulate runtime errors and verify that no hooks remain on the model after execution.

Note that this PR depends on the generated tests, so that code is included here as well; this will be resolved once the updated tests PR is merged.
Pull Request resolved: pytorch#340

Reviewed By: edward-io

Differential Revision: D21020096

Pulled By: vivekmig

fbshipit-source-id: 05efaf9021222318d3adf2b029592754d7c836e7