
Add support for captum toplevel-import. #912

Closed
wants to merge 2 commits

Conversation

Contributor

@dkrako dkrako commented Mar 24, 2022

This fixes issue #680, "Strange import issue --> AttributeError: module 'captum' has no attribute 'attr'".

Most Python packages let you import the top-level package (numpy, scipy, torch, etc.) and then access its submodules with the dot operator: after `import numpy`, any submodule is reachable, e.g. `numpy.random.uniform`.
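For instance, this works out of the box with NumPy, because NumPy's own `__init__.py` imports its submodules eagerly:

```python
import numpy

# No "import numpy.random" needed: numpy's __init__.py already
# pulled the submodule in, so dotted access just works.
samples = numpy.random.uniform(low=0.0, high=1.0, size=5)
print(samples)
```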

With this PR, you can just `import captum` and then use, for example, `captum.attr.DeepLift` or `captum.robust.Perturbation`, instead of having to import each submodule separately (see the sketch below). It's just a small convenience, but I suspect many more people expect this kind of import to work and simply don't bother to open an issue about it.
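A minimal before/after sketch (the toy model and input here are hypothetical placeholders; `captum.attr.DeepLift` is captum's actual DeepLift implementation):

```python
import torch
import captum  # with this PR, the one top-level import is enough

# Hypothetical toy model and input, just for illustration
model = torch.nn.Sequential(torch.nn.Linear(4, 2), torch.nn.ReLU())
inputs = torch.rand(1, 4)

# Before this PR, a submodule import was required:
#   from captum.attr import DeepLift
# Now the submodule is reachable through the top-level package:
deep_lift = captum.attr.DeepLift(model)
attributions = deep_lift.attribute(inputs, target=0)
print(attributions)
```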

I hope this PR is considered helpful.

@facebook-github-bot
Contributor

Hi @dkrako!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!

@facebook-github-bot
Contributor

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!

@NarineK requested review from vivekmig and aobo-y on March 29, 2022.
Contributor

NarineK commented May 9, 2022

Thank you for the PR, @dkrako! @aobo-y, @vivekmig, do you mind checking whether there are any import-time performance issues with this change?

Contributor

vivekmig commented Jun 9, 2022

Thanks so much for adding this @dkrako !

> Thank you for the PR, @dkrako! @aobo-y, @vivekmig, do you mind checking whether there are any import-time performance issues with this change?

This seems fine: `import captum` is slower, but comparable to importing `captum.attr` today, so it should be acceptable given the benefits of having this import.
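For reference, one way to check this kind of regression (not necessarily how it was measured here) is CPython's built-in `python -X importtime -c "import captum"`, or a rough wall-clock sketch like:

```python
import time

start = time.perf_counter()
import captum  # noqa: E402 -- deliberately imported after the timer starts
elapsed = time.perf_counter() - start

print(f"import captum took {elapsed:.3f}s")
```

Each variant (`import captum` vs. `import captum.attr`) needs a fresh interpreter process, since `sys.modules` caches modules after the first import.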

@facebook-github-bot
Contributor

@vivekmig has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

facebook-github-bot pushed a commit that referenced this pull request Jul 25, 2022
Summary:
Fixes: #988

According to the `setup.py` and the `README`, users who want to use insights need to use the custom install option provided or install the modules separately. https://github.com/pytorch/captum/blob/master/setup.py#L54, https://github.com/pytorch/captum/blob/master/README.md#installation

Therefore the insights module should not be loaded in the `__init__.py` file, and users will have to import it as they did before the changes proposed in #912, which were added to the master branch in 9305b10.
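For illustration, a minimal sketch of the resulting layout, assuming (this is not the actual file) that the top-level `__init__.py` eagerly re-exports only the core submodules:

```python
# captum/__init__.py -- illustrative sketch, not the real file
# Core submodules with no extra dependencies are imported eagerly,
# so captum.attr, captum.robust, etc. work after "import captum".
from . import attr, concept, influence, metrics, robust

# captum.insights has extra dependencies (see setup.py), so it is
# deliberately NOT imported here; users opt in explicitly:
#   from captum.insights import AttributionVisualizer
```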

Pull Request resolved: #992

Reviewed By: Reubend

Differential Revision: D38026733

Pulled By: NarineK

fbshipit-source-id: d9bf407f461c2c1381291480654b8fc6923579c0