
Add unpatch to dropconnect #198

Merged: 10 commits into master from BAAL-189/unpatch_dropconnect, Apr 5, 2022

Conversation

@Dref360 (Member) commented Mar 29, 2022

Summary:

This is a bit hacky, as we need to keep the "old" Dropout value, but I think it does what it needs to. (A minimal sketch of the idea follows the checklist below.)

Features:

Checklist:

  • Your code is documented (To validate this, add your module to tests/documentation_test.py).
  • Your code is tested with unit tests.
  • You moved your Issue to the PR state.
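
As a rough illustration of the "keep the old Dropout value" approach, here is a minimal sketch. All names here (patch_dropconnect, unpatch_dropconnect, the stashed attribute _old_p) are hypothetical and made up for illustration; they are not Baal's actual API.

import copy
import torch

def patch_dropconnect(module: torch.nn.Module, inplace: bool = True) -> torch.nn.Module:
    # Stash the original dropout probability so it can be restored later.
    if not inplace:
        module = copy.deepcopy(module)
    for child in module.modules():
        if isinstance(child, torch.nn.Dropout):
            child._old_p = child.p  # remember the "old" Dropout value
            child.p = 0.0           # DropConnect drops weights, not activations
    return module

def unpatch_dropconnect(module: torch.nn.Module, inplace: bool = True) -> torch.nn.Module:
    # Undo the patch by restoring the saved probability.
    if not inplace:
        module = copy.deepcopy(module)
    for child in module.modules():
        if isinstance(child, torch.nn.Dropout) and hasattr(child, "_old_p"):
            child.p = child._old_p  # restore the original probability
            del child._old_p
    return module

The unpatch side simply restores the saved probability, which is the "hacky" bookkeeping the summary mentions.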

@Dref360 Dref360 requested a review from parmidaatg March 29, 2022 19:21
@Dref360 Dref360 marked this pull request as ready for review March 29, 2022 22:04
@parmidaatg (Collaborator) left a comment


Line 58 in test/bayesian/dropout_test can be replaced by your new test util is_deterministic, right? The same goes for line 36 in test/bayesian/dropconnect_test.py.
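
A helper along those lines might look like the following sketch. The real is_deterministic util in this PR may differ; the signature and input shape here are assumptions.

import torch

def is_deterministic(model: torch.nn.Module, input_shape=(8, 32)) -> bool:
    # Feed the same input twice and check whether the outputs match.
    x = torch.randn(*input_shape)
    with torch.no_grad():
        out_1 = model(x)
        out_2 = model(x)
    return bool(torch.allclose(out_1, out_2))

# Intuition: a patched (MC) model with active dropout should not be
# deterministic, while a freshly unpatched model in eval mode should be.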

if not changed:
    warnings.warn("No layer was modified by patch_module!", UserWarning)
return module


def _patch_layers(module: torch.nn.Module, layers: Sequence, weight_dropout: float) -> bool:

def unpatch_module(module: torch.nn.Module, inplace: bool = True) -> torch.nn.Module:
@parmidaatg (Collaborator) commented:

Is there a way we can reduce the duplication between patch_module and unpatch_module? I understand that we don't want to expose the user to unnecessary complexity, hence not taking mapping_fn as an input, so maybe there is no other way around it. (One possible unification is sketched below.)
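
One way the duplication could be factored out, as a sketch: a single generic walker applies a mapping_fn to every child module, and thin public wrappers pick the mapping so users never see mapping_fn. All names here are hypothetical, and the example mapping just swaps Dropout for Identity rather than Baal's actual DropConnect mapping.

import copy
import warnings
from typing import Callable, Optional
import torch

def _map_layers(module: torch.nn.Module,
                mapping_fn: Callable[[torch.nn.Module], Optional[torch.nn.Module]]) -> bool:
    # Recursively apply mapping_fn; a non-None return value replaces the child.
    changed = False
    for name, child in module.named_children():
        new_child = mapping_fn(child)
        if new_child is not None:
            module.add_module(name, new_child)  # swap the layer in place
            child = new_child
            changed = True
        changed = _map_layers(child, mapping_fn) or changed
    return changed

def _dropout_to_identity(layer: torch.nn.Module) -> Optional[torch.nn.Module]:
    # Example mapping only: replace Dropout with Identity; None means "keep".
    return torch.nn.Identity() if isinstance(layer, torch.nn.Dropout) else None

def patch_module(module: torch.nn.Module, inplace: bool = True) -> torch.nn.Module:
    # Thin wrapper: the mapping is chosen here, not passed in by the user.
    if not inplace:
        module = copy.deepcopy(module)
    if not _map_layers(module, _dropout_to_identity):
        warnings.warn("No layer was modified by patch_module!", UserWarning)
    return module

unpatch_module would then be the same wrapper with the inverse mapping, so the tree-walking and warning logic live in one place.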

@Dref360 (Member, Author) replied:

You are right, we could unify all of these.

@parmidaatg (Collaborator) left a comment

amazing :D let's go 👍🏼

@parmidaatg parmidaatg merged commit ff77819 into master Apr 5, 2022
@parmidaatg parmidaatg deleted the BAAL-189/unpatch_dropconnect branch April 5, 2022 20:14