Add unpatch to dropconnect #198
Conversation
Line 58 in test/bayesian/dropout_test can be replaced by your new test util `is_deterministic`, right? Same for line 36 in test/bayesian/dropconnect_test.py.
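For reference, a helper like that could look roughly as follows (a minimal sketch; the real `is_deterministic` util in the test suite may differ in signature and shape handling):

```python
import torch

def is_deterministic(module: torch.nn.Module, input_shape=(8, 10)) -> bool:
    # Feed the same input twice; a module with an active stochastic
    # layer (Dropout, DropConnect) should produce different outputs.
    x = torch.randn(*input_shape)
    with torch.no_grad():
        first = module(x)
        second = module(x)
    return bool(torch.equal(first, second))
```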
```python
    if not changed:
        warnings.warn("No layer was modified by patch_module!", UserWarning)
    return module


def _patch_layers(module: torch.nn.Module, layers: Sequence, weight_dropout: float) -> bool:
    ...


def unpatch_module(module: torch.nn.Module, inplace: bool = True) -> torch.nn.Module:
    ...
```
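For readers without the full diff, here is roughly how `unpatch_module` could mirror `patch_module` under the signatures above (a sketch; `_unpatch_layers` and the `PatchedDropout` stand-in are assumptions, not the code under review):

```python
import copy
import warnings

import torch

class PatchedDropout(torch.nn.Dropout):
    """Stand-in for whatever patched layer class the PR introduces."""

def unpatch_module(module: torch.nn.Module, inplace: bool = True) -> torch.nn.Module:
    # Mirror patch_module: copy unless in-place, walk the tree, warn on a no-op.
    if not inplace:
        module = copy.deepcopy(module)
    changed = _unpatch_layers(module)
    if not changed:
        warnings.warn("No layer was modified by unpatch_module!", UserWarning)
    return module

def _unpatch_layers(module: torch.nn.Module) -> bool:
    # Recursively swap patched layers back to their stock torch.nn versions.
    changed = False
    for name, child in module.named_children():
        if isinstance(child, PatchedDropout):
            module.add_module(name, torch.nn.Dropout(p=child.p))
            changed = True
        else:
            changed |= _unpatch_layers(child)
    return changed
```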
Is there a way we can reduce the duplication between `patch_module` and `unpatch_module`? I understand that we don't want to expose the user to unnecessary complexity, hence not taking `mapping_fn` as an input, so maybe there is no other way around it.
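To make the suggestion concrete, the duplication could collapse into one private walker that both functions call, while keeping `mapping_fn` internal (a sketch; `_map_layers` is a hypothetical name):

```python
from typing import Callable, Optional

import torch

def _map_layers(module: torch.nn.Module,
                mapping_fn: Callable[[torch.nn.Module], Optional[torch.nn.Module]]) -> bool:
    # Shared recursion for patch_module and unpatch_module: mapping_fn
    # returns a replacement layer, or None to leave the child untouched.
    changed = False
    for name, child in module.named_children():
        replacement = mapping_fn(child)
        if replacement is not None:
            module.add_module(name, replacement)
            changed = True
        else:
            changed |= _map_layers(child, mapping_fn)
    return changed
```

`patch_module` would then pass a mapping that wraps the targeted layers with weight dropout, and `unpatch_module` would pass the inverse mapping; the user-facing signatures stay unchanged.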
You are right, we could unify all of these.
amazing :D let's go 👍🏼
Summary:
This is a bit hacky, as we need to keep the "old" Dropout value, but I think it does what it needs to.
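The "keep the old value" trick could look like this (a sketch only; the attribute name `_original_p` is an assumption about how the PR stores the original rate):

```python
import torch

def _disable_dropout(layer: torch.nn.Dropout) -> torch.nn.Dropout:
    # Stash the original probability so unpatching can restore it later.
    layer._original_p = layer.p
    layer.p = 0.0
    return layer

def _restore_dropout(layer: torch.nn.Dropout) -> torch.nn.Dropout:
    # Put back the rate that was saved when the module was patched.
    if hasattr(layer, "_original_p"):
        layer.p = layer._original_p
        del layer._original_p
    return layer
```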
Features:
Checklist:
- tests/documentation_test.py