
[Enhance] Enable to toggle whether Gem Pooling is trainable or not. #1246

Merged

Conversation

@yukkyo (Contributor) commented Dec 7, 2022

Motivation

Allow toggling whether the GeM Pooling exponent p is trainable.

reference

Modification

class GeneralizedMeanPooling(nn.Module):
    ...
    def __init__(self, p=3., eps=1e-6, clamp=True, p_trainable=True):
        ...
        if p_trainable:
            # learnable exponent: registered as a Parameter so the
            # optimizer updates it during training
            self.p = Parameter(torch.ones(1) * p)
        else:
            # fixed exponent: a plain float, excluded from gradient updates
            self.p = p
        ...
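For context, GeM pooling reduces a set of activations with the generalized mean (mean(x^p))^(1/p): p = 1 recovers average pooling, and large p approaches max pooling, which is why exposing p as a (possibly trainable) parameter is useful. The sketch below is a minimal, dependency-free illustration of that formula; the function name and sample values are made up and it is independent of the mmcls implementation.

```python
def generalized_mean(values, p=3.0, eps=1e-6):
    """Generalized mean: (mean(clamp(x, eps)**p)) ** (1/p).

    p=1 gives the arithmetic mean; as p grows, the result
    approaches max(values). Clamping to eps keeps the
    fractional power well-defined for non-positive inputs.
    """
    clamped = [max(v, eps) for v in values]
    return (sum(v ** p for v in clamped) / len(clamped)) ** (1.0 / p)


xs = [0.5, 1.0, 2.0, 4.0]
print(generalized_mean(xs, p=1.0))    # arithmetic mean: 1.875
print(generalized_mean(xs, p=3.0))    # biased toward the larger activations
print(generalized_mean(xs, p=100.0))  # close to max(xs) = 4.0
```

In the PR's trainable case, p would be a learned tensor rather than a fixed float, so the network can tune how "max-like" the pooling is.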

@CLAassistant commented Dec 7, 2022

CLA assistant check
All committers have signed the CLA.

@yukkyo yukkyo changed the title Enable to toggle whether Gem Pooling is trainable or not. [Enhance] Enable to toggle whether Gem Pooling is trainable or not. Dec 7, 2022
@codecov bot commented Dec 7, 2022

Codecov Report

Base: 0.02% // Head: 86.86% // Increases project coverage by +86.83% 🎉

Coverage data is based on head (b362d92) compared to base (b8b31e9).
Patch has no changes to coverable lines.

Additional details and impacted files
@@             Coverage Diff              @@
##           dev-1.x    #1246       +/-   ##
============================================
+ Coverage     0.02%   86.86%   +86.83%     
============================================
  Files          121      166       +45     
  Lines         8217    13511     +5294     
  Branches      1368     2148      +780     
============================================
+ Hits             2    11736    +11734     
+ Misses        8215     1420     -6795     
- Partials         0      355      +355     
Flag Coverage Δ
unittests 86.86% <ø> (+86.83%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
mmcls/datasets/transforms/compose.py
mmcls/structures/utils.py 77.77% <0.00%> (ø)
mmcls/models/utils/layer_scale.py 86.66% <0.00%> (ø)
mmcls/models/classifiers/timm.py 25.97% <0.00%> (ø)
mmcls/structures/multi_task_data_sample.py 100.00% <0.00%> (ø)
mmcls/models/backbones/beit.py 57.06% <0.00%> (ø)
mmcls/datasets/multi_task.py 74.46% <0.00%> (ø)
mmcls/models/backbones/levit.py 96.06% <0.00%> (ø)
mmcls/models/retrievers/image2image.py 90.90% <0.00%> (ø)
mmcls/models/backbones/revvit.py 64.39% <0.00%> (ø)
... and 157 more


☔ View full report at Codecov.

@Ezra-Yu (Collaborator) commented Dec 7, 2022

Please sign the CLA, so that I can review and merge this PR.

@yukkyo (Contributor, Author) commented Dec 7, 2022

I signed in from the link above and got to the image below.
Please let me know if there is any other work that needs to be done.

[screenshot]

@Ezra-Yu (Collaborator) commented Dec 7, 2022

It is the committer who has not signed the CLA.
[screenshot]

I'm not sure whether that account is still in use. There are two solutions:

  1. Link an email to that account and then sign the CLA, or

  2. rebase and amend the commits so that the committer name and email match your account.

@Ezra-Yu (Collaborator) left a review

LGTM.

Review comment on mmcls/models/necks/gem.py (outdated, resolved)
@mzr1996 mzr1996 force-pushed the feat/no_trainable_gem_pooling branch from 4d4285a to b362d92 Compare February 9, 2023 03:01
@mzr1996 mzr1996 merged commit 4ce7be1 into open-mmlab:dev-1.x Feb 9, 2023
Labels: none
Projects: none
Linked issues: none
5 participants