
[Bug]fix: split autoslim different checkpoint has equal model size #193

Merged
3 commits merged into open-mmlab:master on Jul 5, 2022

Conversation

@Hiwyl (Contributor) commented Jun 28, 2022

Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry: just make the pull request and ask the maintainers for help.

Motivation

Getting an AutoSlim subnet via split_checkpoints is wrong: checkpoints split for different subnets all have the same model size.

Modification

# Clone the sliced tensors so each exported subnet checkpoint actually shrinks.
module.weight = nn.Parameter(temp_weight.data.clone())
module.bias = nn.Parameter(module.bias.data[:out_channels].clone())
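
For context, here is a minimal, self-contained sketch of the slicing logic around those two lines. The function name export_sliced_module and the in_channels/out_channels arguments are illustrative assumptions, not mmrazor's actual API; the point is that cloning the sliced tensors, rather than the full supernet-sized ones, is what makes each split checkpoint shrink to its own subnet size.

import torch.nn as nn

def export_sliced_module(module: nn.Conv2d, out_channels: int,
                         in_channels: int) -> None:
    """Shrink a conv's parameters in place to the pruned channel counts."""
    # Slice the weight along the output- and input-channel dimensions.
    temp_weight = module.weight.data[:out_channels, :in_channels]
    # Clone the *sliced* view; cloning the full tensor would leave every
    # split checkpoint at the supernet's size, which is the reported bug.
    module.weight = nn.Parameter(temp_weight.data.clone())
    if module.bias is not None:
        # The bias only has an output-channel dimension.
        module.bias = nn.Parameter(module.bias.data[:out_channels].clone())
    # Keep module metadata consistent with the new tensor shapes.
    module.out_channels = out_channels
    module.in_channels = in_channels

# Usage: shrink a conv from (in=64, out=128) to an (in=32, out=64) subnet conv.
conv = nn.Conv2d(64, 128, kernel_size=3, bias=True)
export_sliced_module(conv, out_channels=64, in_channels=32)
assert tuple(conv.weight.shape) == (64, 32, 3, 3)
assert tuple(conv.bias.shape) == (64,)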

BC-breaking (Optional)

Does the modification introduce changes that break the backward compatibility of the downstream repositories?
If so, please describe how it breaks the compatibility and how the downstream projects should modify their code to keep compatibility with this PR.

Use cases (Optional)

If this PR introduces a new feature, it is better to list some use cases here and update the documentation.

Checklist

Before PR:

  • Pre-commit or other linting tools are used to fix the potential lint issues.
  • Bug fixes are fully covered by unit tests; the case that caused the bug should be added to the unit tests.
  • The modification is covered by complete unit tests. If not, please add more unit tests to ensure the correctness.
  • The documentation has been modified accordingly, like docstring or example tutorials.

After PR:

  • If the modification has potential influence on downstream or other related projects, this PR should be tested with those projects, like MMDet or MMSeg.
  • CLA has been signed and all committers have signed the CLA in this PR.

@Hiwyl changed the title from "fix: split autoslim different checkpoint has equal model size" to "[Bug]fix: split autoslim different checkpoint has equal model size" on Jun 28, 2022
@codecov bot commented Jun 28, 2022

Codecov Report

Merging #193 (78380e7) into master (3cc359e) will increase coverage by 1.93%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master     #193      +/-   ##
==========================================
+ Coverage   63.95%   65.88%   +1.93%     
==========================================
  Files          93       93              
  Lines        3462     3462              
  Branches      640      640              
==========================================
+ Hits         2214     2281      +67     
+ Misses       1136     1077      -59     
+ Partials      112      104       -8     
Flag        Coverage Δ
unittests   65.85% <100.00%> (+1.90%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                                  Coverage Δ
mmrazor/models/pruners/structure_pruning.py     91.93% <100.00%> (+2.30%) ⬆️
mmrazor/utils/misc.py                           100.00% <0.00%> (+4.76%) ⬆️
mmrazor/models/algorithms/autoslim.py           71.31% <0.00%> (+14.72%) ⬆️
mmrazor/utils/setup_env.py                      100.00% <0.00%> (+27.27%) ⬆️
mmrazor/models/pruners/ratio_pruning.py         90.41% <0.00%> (+30.13%) ⬆️
mmrazor/models/pruners/utils/switchable_bn.py   100.00% <0.00%> (+69.23%) ⬆️

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 3cc359e...78380e7.

@pppppM (Collaborator) commented Jun 28, 2022

Hi @Hiwyl, thank you very much for your PR.

OpenMMLab uses pre-commit hooks to avoid code style issues, and this PR currently fails the lint checks.

You can get started with pre-commit quickly using the following two commands:

pip install pre-commit
pre-commit run --all-files

Note that installing pre-commit may be affected by your network. If using pre-commit is inconvenient for you, contact us and my colleagues can fix the lint issues directly in your branch.

@Hiwyl (Contributor, Author) commented Jun 28, 2022

The PR has been updated.

@@ -844,8 +853,7 @@ def concat_backward_parser(self, grad_fn, module2name, var2module,
     # However, a shared module will be visited more than once during
     # forward, so it is still need to be traced even if it has been
     # visited.
-    if (name in visited and visited[name]
-            and name not in self.shared_module):
+    if name in visited and visited[name] and name not in self.shared_module:
A collaborator commented on this diff:

This line is too long to pass the lint check.
More details can be found at https://github.com/open-mmlab/mmrazor/runs/7088305507?check_suite_focus=true
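
For illustration, here is a small runnable sketch of the lint-compliant two-line wrapping; the visited dict, the shared_module set, and the example name are dummy stand-ins, not the tracer's real state:

# Dummy stand-ins for the tracer state in structure_pruning.py.
visited = {'backbone.conv1': True}
shared_module = set()
name = 'backbone.conv1'

# Wrapping the condition over two lines keeps each line within the
# 79-character limit that the flake8 pre-commit hook enforces.
if (name in visited and visited[name]
        and name not in shared_module):
    print(f'{name} has already been traced; skip it.')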

@pppppM (Collaborator) commented Jun 28, 2022

LGTM if CI passes.

@pppppM pppppM merged commit 1abad08 into open-mmlab:master Jul 5, 2022
@pppppM pppppM linked an issue Sep 7, 2022 that may be closed by this pull request
@pppppM pppppM mentioned this pull request Sep 7, 2022
humu789 pushed a commit to humu789/mmrazor that referenced this pull request Feb 13, 2023
Labels: None yet
Projects: None yet

Development
Successfully merging this pull request may close these issues:

Model testing after autoslim
3 participants