[Feature] Support ConvNeXt-V2 backbone. #1294
Conversation
Codecov Report — Base: 0.02% // Head: 88.61% // Increases project coverage by +88.58%.
Additional details and impacted files@@ Coverage Diff @@
## dev-1.x #1294 +/- ##
============================================
+ Coverage 0.02% 88.61% +88.58%
============================================
Files 121 161 +40
Lines 8217 12679 +4462
Branches 1368 2034 +666
============================================
+ Hits 2 11235 +11233
+ Misses 8215 1106 -7109
- Partials 0 338 +338
Flags with carried forward coverage won't be shown. Click here to find out more.
☔ View full report at Codecov.
Should there be another pull request to change the usage of build_activation_layer and build_norm_layer in all models?
The code LGTM; please add some docs and unit tests (UT).
Hi, will the ConvNeXt-V2 implementation be trainable with MMSelfSup?
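For context on what this backbone adds: the key architectural change in ConvNeXt-V2 over V1 is the Global Response Normalization (GRN) layer inserted into each block. Below is a minimal NumPy sketch of GRN following the formulation in the ConvNeXt-V2 paper; the function name, channels-last layout, and epsilon value are illustrative assumptions, not the actual code in this PR.

```python
import numpy as np

def grn(x, gamma, beta, eps=1e-6):
    """Global Response Normalization (illustrative sketch, not the PR's code).

    x: feature map of shape (N, H, W, C), channels-last.
    gamma, beta: learnable affine parameters, initialized to 0 in the paper
    so the layer starts as an identity mapping.
    """
    # G(x): per-channel global L2 norm aggregated over spatial dimensions.
    gx = np.sqrt(np.sum(x ** 2, axis=(1, 2), keepdims=True))  # (N, 1, 1, C)
    # N(x): divisive normalization of each channel's norm by the mean norm.
    nx = gx / (np.mean(gx, axis=-1, keepdims=True) + eps)
    # Calibrate features, with an affine transform and a residual connection.
    return gamma * (x * nx) + beta + x
```

With gamma = beta = 0 the layer reduces to the identity, which is why it can be dropped into a pretrained ConvNeXt block without disrupting initialization.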
Modification
Checklist
Before PR:
After PR: