[feature] Support for specifying epochs to stop knowledge distillation #455

Merged · 11 commits merged into open-mmlab:dev-1.x on Mar 1, 2023

Conversation

@HIT-cwh (Collaborator) commented on Feb 10, 2023

Modification

  1. Add a new attribute distill_loss_detach to SingleTeacherDistill to control whether the distillation loss is used.
  2. Add a new hook named DistillationLossDetachHook to stop distillation at a specified epoch (see the sketch after this list).
  3. Add the corresponding unit tests (pytest).
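
For illustration, below is a minimal sketch of how such a hook could look with MMEngine's hook API. The class name `DistillationLossDetachHook` and the `distill_loss_detach` attribute come from the PR description; the `stop_epoch` argument name, the registry used, and the exact way `SingleTeacherDistill` consumes the flag are assumptions for this sketch, not the merged implementation.

```python
from mmengine.hooks import Hook
from mmengine.model import is_model_wrapper
from mmengine.registry import HOOKS


@HOOKS.register_module()
class DistillationLossDetachHook(Hook):
    """Detach the distillation loss once training reaches ``stop_epoch``.

    ``stop_epoch`` is a hypothetical argument name used for illustration.
    """

    def __init__(self, stop_epoch: int):
        self.stop_epoch = stop_epoch

    def before_train_epoch(self, runner) -> None:
        # From ``stop_epoch`` onward, tell the algorithm to drop the
        # distillation loss so only the student's own loss is optimized.
        if runner.epoch >= self.stop_epoch:
            model = runner.model
            # Unwrap DDP or other model wrappers to reach the
            # distillation algorithm instance.
            if is_model_wrapper(model):
                model = model.module
            # Assumed behavior: SingleTeacherDistill skips (detaches) the
            # distillation loss when this flag is set.
            model.distill_loss_detach = True


# Hypothetical usage in a training config:
# custom_hooks = [dict(type='DistillationLossDetachHook', stop_epoch=280)]
```

Registering the hook and adding it to `custom_hooks` keeps the change non-invasive: the distillation algorithm only needs to check a single boolean attribute, and the schedule for turning distillation off lives entirely in the config.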

Review comments on mmrazor/engine/hooks/distillation_loss_detach_hook.py (outdated, resolved)
@pppppM merged commit 0919f69 into open-mmlab:dev-1.x on Mar 1, 2023
@HIT-cwh deleted the rtm_distill branch on Mar 2, 2023