MMRazor V1.0.0 Release

v1.0.0 (24/04/2023)

We are excited to announce the first official release of MMRazor 1.0.

Highlights

  • MMRazor quantization is released. It has been validated end to end on task models and model deployment, so pre-trained OpenMMLab models can be quantized and deployed to a specified backend quickly.

New Features & Improvements

NAS

  • Update searchable model. (#438)
  • Update NasMutator to build search_space in NAS. (#426)

Pruning

  • Add a new pruning algorithm named GroupFisher. We support the full GroupFisher pipeline, including pruning, finetuning and deployment. (#459)
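
As a rough illustration of the idea behind GroupFisher-style pruning (not MMRazor's implementation), channels are ranked by a Fisher-information-like score built from the gradient-activation product; a minimal sketch in plain PyTorch:

```python
import torch

def fisher_channel_importance(activation: torch.Tensor,
                              grad: torch.Tensor) -> torch.Tensor:
    """Fisher-style importance score for each channel of one layer.

    activation: (N, C, H, W) feature map recorded during a forward pass.
    grad:       (N, C, H, W) gradient of the task loss w.r.t. that feature map.
    Returns a (C,) score; channels with the lowest scores are the cheapest to prune.
    """
    # Sum the gradient-activation product over spatial positions ...
    contribution = (grad * activation).sum(dim=(2, 3))   # (N, C)
    # ... then square and accumulate over the batch for a Fisher-like score.
    return contribution.pow(2).sum(dim=0)                # (C,)
```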

KD

  • Support stopping distillation after a certain epoch. (#455)
  • Support distilling RTMDet with MMRazor; see open-mmlab/mmyolo#544.
  • Add mask channel in MGD Loss. (#461)
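
For context, an MGD-style distillation loss masks part of the student feature map and trains a small generation block to reconstruct the teacher feature from it; the new option additionally masks whole channels. A conceptual sketch, with illustrative names and mask ratios (this is not MMRazor's MGDLoss):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MGDStyleLoss(nn.Module):
    """Illustrative masked-generation distillation with spatial and channel masks."""

    def __init__(self, channels: int,
                 spatial_mask_ratio: float = 0.65,
                 channel_mask_ratio: float = 0.15):
        super().__init__()
        self.spatial_mask_ratio = spatial_mask_ratio
        self.channel_mask_ratio = channel_mask_ratio
        # Small generation block that reconstructs the teacher feature.
        self.generator = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )

    def forward(self, feat_student: torch.Tensor,
                feat_teacher: torch.Tensor) -> torch.Tensor:
        n, c, h, w = feat_student.shape
        # Spatial mask: drop a fraction of pixels from the student feature.
        spatial_mask = (torch.rand(n, 1, h, w, device=feat_student.device)
                        > self.spatial_mask_ratio).float()
        # Channel mask: additionally drop whole channels.
        channel_mask = (torch.rand(n, c, 1, 1, device=feat_student.device)
                        > self.channel_mask_ratio).float()
        masked = feat_student * spatial_mask * channel_mask
        # Reconstruct from the masked feature and match the teacher feature.
        return F.mse_loss(self.generator(masked), feat_teacher)
```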

Quantization

  • Support two quantization types: QAT and PTQ. (#513)
  • Support various quantization bits. (#513)
  • Support various quantization methods, such as per_tensor / per_channel, symmetric / asymmetric, and so on. (#513)
  • Support deploying quantized models to multiple backends, such as OpenVINO and TensorRT. (#513)
  • Support applying quantization algorithms directly to multiple task repos, such as mmcls and mmdet. (#513)
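
As a quick illustration of what per_tensor / per_channel and symmetric / asymmetric mean in practice, here is a minimal sketch using PyTorch's quantization observers (plain torch.ao.quantization, independent of MMRazor's own configs and quantizers):

```python
import torch
from torch.ao.quantization.observer import MinMaxObserver, PerChannelMinMaxObserver

# Per-tensor, asymmetric (affine): one scale and zero-point shared by the whole tensor.
per_tensor_affine = MinMaxObserver(dtype=torch.quint8,
                                   qscheme=torch.per_tensor_affine)
# Per-channel, symmetric: one scale per output channel, zero-point fixed at zero.
per_channel_symmetric = PerChannelMinMaxObserver(dtype=torch.qint8,
                                                 qscheme=torch.per_channel_symmetric,
                                                 ch_axis=0)

weight = torch.randn(8, 16)        # e.g. a linear weight with 8 output channels
per_tensor_affine(weight)          # observers record min/max statistics when called
per_channel_symmetric(weight)

print(per_tensor_affine.calculate_qparams())      # a single (scale, zero_point) pair
print(per_channel_symmetric.calculate_qparams())  # 8 scales, zero_points all zero
```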

Bug Fixes

  • Fix split in Darts config. (#451)
  • Fix a bug in Recorders. (#446)
  • Fix a bug when using get_channel_unit.py. (#432)
  • Fix a bug when deploying a pruned model to CUDA. (#495)

Contributors

A total of 23 developers contributed to this release.
Thanks @415905716 @gaoyang07 @humu789 @LKJacky @HIT-cwh @aptsunny @cape-zck @vansin @twmht @wm901115nwpu @Hiwyl @NickYangMin @spynccat @sunnyxiaohu @kitecats @TinyTigerPan @twmht @yivona08 @xinxinxinxu @cape-zck @Weiyun1025 @vansin @Lxtccc

New Contributors

Full Changelog: v0.3.1...v1.0.0