SparseML v1.1.0

@jeanniefinks jeanniefinks released this 25 Aug 19:29
· 2 commits to release/1.1 since this release
c0103e7

New Features:

  • Native YOLACT segmentation training integration added to SparseML.
  • OBSPruning modifier added (https://arxiv.org/abs/2203.07259).
  • QAT now supported for MobileBERT.
  • Custom module support provided for QAT to enable quantization of layers such as GELU.
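To illustrate the recipe-driven flow these features plug into, below is a minimal sketch of a SparseML recipe applying the new OBS pruning modifier. The modifier name matches the release note; the parameter names (`init_sparsity`, `final_sparsity`, `update_frequency`, `params`) follow the conventions of SparseML's other pruning modifiers and should be verified against the 1.1 API reference:

```yaml
# Hypothetical recipe sketch -- verify modifier parameters against the
# SparseML 1.1 documentation before use.
modifiers:
  - !OBSPruningModifier
    start_epoch: 0.0          # epoch to begin pruning
    end_epoch: 10.0           # epoch to reach final sparsity
    init_sparsity: 0.05       # starting sparsity level
    final_sparsity: 0.85      # target sparsity level
    update_frequency: 1.0     # how often (in epochs) to update masks
    params: __ALL_PRUNABLE__  # apply to all prunable layers
```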

Changes:

  • Updates made across the repository for new SparseZoo Python APIs.
  • Non-string keys are now supported in recipes for layer and module names.
  • Native support added for DDP training with pruning in PyTorch pathways.
  • YOLOv5 P6 models now default to their native activations instead of having them overwritten with Hardswish.
  • Transformers eval pathways changed to turn off AMP (FP16) for more stable results.
  • TensorBoard logger added to transformers integration.
  • Python setuptools requirement pinned at 59.5 to avoid installation issues with other packages.
  • DDP now works for quantized training of embedding layers; previously, tensors were placed on incorrect devices, causing training crashes.

Resolved Issues:

  • ConstantPruningModifier propagated None in place of the start_epoch value when start_epoch > 0. It now propagates the proper value.
  • Quantization of BERT models was improperly dropping accuracy by quantizing the identity branches.
  • SparseZoo stubs were not loading model weights for image classification pathways when using DDP training.
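For context on the ConstantPruningModifier fix, a recipe like the following sketch, with a nonzero start_epoch, previously propagated None internally; it now propagates the configured value. Parameter names follow SparseML's standard pruning-modifier conventions and should be checked against the released API:

```yaml
# Hypothetical recipe sketch -- with start_epoch > 0, the modifier now
# propagates the real start_epoch value instead of None.
modifiers:
  - !ConstantPruningModifier
    start_epoch: 2.0          # previously triggered the None propagation bug
    end_epoch: 20.0
    params: __ALL_PRUNABLE__  # hold sparsity constant for all pruned layers
```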

Known Issues:

  • None