
Update sparseml to support torch 2.0 #1618

Merged: 5 commits into main on Jun 13, 2023
Conversation

@dsikka (Contributor) commented Jun 9, 2023

  • Updated the maximum supported torch version to 2.0 and the maximum torchvision version to 0.15.1
  • Updated the maximum torchaudio version to 2.0.1
  • Ran all PyTorch tests; all passed
  • Tested the updated package versions in training via sparseml.image_classification.train, using the ImageNette dataset and a ResNet-50 model from SparseZoo:
```yaml
# Epoch and Learning-Rate variables
num_epochs: 3.0
init_lr: 0.0005

# quantization variables
quantization_epochs: 1.0

training_modifiers:
  - !EpochRangeModifier
    start_epoch: 0.0
    end_epoch: eval(num_epochs)

  - !LearningRateFunctionModifier
    final_lr: 0.0
    init_lr: eval(init_lr)
    lr_func: cosine
    start_epoch: 0.0
    end_epoch: eval(num_epochs)

# Phase 1 Sparse Transfer Learning / Recovery
sparse_transfer_learning_modifiers:
  - !ConstantPruningModifier
    start_epoch: 0.0
    params: __ALL_PRUNABLE__

# Phase 2 Apply quantization
sparse_quantized_transfer_learning_modifiers:
  - !QuantizationModifier
    start_epoch: eval(num_epochs - quantization_epochs)
```
  • Trained using the sample recipe above, then exported to ONNX using sparseml.image_classification.export_onnx
  • Confirmed the size of the exported ONNX model (23 MB) matches another model trained/exported using torch 1.13.1, and validated the graph using Netron, thanks to @KSGulin

@dsikka dsikka changed the title bump up torch to 2.0, torchvision for compatability Update sparseml to support torch 2.0 Jun 9, 2023
@dsikka dsikka marked this pull request as ready for review June 9, 2023 18:58
@dsikka dsikka requested a review from rahul-tuli June 9, 2023 20:33
@bfineran (Member) left a comment


Thanks @dsikka - one last thing to check: export quantized models with convert_qat=True and verify that the weights are properly compressed (i.e., load a quantized ResNet PyTorch model from SparseZoo, export it to ONNX, and verify that the file size matches the expected quantized size).

@dbogunowicz (Contributor) commented

Oh, wow, this seems to be much less effort than expected no?

@dsikka (Contributor, Author) commented Jun 12, 2023

The size of the exported model is 23 MB using torch 2.0, identical to an ONNX-exported ResNet model trained with torch 1.13.1.

> thanks @dsikka - one last thing to look for is exporting quantized models with convert_qat=True and verifying that the weights are properly compressed (ie load a quantized resnet pytorch model from the sparsezoo, export it to onnx and verify that the file size is the expected quantized one)

@dsikka dsikka requested review from bfineran and KSGulin June 12, 2023 18:06
bfineran
bfineran previously approved these changes Jun 13, 2023
@rahul-tuli (Member) left a comment


LGTM after outdated comment removal!

Two review threads on src/sparseml/pytorch/base.py (outdated, resolved)
@dsikka dsikka merged commit c3612a1 into main Jun 13, 2023
10 checks passed
@dsikka dsikka deleted the dipika_torch2 branch June 13, 2023 22:26
4 participants