
Bug/SG-861: decouple QAT from train_from_config #1001

Merged
shaydeci merged 17 commits into master from bug/SG-861_decouple_qat_from_train_from_config on May 23, 2023

Conversation

shaydeci (Collaborator) commented:

This PR decouples QAT/PTQ from train_from_config.
The goal is to let users launch PTQ/QAT from Python instead of via CLI + configs, while relying on as many defaults as possible.
Note that automatic adaptation of the parameters to the best practices is the user's responsibility when launching this way, since the new "quantize" method I added can accept objects rather than parameters.

  • Renamed train_from_config in QATTrainer to quantize_from_config.
  • Introduced a "quantize" method.
  • Added the option to pass as few parameters as possible, so users can reuse existing objects that are already set up in train() (see the sketch after this list).
  • Added a simple test that takes a model from train to QAT to our recipe test suite (modified the CI config to install pytorch-quantization beforehand; this is possible since these tests run on GPU on spot instances).
  • Added a lot of docs, as there were barely any.
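
For illustration, a minimal sketch of the Python-first flow this enables. The `quantize` name comes from this PR, but its exact signature and placement are assumed here, not copied from the diff; the Trainer, models, and dataloader helpers are standard SuperGradients APIs.

```python
# Hypothetical sketch of launching QAT from Python after a regular train().
# The `quantize` call below is assumed from the PR description (it accepts
# already-built objects), not copied verbatim from the diff.
from super_gradients import Trainer
from super_gradients.training import models
from super_gradients.training.dataloaders.dataloaders import cifar10_train, cifar10_val

trainer = Trainer(experiment_name="resnet18_qat_example")
model = models.get("resnet18", num_classes=10)
train_loader = cifar10_train()
valid_loader = cifar10_val()

# ... regular training via trainer.train(...) happens here ...

# New in this PR: reuse the objects already set up for train() instead of
# re-deriving everything from a recipe config. Adapting hyper-parameters
# to quantization best practices is now the caller's responsibility.
quantized_model = trainer.quantize(  # assumed method name and signature
    model=model,
    train_loader=train_loader,
    valid_loader=valid_loader,
)
```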

shaydeci marked this pull request as ready for review on May 14, 2023 14:48
shaydeci requested a review from spsancti on May 15, 2023 07:51
Louis-Dupont (Contributor) left a comment:

Looks great!
I wrote a small note about naming (non-blocking).

Other than that, I think we should also add this as a "quickstart" section in the PTQ/QAT tutorial: https://github.com/Deci-AI/super-gradients/blob/master/documentation/source/ptq_qat.md

Louis-Dupont (Contributor) left a comment:

Looks great like that.
I think there might be one bug though (see below); the rest is just minor comments.

Review threads (outdated, resolved) on:
  • src/super_gradients/qat_from_recipe.py
  • src/super_gradients/training/utils/quantization/export.py
  • src/super_gradients/training/sg_trainer/sg_trainer.py (4 threads)
Louis-Dupont (Contributor) left a comment:

LGTM

shaydeci merged commit 0e1e5ab into master on May 23, 2023
1 check passed
shaydeci deleted the bug/SG-861_decouple_qat_from_train_from_config branch on May 23, 2023 14:14
geoffrey-g-delhomme pushed a commit to geoffrey-g-delhomme/super-gradients that referenced this pull request May 26, 2023
* added unit tests

* changed local

* switch to EMA model before quantization if it exists

* modifying method complete

* modifying method call in pre-launch callback

* removed option to get the defaults from previous training

* added unit tests passing

* updated docs and test names

* moved logger init

* comments resolved
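
One of the commits above ("switch to EMA model before quantization if it exists") implies a selection step along these lines. A hypothetical sketch; the function and parameter names are illustrative, not taken from the diff:

```python
# Hypothetical sketch of the EMA-switch behavior: if an exponential
# moving average (EMA) copy of the model exists, quantize it instead of
# the raw training weights, since EMA weights usually validate better.
def pick_model_for_quantization(model, ema_model=None):
    if ema_model is not None:
        return ema_model
    return model
```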