
Modifier Refactor OBCQ Implementation #1737

Merged: 189 commits merged from refactor_obcq into main on Oct 11, 2023
Conversation

@Satrat (Contributor) commented Sep 26, 2023

Summary

Implementation of OBCQ in the new modifiers framework, tested for OPT and Llama so far. This shouldn't be merged to main until #1713 is merged in. The main changes are in src/sparseml/modifiers/obcq.

  • A new modifier is added for the SparseGPT algorithm, building off the code in src/sparseml/experimental/sparsegpt (a minimal sketch of the core update follows this list).
  • Added a quantization modifier to the new modifier framework; currently it only supports one-shot.
  • Added a post-one-shot calibration parameter for recalibrating quantization statistics after OBCQ.
  • Added a dataset registry for c4, ptb, wikitext2, and open_platypus.
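
For orientation, here is a minimal sketch of the OBS-style, column-by-column weight update at the heart of the SparseGPT algorithm that the new modifier implements. This is not the code added in this PR: the function name and arguments are hypothetical, the pruning mask is chosen once globally here rather than block by block as the real implementation does, and quantization is omitted.

import torch

@torch.no_grad()
def sparsegpt_prune(weight, hessian, sparsity, damp_frac=0.01):
    # weight:  (rows, cols) weights of one linear layer
    # hessian: (cols, cols) accumulation of 2 * X @ X.T from calibration activations
    W = weight.clone().float()
    H = hessian.clone().float()
    cols = W.shape[1]

    # Columns that never activate carry no signal; zero them outright.
    dead = torch.diag(H) == 0
    H[dead, dead] = 1.0
    W[:, dead] = 0.0

    # Dampen the Hessian, invert it, and keep the upper Cholesky factor of the inverse.
    H += damp_frac * torch.mean(torch.diag(H)) * torch.eye(cols, device=W.device)
    Hinv = torch.linalg.cholesky(torch.cholesky_inverse(torch.linalg.cholesky(H)), upper=True)

    # Saliency of each weight is w^2 / d^2, with d the Cholesky diagonal;
    # the lowest-saliency weights are pruned to reach the target sparsity.
    saliency = W ** 2 / torch.diag(Hinv).reshape(1, -1) ** 2
    mask = saliency <= torch.quantile(saliency.flatten(), sparsity)

    # Sweep columns left to right: zero the masked weights and fold the
    # induced error into the columns that have not been processed yet.
    for j in range(cols):
        err = (W[:, j] * mask[:, j]) / Hinv[j, j]
        W[:, j:] -= err.unsqueeze(1) * Hinv[j, j:].unsqueeze(0)
        W[:, j][mask[:, j]] = 0.0
    return W.to(weight.dtype)

The modifier applies this kind of update sequentially, layer by layer, accumulating the Hessian from the calibration data as it goes.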

Instructions

To run for OPT:

python src/sparseml/transformers/sparsification/obcq/obcq.py facebook/opt-1.3b c4 \
    --recipe src/sparseml/transformers/sparsification/obcq/example.yaml

To run for Llama, you'll need a local copy of the model:

python src/sparseml/transformers/sparsification/obcq/obcq.py /local/path/to/llama open_platypus \
    --recipe src/sparseml/transformers/sparsification/obcq/example_llama.yaml

Testing

To compare the perplexity evaluation between the experimental implementation and the productionized version for OPT, run:

python src/sparseml/experimental/sparsegpt/examples/opt/compare_obcq.py

To compare perplexity for Llama, run:

python src/sparseml/experimental/sparsegpt/examples/llama2/compare_obcq.py
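
Both compare scripts boil down to the same metric. As a rough illustration of what is being compared (this is not the repo's code; model_path and samples are placeholders, and it assumes the one-shot outputs are saved as Hugging Face checkpoints), perplexity over a held-out set can be computed like this:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

@torch.no_grad()
def perplexity(model_path, samples, max_length=2048):
    # samples: list of raw text strings from the evaluation split
    tokenizer = AutoTokenizer.from_pretrained(model_path)
    model = AutoModelForCausalLM.from_pretrained(model_path).eval()
    total_nll, total_tokens = 0.0, 0
    for text in samples:
        ids = tokenizer(text, return_tensors="pt", truncation=True,
                        max_length=max_length).input_ids
        # Passing labels=input_ids makes the model return mean token cross-entropy.
        loss = model(ids, labels=ids).loss
        num_predicted = ids.numel() - 1  # first token has no prediction target
        total_nll += loss.item() * num_predicted
        total_tokens += num_predicted
    return float(torch.exp(torch.tensor(total_nll / total_tokens)))

Run on the checkpoints produced by the experimental and productionized pipelines, the two perplexities should closely match, which is what the compare scripts check.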

natuan and others added 30 commits August 17, 2023 23:28

…arsegpt/llama2 (merge commit with conflicts resolved in):
src/sparseml/experimental/sparsegpt/dispatch.py
src/sparseml/experimental/sparsegpt/layer_compressor.py
src/sparseml/experimental/sparsegpt/main.py
src/sparseml/experimental/sparsegpt/model_preprocessor.py
src/sparseml/experimental/sparsegpt/opt.py
src/sparseml/experimental/sparsegpt/sequential.py
src/sparseml/experimental/sparsegpt/sparsegpt.py
Base automatically changed from sparsification-refactor to main October 4, 2023 19:08
Review threads (resolved) on:
src/sparseml/modifiers/obcq/base.py
src/sparseml/modifiers/obcq/pytorch.py
src/sparseml/modifiers/obcq/utils/data.py
src/sparseml/modifiers/obcq/utils/models.py
src/sparseml/modifiers/obcq/utils/sparsegpt.py
@Satrat requested a review from bfineran October 6, 2023 19:32
bfineran previously approved these changes Oct 6, 2023

@bfineran (Member) left a comment:

latest commits look great. good to go from my end

Review threads on:
src/sparseml/modifiers/obcq/base.py
src/sparseml/transformers/data/base_llm.py
Co-authored-by: Benjamin Fineran <bfineran@users.noreply.github.com>
bfineran previously approved these changes Oct 9, 2023
@bfineran merged commit 6c3f054 into main Oct 11, 2023
11 checks passed
@bfineran deleted the refactor_obcq branch October 11, 2023 17:05