Remove prefixlm support for OPT and Bloom #704

Merged — 6 commits merged into mosaicml:main from tr-main-pr on Oct 30, 2023

Conversation

@dakinggg (Collaborator) commented on Oct 30, 2023

The current code is broken on transformers main because the private functions it relied on have been removed. We are removing prefixlm support for these model types entirely, since we don't believe these code paths are used, and updating them to the latest masking code in transformers would require further investigation.

Closes #703
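For context, a minimal sketch of what the converter's entry point might look like after a change like this, with OPT and Bloom dropped from the supported set so a requested conversion fails fast instead of reaching into private transformers masking helpers that no longer exist. The function name, the supported-class tuple, and the error message are illustrative assumptions, not the PR's actual diff:

```python
# Illustrative sketch only; names and the supported set are assumptions,
# not llm-foundry's actual hf_prefixlm_converter.py after this PR.
from transformers import (
    GPT2LMHeadModel,
    GPTJForCausalLM,
    GPTNeoForCausalLM,
    GPTNeoXForCausalLM,
    PreTrainedModel,
)

# OPT and Bloom are no longer listed here, so attempting a prefix-LM
# conversion for them raises immediately rather than calling removed
# private transformers masking functions.
_SUPPORTED_HF_MODELS = (
    GPT2LMHeadModel,
    GPTJForCausalLM,
    GPTNeoForCausalLM,
    GPTNeoXForCausalLM,
)


def convert_hf_causal_lm_to_prefix_lm(model: PreTrainedModel) -> PreTrainedModel:
    """Convert a supported HF causal LM to a prefix LM, or fail with a clear error."""
    if not isinstance(model, _SUPPORTED_HF_MODELS):
        raise TypeError(
            f'Cannot convert model of type {type(model).__name__} to a prefix LM; '
            f'supported types: {[cls.__name__ for cls in _SUPPORTED_HF_MODELS]}'
        )
    # ... per-architecture attention-mask patching for the supported models ...
    return model
```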

@dakinggg changed the title from "Update to not use private functions from OPT and Bloom" to "Remove prefixlm support for OPT and Bloom" on Oct 30, 2023
@dakinggg marked this pull request as ready for review on October 30, 2023 at 19:56
@alextrott16 (Contributor) left a comment

The changes to hf_prefixlm_converter.py look good, but I think you're throwing out more test code than you want.

tests/test_model.py (review thread resolved)
@alextrott16 (Contributor) left a comment

Welcome back, test_opt_wrapping. Welcome back...

Approve!

@dakinggg enabled auto-merge (squash) on October 30, 2023 at 20:42
@dakinggg merged commit db9227a into mosaicml:main on Oct 30, 2023
12 checks passed
@dakinggg deleted the tr-main-pr branch on December 11, 2023
Successfully merging this pull request may close these issues.

MPT models on the Hub not working with transformers main