
Feat/add transformers integration #728

Merged · 1 commit merged into outlines-dev:main on Mar 12, 2024
Conversation

saattrupdan
Contributor

This adds integration with the transformers package by providing a prefix_allowed_tokens_fn that can be passed both to transformers pipelines and directly to the generate methods of generative models from the transformers package.
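For illustration, here is a minimal sketch of how such a callback plugs into transformers' generate. The prefix_allowed_tokens_fn keyword is the standard transformers hook; the outlines-side class name and arguments used below (RegexPrefixAllowedTokens, regex_string, tokenizer_or_pipe) are assumptions about this PR's API and may differ from the merged code:

```python
# Sketch only: constrain generation to a regex via a prefix_allowed_tokens_fn.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed import path and class name from this PR; the real names may differ.
from outlines.integrations.transformers import RegexPrefixAllowedTokens

model_id = "gpt2"  # any causal LM works; gpt2 is just a small example
model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

prefix_fn = RegexPrefixAllowedTokens(
    regex_string=r"[0-9]{3}",  # only allow three-digit numbers
    tokenizer_or_pipe=tokenizer,
)

inputs = tokenizer("The answer is ", return_tensors="pt")
output = model.generate(
    **inputs,
    prefix_allowed_tokens_fn=prefix_fn,  # restricts each step to FSM-allowed tokens
    max_new_tokens=5,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The same callable can also be passed to a transformers text-generation pipeline through its generate kwargs.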

The transformers integration has been put in a new top-level integrations directory, and the existing vLLM integration has been moved into that directory as well. A reference is kept in the previous serve/vllm.py module to preserve backwards compatibility.
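For context, the retained module only needs to be a thin re-export so that existing `from outlines.serve.vllm import ...` statements keep working. This is a hypothetical sketch of such a shim; the processor names below are assumptions about what the vLLM integration exposes:

```python
# Hypothetical contents of the retained outlines/serve/vllm.py shim:
# re-export the processors from their new home for backwards compatibility.
from outlines.integrations.vllm import (  # noqa: F401
    JSONLogitsProcessor,
    RegexLogitsProcessor,
)
```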

One notable difference between vLLM and transformers is that transformers includes the input in the generated sequences. Since this can confuse the FSMs, we keep track of the input tokens and use them both to detect when we switch to a new sample (resetting the FSM) and to know which token ID prefix to strip before getting the next state from the FSM.
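A simplified sketch of that bookkeeping is shown below. It is not the PR's exact code: the FSM method names (get_next_state, get_allowed_token_ids) are placeholders, and the real implementation also has to deal with batching and tokenizer details:

```python
from typing import List

import torch


class PrefixAllowedTokensSketch:
    """Illustrative only: hide the prompt tokens from the FSM."""

    def __init__(self, fsm):
        self.fsm = fsm                    # placeholder FSM with assumed method names
        self._prompt_ids: List[int] = []  # token IDs of the current sample's prompt
        self._state = 0                   # current FSM state

    def __call__(self, batch_id: int, sent: torch.Tensor) -> List[int]:
        token_ids = sent.tolist()
        if not self._prompt_ids or token_ids[: len(self._prompt_ids)] != self._prompt_ids:
            # First call, or the stored prompt is no longer a prefix of the sequence:
            # we have moved on to a new sample, so record its prompt and reset the FSM.
            self._prompt_ids = token_ids
            self._state = 0
        else:
            # Strip the prompt prefix and advance the FSM with the newest token.
            generated = token_ids[len(self._prompt_ids):]
            if generated:
                self._state = self.fsm.get_next_state(self._state, generated[-1])
        # Token IDs the FSM allows at the current state (assumed method name).
        return self.fsm.get_allowed_token_ids(self._state)
```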

Also added an example with the transformers integration.

Closes #713

@saattrupdan requested a review from rlouf March 6, 2024 15:46
@rlouf
Member

rlouf commented Mar 8, 2024

Could you rebase your branch on main to remove the merge commits? I haven't had time to test it locally yet, but will get around to it soon. It would be great to also move the integration logic that's in outlines.models.llamacpp.py into outlines.integrations, if you have time to do it.

@saattrupdan
Contributor Author

It would be great to also move the integration logic that's in outlines.models.llamacpp.py into outlines.integrations, if you have time to do it.

Sure thing. Is it correctly understood that this concerns the LogitsProcessor, RegexLogitsProcessor, and CFGLogitsProcessor classes in the models.llamacpp module, which should be moved to integrations.llamacpp?

Could you rebase your branch on main to remove the merge commits?

Sure thing, I'll do that when the llamacpp integration is set up.

@rlouf mentioned this pull request Mar 8, 2024
@saattrupdan
Contributor Author

@rlouf The LlamaCpp integration has been moved now, and I have also adapted the imports in the models.llamacpp module to point to integrations.llamacpp. Did a rebase as well. Hopefully it's ready for a merge now!

@rlouf
Member

rlouf commented Mar 9, 2024

Will take a look early next week!

Review comment on outlines/generate/api.py (outdated, resolved)
@saattrupdan requested a review from rlouf March 11, 2024 09:42
@rlouf
Member

rlouf commented Mar 11, 2024

I just fixed merge conflicts due to a big refactor of the FSM interface. I will take a closer look at the PR tomorrow.

@Kamakshi8104

Can this be used with multimodal models too?

@rlouf
Member

rlouf commented Mar 12, 2024

Can this be used with multimodal models too?

It should!

@Kamakshi8104

Thanks!! Can't wait to try it out!!

@rlouf
Member

rlouf commented Mar 12, 2024

I was able to run the example locally, and tried the vLLM integration as well; llama.cpp is covered by the integration tests. Great job!

@Kamakshi8104 would you like to add an example using a multimodal model to the docs?

@rlouf merged commit 5d97ee1 into outlines-dev:main Mar 12, 2024
5 checks passed
@saattrupdan deleted the feat/add-transformers-integration branch March 12, 2024 14:09
@Kamakshi8104

Yes I can add an example🙂. I am in the middle of exams so will get started on it this coming week👍

@jeremyallenjacobson

Yes I can add an example🙂. I am in the middle of exams so will get started on it this coming week👍

Any update on this? Very interested in seeing a multimodal example, thanks!
