
Fix vLLM integration #711

Merged
rlouf merged 6 commits into outlines-dev:main from fix/vllm-integration on Feb 27, 2024
Conversation

saattrupdan (Contributor) commented Feb 27, 2024

When integrating Outlines with vLLM, I ran into the following issues, which this PR fixes:

  1. When calling `vllm.LLM.generate`, vLLM internally makes a `copy.deepcopy` of the `SamplingParams` object, which includes the Outlines logits processor (say, a `RegexLogitsProcessor`). This requires everything to be pickleable, but `RegexLogitsProcessor.fsm.vocabulary` was a `dict_values` object, which is not. The fix is easy: just convert it to a list, which doesn't affect how the vocabulary is used anywhere in the code. A minimal sketch of the failure and the fix follows this list.
  2. `RegexLogitsProcessor` takes an `llm` argument whose docstring says it should be a `vllm.LLM` object, but it then tries to extract the underlying tokenizer via `llm.tokenizer.tokenizer`. The tokenizer of `vllm.LLM` currently lives in the `vllm.LLM.llm_engine.tokenizer.tokenizer` attribute, but that path is a big mess and isn't backwards compatible with previous vLLM versions. vLLM instead provides a convenience method, `vllm.LLM.get_tokenizer`, which fetches the tokenizer, so we use that. To remain backwards compatible, in case people have supplied `vllm.LLM.llm_engine` directly to `RegexLogitsProcessor`, the code falls back to a `tokenizer` or `tokenizer.tokenizer` attribute; see the fallback sketch after this list.

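A minimal sketch of the pickling failure and the fix from point 1 (the vocabulary dict here is illustrative; the real one is the FSM's token vocabulary):

```python
import copy

# Illustrative stand-in for RegexLogitsProcessor.fsm.vocabulary before the fix.
vocabulary = {"<s>": 0, "a": 1, "b": 2}

# Before: a dict_values view was stored, and copy.deepcopy cannot handle it.
values_view = vocabulary.values()
try:
    copy.deepcopy(values_view)
except TypeError as exc:
    print(exc)  # cannot pickle 'dict_values' object

# After: the view is materialized as a list, which deep-copies fine and is
# used the same way wherever the vocabulary is iterated.
values_list = list(vocabulary.values())
assert copy.deepcopy(values_list) == values_list
```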
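
A sketch of the tokenizer-fetching fallback from point 2 (the helper name is hypothetical; the actual logic lives inside the Outlines vLLM integration):

```python
def get_underlying_tokenizer(llm):
    """Fetch the underlying Hugging Face tokenizer.

    Accepts a vllm.LLM object, and falls back to engine-like objects such as
    vllm.LLM.llm_engine for backwards compatibility.
    """
    if hasattr(llm, "get_tokenizer"):
        # vllm.LLM: use the stable convenience method.
        return llm.get_tokenizer()
    if hasattr(llm, "tokenizer"):
        # Engine-like objects: the tokenizer may be wrapped once more
        # (tokenizer.tokenizer) or exposed directly (tokenizer).
        if hasattr(llm.tokenizer, "tokenizer"):
            return llm.tokenizer.tokenizer
        return llm.tokenizer
    raise ValueError("The provided object does not expose a tokenizer.")
```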
I also updated the vLLM example script, which was outdated as well (it still used the old `_patched_apply_logits_processors`); a sketch of the updated usage follows.
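
A sketch of the updated usage, passing the processor through vLLM's `SamplingParams` instead of patching the engine. The model name and the `RegexLogitsProcessor(regex_string, llm)` signature are assumptions based on Outlines at the time of this PR:

```python
from vllm import LLM, SamplingParams

from outlines.serve.vllm import RegexLogitsProcessor

llm = LLM(model="mistralai/Mistral-7B-v0.1")

# Constrain generation to exactly three digits.
logits_processor = RegexLogitsProcessor(r"[0-9]{3}", llm)
sampling_params = SamplingParams(max_tokens=16, logits_processors=[logits_processor])

outputs = llm.generate(["What is 6 times 7? Answer: "], sampling_params)
print(outputs[0].outputs[0].text)
```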

Closes #704

rlouf (Member) commented Feb 27, 2024

Thank you for taking the time to submit a PR. This looks great!

rlouf merged commit d938678 into outlines-dev:main on Feb 27, 2024
5 checks passed
saattrupdan deleted the fix/vllm-integration branch on February 27, 2024 at 22:02
Successfully merging this pull request may close these issues: Compatibility issue with vllm 0.32