I attempted to run a simple test of Outlines with gemma2b, but unfortunately encountered the following error.
```
[rank0]: Traceback (most recent call last):
[rank0]:   File "1.py", line 31, in <module>
[rank0]:     character = generator(
[rank0]:                 ^^^^^^^^^^
[rank0]:   File "miniconda3/envs/mamba/lib/python3.11/site-packages/outlines/generate/api.py", line 511, in __call__
[rank0]:     return format(completions)
[rank0]:            ^^^^^^^^^^^^^^^^^^^
[rank0]:   File "miniconda3/envs/mamba/lib/python3.11/site-packages/outlines/generate/api.py", line 497, in format
[rank0]:     return self.format_sequence(sequences)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "miniconda3/envs/mamba/lib/python3.11/site-packages/outlines/generate/json.py", line 60, in <lambda>
[rank0]:     generator.format_sequence = lambda x: pyjson.loads(x)
[rank0]:                                           ^^^^^^^^^^^^^^^
[rank0]:   File "miniconda3/envs/mamba/lib/python3.11/json/__init__.py", line 346, in loads
[rank0]:     return _default_decoder.decode(s)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "miniconda3/envs/mamba/lib/python3.11/json/decoder.py", line 337, in decode
[rank0]:     obj, end = self.raw_decode(s, idx=_w(s, 0).end())
[rank0]:                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "miniconda3/envs/mamba/lib/python3.11/json/decoder.py", line 353, in raw_decode
[rank0]:     obj, end = self.scan_once(s, idx)
[rank0]:                ^^^^^^^^^^^^^^^^^^^^^^
[rank0]: json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 38 (char 37)
```
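For context (my own reading of the traceback, not a confirmed diagnosis): an "Unterminated string" from `json.loads` typically means the raw model output ended mid-string, e.g. because generation stopped at the token limit before emitting the closing quote. A minimal stdlib-only illustration, with hypothetical example text:

```python
import json

# A truncated JSON payload, similar to what a generation cut off
# mid-string would produce (the field values here are made up):
truncated = '{"name": "Aria", "age": 23, "armor": "lea'

try:
    json.loads(truncated)
except json.JSONDecodeError as err:
    # The decoder reports the position of the string it never saw closed.
    print(err.msg)  # Unterminated string starting at
```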
```python
from vllm.sampling_params import SamplingParams
from outlines import models, generate
import outlines

schema = """{
    "$defs": {
        "Armor": {
            "enum": ["leather", "chainmail", "plate"],
            "title": "Armor",
            "type": "string"
        }
    },
    "properties": {
        "name": {"maxLength": 10, "title": "Name", "type": "string"},
        "age": {"title": "Age", "type": "integer"},
        "armor": {"$ref": "#/$defs/Armor"},
        "strength": {"title": "Strength", "type": "integer"}
    },
    "required": ["name", "age", "armor", "strength"],
    "title": "Character",
    "type": "object"
}"""

model = models.vllm("PATH_TO_GEMMA2B", tensor_parallel_size=8)
generator = outlines.generate.json(model, schema)
character = generator(
    "Generate a new character for my awesome game: "
    + "name, age (between 1 and 99), armor and strength. "
)
print(character)
```
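As an aside, an alternative to the hand-written schema string is to define the same structure as a Pydantic model and let Pydantic emit the JSON schema, which avoids copy-paste typos in the raw string. A sketch, assuming Pydantic v2 is available (Outlines depends on it); `outlines.generate.json` can also be given the model class directly instead of the string:

```python
from enum import Enum

from pydantic import BaseModel, Field


class Armor(str, Enum):
    leather = "leather"
    chainmail = "chainmail"
    plate = "plate"


class Character(BaseModel):
    name: str = Field(max_length=10)
    age: int
    armor: Armor
    strength: int


# Emits the same structure as the hand-written schema string above.
schema = Character.model_json_schema()
print(schema["required"])  # ['name', 'age', 'armor', 'strength']
```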
Version information
(command output here)
When I try llama3-8B-instruct, I get:
```
[rank0]: Traceback (most recent call last):
[rank0]:   File "/data/ruanjh/best_training_method/11.py", line 31, in <module>
[rank0]:     character = generator(
[rank0]:                 ^^^^^^^^^^
[rank0]:   File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/site-packages/outlines/generate/api.py", line 511, in __call__
[rank0]:     return format(completions)
[rank0]:            ^^^^^^^^^^^^^^^^^^^
[rank0]:   File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/site-packages/outlines/generate/api.py", line 497, in format
[rank0]:     return self.format_sequence(sequences)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/site-packages/outlines/generate/json.py", line 60, in <lambda>
[rank0]:     generator.format_sequence = lambda x: pyjson.loads(x)
[rank0]:                                           ^^^^^^^^^^^^^^^
[rank0]:   File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/json/__init__.py", line 346, in loads
[rank0]:     return _default_decoder.decode(s)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/json/decoder.py", line 337, in decode
[rank0]:     obj, end = self.raw_decode(s, idx=_w(s, 0).end())
[rank0]:                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/json/decoder.py", line 353, in raw_decode
[rank0]:     obj, end = self.scan_once(s, idx)
[rank0]:                ^^^^^^^^^^^^^^^^^^^^^^
[rank0]: json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 32 (char 31)
ERROR 07-30 19:03:51 multiproc_worker_utils.py:120] Worker VllmWorkerProcess pid 122794 died, exit code: -15
INFO 07-30 19:03:51 multiproc_worker_utils.py:123] Killing local vLLM worker processes
/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/multiprocessing/resource_tracker.py:254: UserWarning: resource_tracker: There appear to be 2 leaked shared_memory objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '
```
Likely related to #985
I'm working on a fix for a few JSON schema issues that have appeared. Thank you for your patience.