
Fix completions endpoint crash #1196

Open
wants to merge 1 commit into main
Conversation

@XenonMolecule (Collaborator) commented Jun 23, 2024

Calls to the OpenAI completions endpoint crash when the stop tokens are specified as a list. This is the same issue the chat endpoint had when the chat messages were a non-hashable type. The fix for the chat endpoint was to stringify the kwargs so that the cache could hash them, then parse the string back into JSON inside the cached call.

This PR implements the same logic for the completions endpoints.
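For reference, here is a minimal sketch of that stringify-then-parse approach. The names (`completion`, `_cached_completion`, `_call_completion_api`) are illustrative, not DSPy's actual internals, and `functools.lru_cache` stands in for the real on-disk cache; the point is that an unhashable list value like `stop=["\n", "###"]` becomes a hashable JSON string before it reaches the cache.

```python
import json
from functools import lru_cache


def _call_completion_api(**kwargs):
    # Placeholder for the real OpenAI completions request.
    return {"received": kwargs}


@lru_cache(maxsize=None)
def _cached_completion(stringified_kwargs: str):
    kwargs = json.loads(stringified_kwargs)  # parse the string back into a dict
    return _call_completion_api(**kwargs)


def completion(**kwargs):
    # sort_keys keeps the cache key stable regardless of kwarg ordering
    return _cached_completion(json.dumps(kwargs, sort_keys=True))


completion(model="gpt-3.5-turbo-instruct", prompt="Hi", stop=["\n", "###"])
```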

There are two potential drawbacks I see here:
(1) This will invalidate the cache of anyone who still uses the completions endpoints in DSPy
(2) The completions endpoints are no longer supported by OpenAI and are being phased out, so maybe we should instead focus on removing support from DSPy.

A potential alternative solution would be to stringify the kwargs only when they contain an unhashable type. This would be backwards compatible with old caches, with the drawback that the hashability check would add a small cost to every call.
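A sketch of that backwards-compatible alternative, assuming a helper (hypothetical here) that builds the cache key: fall back to the stringified key only when the plain kwargs are unhashable, so calls that never used lists keep their existing cache entries.

```python
import json


def _make_cache_key(kwargs):
    key = tuple(sorted(kwargs.items()))
    try:
        hash(key)  # cheap check; raises TypeError if any value is a list/dict
        return key  # old-style key, so existing caches stay valid
    except TypeError:
        return json.dumps(kwargs, sort_keys=True)  # new-style stringified key
```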

@okhat (Collaborator) commented Jun 27, 2024

Hey this will mess with all caches?? Not a good idea now.

Can you convert lists to tuples before the cached function is called?
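A sketch of that list-to-tuple suggestion (illustrative names, not the merged fix): recursively freeze lists into tuples before the cached call so the kwargs become hashable, while calls that never passed lists keep their old cache keys.

```python
from functools import lru_cache


def _freeze(value):
    """Recursively convert lists to tuples so the value can be hashed."""
    if isinstance(value, list):
        return tuple(_freeze(v) for v in value)
    return value


@lru_cache(maxsize=None)
def _cached_completion(**kwargs):
    # Stand-in for the real cached completions request.
    return {"received": kwargs}


def completion(**kwargs):
    frozen = {k: _freeze(v) for k, v in kwargs.items()}
    return _cached_completion(**frozen)


completion(model="gpt-3.5-turbo-instruct", prompt="Hi", stop=["\n", "###"])
```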

@okhat (Collaborator) commented Jul 6, 2024

Re-tagging @XenonMolecule for the note above.
