
[feat] what's the prompt generated from Guard.from_pydantic #919

Open
jack2684 opened this issue Jul 14, 2024 · 1 comment
jack2684 commented Jul 14, 2024

Description
When using the following code, I don't know what will be sent to the LLM endpoint.

import openai
from guardrails import Guard

guard = Guard.from_pydantic(output_class=Pet, prompt=prompt)

raw_output, validated_output, *rest = guard(
    llm_api=openai.completions.create,
    model="gpt-3.5-turbo-instruct",  # the completions API takes `model`, not `engine`
)

Even the history doesn't show it, and in any case I'd rather not have to send the request successfully before I can view the prompt:

guard_llm.history.last.compiled_instructions

This will output something like:

You are a helpful assistant, able to express yourself purely through JSON, strictly and precisely adhering to the provided XML schemas.

I don't see where the provided XML schemas come from.
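Conceptually, `Guard.from_pydantic` derives an output schema from the Pydantic model and injects it into the prompt before calling the LLM. The following stdlib-only sketch illustrates that idea; the `compile_prompt` helper and the schema format are hypothetical, not Guardrails' actual implementation:

```python
import json
from dataclasses import dataclass, fields


@dataclass
class Pet:
    name: str
    age: int


def compile_prompt(output_class, prompt: str) -> str:
    # Hypothetical sketch: derive a flat field->type mapping from the
    # class and append it to the user prompt, roughly analogous to how
    # a guard injects the output schema before calling the LLM.
    schema = {f.name: f.type.__name__ for f in fields(output_class)}
    return (
        f"{prompt}\n\n"
        "Return a JSON object matching this schema:\n"
        f"{json.dumps(schema, indent=2)}"
    )


compiled = compile_prompt(Pet, "Describe a pet.")
print(compiled)
```

The real compiled prompt also carries validator instructions and formatting rules, which is exactly why inspecting it before the call would be useful for debugging.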

Why is this needed
I would like to get the original prompt for debugging purposes.

Implementation details
[If known, describe how this change should be implemented in the codebase]

End result

guard = Guard.from_pydantic(output_class=Pet, prompt=prompt)
guard.compiled_prompt_to_be_sent
@jack2684 jack2684 added the enhancement New feature or request label Jul 14, 2024

dtam commented Aug 5, 2024

Hi @jack2684, this should be available at guard.history.last.compiled_prompt. Could you share where you got the guard_llm.history.last.compiled_instructions reference, so we can update the docs accordingly? Thanks!

@dtam dtam self-assigned this Aug 5, 2024