
openai: raw response headers #24150

Merged
efriis merged 14 commits into master from erick/openai-raw-response-headers on Jul 16, 2024
Conversation

efriis (Member) commented Jul 11, 2024

No description provided.

@efriis efriis added the partner label Jul 11, 2024
vercel bot commented Jul 11, 2024

1 Skipped Deployment: langchain (Ignored), updated Jul 15, 2024 11:58pm UTC

@efriis efriis self-assigned this Jul 11, 2024
@dosubot dosubot bot added the size:XS (This PR changes 0-9 lines, ignoring generated files) label Jul 11, 2024
@efriis efriis marked this pull request as draft July 11, 2024 21:27
@efriis efriis changed the title from "openai: raw response headers" to "[wip] openai: raw response headers" Jul 11, 2024
@dosubot dosubot bot added the 🔌: openai (Primarily related to OpenAI integrations) and 🤖:improvement (Medium size change to existing code to handle new use-cases) labels Jul 11, 2024
@@ -510,7 +512,9 @@ def _stream(
         kwargs["stream"] = True
         payload = self._get_request_payload(messages, stop=stop, **kwargs)
         default_chunk_class: Type[BaseMessageChunk] = AIMessageChunk
-        with self.client.create(**payload) as response:
+        with self.client.with_raw_response.create(**payload) as raw_response:
efriis (Member, Author) commented:
Tradeoff: this code is more maintainable, but it won't tolerate a custom client/async_client that doesn't implement with_raw_response (i.e. non-OpenAI clients).

We could tolerate those clients by only going through with_raw_response when include_response_headers is set to true.

I'm generally in favor of maintainability here. Happy to discuss.
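(Illustration only, not part of the PR: a rough sketch of the fallback described above, assuming include_response_headers is a flag on the model instance. Names and structure are assumptions, not the merged code.)

```python
# Hypothetical sketch: only go through with_raw_response when headers are
# requested, so a custom client/async_client without it keeps working.
if self.include_response_headers:
    raw_response = self.client.with_raw_response.create(**payload)
    headers = {"headers": dict(raw_response.headers)}
    response = raw_response.parse()  # unwrap back to the regular stream
else:
    headers = {}
    response = self.client.create(**payload)
```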

efriis (Member, Author) commented:
Switching back so this doesn't break things.

efriis (Member, Author) commented:
These were just copy-pasted from the openai package but rely on its internals; deleting them.

@efriis efriis requested a review from baskaryan July 15, 2024 20:46
@efriis efriis marked this pull request as ready for review July 15, 2024 20:46
@dosubot dosubot bot added the size:L (This PR changes 100-499 lines, ignoring generated files) and langchain (Related to the langchain package) labels and removed the size:XS label Jul 15, 2024
@efriis efriis changed the title from "[wip] openai: raw response headers" to "openai: raw response headers" Jul 15, 2024
@dosubot dosubot bot added the lgtm (PR looks good. Use to confirm that a PR is ready for merging.) label Jul 16, 2024
@efriis efriis merged commit 1e9cc02 into master Jul 16, 2024
32 checks passed
@efriis efriis deleted the erick/openai-raw-response-headers branch July 16, 2024 16:54
ccurme added a commit that referenced this pull request Jul 17, 2024: "Fixes small issues introduced in #24150 (unreleased)."
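(Not from the thread: a hedged usage sketch of what this change enables, assuming the include_response_headers flag discussed above surfaces the headers on the returned message's response_metadata; the exact field name is an assumption.)

```python
from langchain_openai import ChatOpenAI

# Assumed usage: opt in to raw headers, then read them off the response
# metadata. The "headers" key is an assumption, not confirmed by this thread.
llm = ChatOpenAI(include_response_headers=True)
msg = llm.invoke("hello")
print(msg.response_metadata.get("headers", {}))
```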
Labels
🤖:improvement (Medium size change to existing code to handle new use-cases)
langchain (Related to the langchain package)
lgtm (PR looks good. Use to confirm that a PR is ready for merging.)
🔌: openai (Primarily related to OpenAI integrations)
partner
size:L (This PR changes 100-499 lines, ignoring generated files)