
feature(dspy): dspy-integration-with-langfuse #1186

Open
xucailiang wants to merge 20 commits into main
Conversation


xucailiang commented Jun 21, 2024

As mentioned in issue #1179, I wrote some code to integrate Langfuse.
When configured correctly, it supports displaying data from OpenAI, AzureOpenAI, and Ollama on the Langfuse dashboard.
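For context, a minimal sketch of the data flow this enables, using only the public Langfuse v2 Python SDK; exactly how the PR hooks this into DSPy's LM classes is simplified away here, and the trace/generation names are illustrative:

```python
# Minimal sketch of what "showing up on the Langfuse dashboard" means.
# How the PR wires this into DSPy's LM classes is simplified away here.
from langfuse import Langfuse

langfuse = Langfuse(
    public_key="pk-lf-...",   # from the Langfuse project settings
    secret_key="sk-lf-...",
    host="https://cloud.langfuse.com",
)

# After an LM call, the prompt/completion pair is recorded as a generation
# inside a trace, which is what appears on the dashboard.
trace = langfuse.trace(name="dspy-call")
trace.generation(
    name="openai-completion",
    model="gpt-3.5-turbo",
    input="What is DSPy?",
    output="DSPy is a framework for programming language models.",
)
langfuse.flush()  # make sure events are sent before the process exits
```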

xucailiang changed the title from dspy-integration-with-langfuse to feature(dspy)-dspy-integration-with-langfuse on Jun 22, 2024
xucailiang changed the title from feature(dspy)-dspy-integration-with-langfuse to feature(dspy): dspy-integration-with-langfuse on Jun 22, 2024

denisergashbaev left a comment


Hey @xucailiang, I have left some comments. I am learning DSPy too and am just wondering whether the proposed integration could be done in a more elegant way.

Review threads on dsp/modules/gpt3.py and dsp/modules/ollama.py (outdated, resolved)
xucailiang closed this Jun 24, 2024
xucailiang reopened this Jun 24, 2024
xucailiang (Author) commented Jun 24, 2024

> Hey @xucailiang, I have left some comments. I am learning DSPy too and am just wondering whether the proposed integration could be done in a more elegant way.

I have rewritten the code, and the new calling method is more elegant. I realize that all LLM reasoning can be collected to Langfuse, but it is difficult for me to test all the code. I have already tested my submission; maybe we can continue to integrate other LLM modules after this PR is merged.

xucai added 4 commits on June 25, 2024, including a merge into the feature(dspy)-dspy-integration-with-langfuse branch that resolved conflicts in dsp/modules/azure_openai.py.
denisergashbaev commented

@xucailiang hey! I will check it out this weekend

xucailiang (Author) commented

> @xucailiang hey! I will check it out this weekend

Thanks for your time! I'm not sure whether we'll be able to wrap up this pull request over the weekend. If you have any suggestions or ideas to improve it, please feel free to share them with me at your earliest convenience. I'd really appreciate it!

xucailiang (Author) commented

> @xucailiang hey! I will check it out this weekend

Hi, I wonder if this discussion is still ongoing?

canada4663 commented

+1, also interested in this!

denisergashbaev commented

> @xucailiang hey! I will check it out this weekend
>
> Hi, I wonder if this discussion is still ongoing?

Hi! Unfortunately I have not had time to run it locally yet, but I will. However, I am not a DSPy maintainer, just another user, so all my comments are just my opinions :) I wonder whether it would make sense to ping one of the contributors so that they can take a look as well.

xucailiang (Author) commented

@arnavsinghvi11 Hi, I was wondering if you could review this PR so I can continue integrating Langfuse into other modules. Thank you.

arnavsinghvi11 (Collaborator) commented

Hi @xucailiang, thanks for opening this PR! Exciting to see the Langfuse integration with DSPy.

The PR looks on track so far, but I agree with your point that it's difficult to add this for every LM provider. I would suggest exploring ways to wrap the Langfuse logic not into each provider class but rather into the base lm.py class, so that any user of any LM provider can declare the Langfuse tracker as an arg and have it popped before the LM request is made.

Let me know if that makes sense and if you have any additional questions. Feel free to mention me here again when the PR is ready to review as well!
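To make the suggestion concrete, a rough sketch of what hooking the tracker into a base LM class could look like. The class and method names below (`BaseLM`, `basic_request`, the tracker's `log` method) are simplified stand-ins for illustration, not DSPy's actual lm.py internals:

```python
# Rough sketch only: BaseLM, basic_request, and the tracker's log() method are
# simplified stand-ins, not the actual classes in dsp/modules/lm.py.
class BaseLM:
    def __init__(self, model, **kwargs):
        self.model = model
        # Pop the tracker so it is never forwarded to the provider API.
        self.tracker = kwargs.pop("tracker", None)
        self.kwargs = kwargs

    def basic_request(self, prompt, **kwargs):
        raise NotImplementedError  # implemented by each provider subclass

    def request(self, prompt, **kwargs):
        response = self.basic_request(prompt, **kwargs)
        # Because the hook lives in the base class, every provider subclass
        # (OpenAI, AzureOpenAI, Ollama, ...) gets tracing for free.
        if self.tracker is not None:
            self.tracker.log(prompt=prompt, response=response, model=self.model)
        return response
```

With a hook like this, the provider classes stay untouched; only the shared request path knows about Langfuse.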

xucailiang (Author) commented Jul 9, 2024

> Hi @xucailiang, thanks for opening this PR! Exciting to see the Langfuse integration with DSPy.
>
> The PR looks on track so far, but I agree with your point that it's difficult to add this for every LM provider. I would suggest exploring ways to wrap the Langfuse logic not into each provider class but rather into the base lm.py class, so that any user of any LM provider can declare the Langfuse tracker as an arg and have it popped before the LM request is made.
>
> Let me know if that makes sense and if you have any additional questions. Feel free to mention me here again when the PR is ready to review as well!

Indeed, I have modified some of the code for this; I am glad you pointed out this oversight.

What I want to discuss is: do you mean that you want to put the tracker in the LM's self.kwargs? I put the tracker outside of self.kwargs because I didn't think it was a necessary parameter for the LLM, so we don't have to pop it.

Or is there something extra I don't know about?
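For clarity, the two options under discussion might look roughly like this; both snippets are illustrative, not the exact code in this PR:

```python
# Illustrative only: neither class is the actual code in this PR.

class OptionA:
    """arnavsinghvi11's suggestion: accept the tracker via **kwargs and pop it
    so it never reaches the provider's request payload."""
    def __init__(self, model, **kwargs):
        self.model = model
        self.tracker = kwargs.pop("tracker", None)
        self.kwargs = kwargs  # only provider-facing parameters remain


class OptionB:
    """This PR's current approach, as described above: take the tracker as an
    explicit argument so it is never part of self.kwargs and nothing needs
    to be popped."""
    def __init__(self, model, tracker=None, **kwargs):
        self.model = model
        self.tracker = tracker
        self.kwargs = kwargs
```

Both keep the tracker out of the request that is sent to the provider; they differ only in whether it passes through **kwargs at all.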
