
feature(dspy): dspy-integration-with-langfuse #1186

Merged
35 commits:
5d2728b  dspy-integration-with-langfuse (Jun 21, 2024)
7941e3a  ollama support langfuse (Jun 22, 2024)
8d02e99  ollama support langfuse (Jun 22, 2024)
3890b9d  Merge branch 'main' of https://github.com/xucailiang/dspy into featur… (Jun 24, 2024)
1164399  new BaseTracker && use LangfuseTracker (Jun 24, 2024)
6063d0f  new BaseTracker && use LangfuseTracker && edit README.md (Jun 24, 2024)
479e7a1  new BaseTracker && new LangfuseTracker && edit README.md (Jun 24, 2024)
d000151  new BaseTracker && new LangfuseTracker && edit README.md (Jun 24, 2024)
9f33208  new BaseTracker && new LangfuseTracker && edit README.md (Jun 24, 2024)
c34a323  Merge branch 'main' of https://github.com/xucailiang/dspy into featur… (Jun 25, 2024)
7dbeedb  langfuse: think of kwargs as metadata (Jun 25, 2024)
16d6934  Merge branch 'main' of https://github.com/xucailiang/dspy into featur… (Jun 26, 2024)
4d10a2d  Merge branch 'main' of https://github.com/xucailiang/dspy into featur… (Jun 28, 2024)
2c78181  support tracker_decorator (Jul 4, 2024)
f0fadac  support tracker_decorator (Jul 4, 2024)
4341f2d  Merge branch 'main' of https://github.com/xucailiang/dspy into featur… (Jul 4, 2024)
b23818b  support tracker_decorator (Jul 4, 2024)
3f72c09  Merge branch 'main' of https://github.com/xucailiang/dspy into featur… (Jul 9, 2024)
12af731  LM module adds tracker parameters & Mapping kwargs to metadata (Jul 9, 2024)
b4ad853  tracker should be placed outside of kwargs(issubclass(args[0].tracker… (Jul 9, 2024)
5aca3f2  Merge branch 'main' of https://github.com/xucailiang/dspy into featur… (Jul 31, 2024)
1af665f  Merge branch 'main' of https://github.com/xucailiang/dspy into featur… (Aug 2, 2024)
2ae10c7  Supports users to manually call the tracker. (Aug 3, 2024)
0243368  Support manual call of tracker & supplementary document (Aug 4, 2024)
6661e19  Merge branch 'main' of https://github.com/xucailiang/dspy into featur… (Aug 9, 2024)
2a73b0d  Support langfuse & add md file (Aug 13, 2024)
9db9564  Merge branch 'main' of https://github.com/xucailiang/dspy into featur… (Aug 13, 2024)
b9aa5b7  Support langfuse & add md file (Aug 14, 2024)
c0853ec  Support langfuse & add md file (Aug 14, 2024)
ea3d37a  Support langfuse & add md file (Aug 14, 2024)
e542950  Update about_prompt_visible.md (arnavsinghvi11, Aug 15, 2024)
bc0af45  Update gpt3.py (arnavsinghvi11, Aug 15, 2024)
47a7c0a  Update azure_openai.py (arnavsinghvi11, Aug 15, 2024)
500ff5e  Update azure_openai.py (arnavsinghvi11, Aug 15, 2024)
78afb06  Update gpt3.py (arnavsinghvi11, Aug 15, 2024)
10 changes: 8 additions & 2 deletions README.md
@@ -75,12 +75,18 @@ Or open our intro notebook in Google Colab: [<img align="center" src="https://co

By default, DSPy installs the latest `openai` from pip. However, if you have an older version installed from before OpenAI changed their API (`openai~=0.28.1`), the library will use that just fine. Both are supported.

For the optional (alphabetically sorted) [Chromadb](https://github.com/chroma-core/chroma), [Groq](https://github.com/groq/groq-python), [Marqo](https://github.com/marqo-ai/marqo), [Milvus](https://github.com/milvus-io/milvus), [MongoDB](https://www.mongodb.com), [MyScaleDB](https://github.com/myscale/myscaledb), Pinecone, [Qdrant](https://github.com/qdrant/qdrant), [Snowflake](https://github.com/snowflakedb/snowpark-python), or [Weaviate](https://github.com/weaviate/weaviate) retrieval integration(s), include the extra(s) below:
For the optional (alphabetically sorted) [Chromadb](https://github.com/chroma-core/chroma), [Groq](https://github.com/groq/groq-python), [Marqo](https://github.com/marqo-ai/marqo), [Milvus](https://github.com/milvus-io/milvus), [MongoDB](https://www.mongodb.com), [MyScaleDB](https://github.com/myscale/myscaledb), Pinecone, [Qdrant](https://github.com/qdrant/qdrant), [Snowflake](https://github.com/snowflakedb/snowpark-python), or [Weaviate](https://github.com/weaviate/weaviate) retrieval integration(s), or the [Langfuse](https://langfuse.com/) tracing integration, include the extra(s) below:

```
pip install dspy-ai[chromadb] # or [groq] or [marqo] or [milvus] or [mongodb] or [myscale] or [pinecone] or [qdrant] or [snowflake] or [weaviate]
pip install dspy-ai[chromadb] # or [groq] or [marqo] or [milvus] or [mongodb] or [myscale] or [pinecone] or [qdrant] or [snowflake] or [weaviate] or [langfuse]
```

**Collaborator (review comment):** can we actually add this to the DSPy docs instead of in the README? Feel free to add this to https://github.com/stanfordnlp/dspy/tree/main/docs/api/language_model_clients

Langfuse is now supported (OpenAI, AzureOpenAI, and Ollama)!

Before configuring Langfuse, deploy the Langfuse server yourself or use Langfuse Cloud; you will receive the corresponding keys after creating a new project. Then set the environment variables `LANGFUSE_SECRET_KEY`, `LANGFUSE_PUBLIC_KEY`, and `LANGFUSE_HOST` in your project. Langfuse reads them automatically. See the [Langfuse self-hosting guide](https://langfuse.com/docs/deployment/self-host) for details.

## 2) Documentation

The DSPy documentation is divided into **tutorials** (step-by-step illustration of solving a task in DSPy), **guides** (how to use specific parts of the API), and **examples** (self-contained programs that illustrate usage).
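To make the README change above concrete, here is a minimal usage sketch (not part of this PR): the key values, host URL, and model name are placeholders, and it assumes the `langfuse` extra is installed so that DSPy picks up the Langfuse-wrapped OpenAI client.

```python
import os

# Placeholder keys: copy the real values from your Langfuse project settings.
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # or your self-hosted URL
# OPENAI_API_KEY still needs to be set as usual.

import dspy

# With the `langfuse` extra installed, the OpenAI client imported by DSPy is the
# Langfuse-wrapped one, so completions are traced automatically.
lm = dspy.OpenAI(model="gpt-3.5-turbo", max_tokens=250)
dspy.settings.configure(lm=lm)

print(lm("Say hello in one short sentence."))
```

Because the wrapper is a drop-in replacement, no other code changes are required; if `langfuse` is not installed or misconfigured, requests go straight to the real endpoint.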
15 changes: 13 additions & 2 deletions dsp/modules/azure_openai.py
@@ -2,9 +2,20 @@
import json
import logging
from typing import Any, Literal, Optional, cast

import backoff
import openai

try:
    # Langfuse provides a drop-in wrapper around the OpenAI client. If the
    # Langfuse configuration is missing or invalid, requests fall back to the
    # real endpoint (OpenAI or Azure).
    import langfuse
    from langfuse.openai import openai

    logging.info(f"You are using Langfuse, version {langfuse.__version__}")
except ImportError:
    import openai

    logging.info(f"You are using openai, version {openai.version.__version__}")


from dsp.modules.cache_utils import CacheMemory, NotebookCacheMemory, cache_turn_on
from dsp.modules.lm import LM
14 changes: 12 additions & 2 deletions dsp/modules/gpt3.py
@@ -2,9 +2,19 @@
import json
import logging
from typing import Any, Literal, Optional, cast

import backoff
import openai

try:
    # Langfuse provides a drop-in wrapper around the OpenAI client. If the
    # Langfuse configuration is missing or invalid, requests fall back to the
    # real endpoint (OpenAI or Azure).
    import langfuse
    from langfuse.openai import openai

    logging.info(f"You are using Langfuse, version {langfuse.__version__}")
except ImportError:
    import openai

    logging.info(f"You are using openai, version {openai.version.__version__}")

from dsp.modules.cache_utils import CacheMemory, NotebookCacheMemory, cache_turn_on
from dsp.modules.lm import LM
17 changes: 16 additions & 1 deletion dsp/modules/ollama.py
@@ -1,11 +1,19 @@
import datetime
import hashlib
import uuid
from typing import Any, Literal

import requests

from dsp.modules.lm import LM

try:
    from langfuse import Langfuse

    # If you need higher throughput, tune the Langfuse client (e.g. its
    # thread settings).
    langfuse = Langfuse(max_retries=2)
    LANGFUSE = True  # tracing enabled
except ImportError:
    LANGFUSE = False


def post_request_metadata(model_name, prompt):
"""Creates a serialized request object for the Ollama API."""
@@ -134,7 +142,14 @@ def basic_request(self, prompt: str, **kwargs):
            "raw_kwargs": raw_kwargs,
        }
        self.history.append(history)

        if LANGFUSE:
            langfuse.trace(
                name="Ollama request",
                user_id=str(uuid.uuid4()),
                metadata={**settings_dict['options'], **request_info["usage"]},
                input=prompt,
                output=request_info['choices'],
            )
        return request_info

    def request(self, prompt: str, **kwargs):
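For context, a minimal usage sketch of the Ollama path above (not part of the PR): it assumes a local Ollama server on its default port, the `langfuse` extra installed, and the `LANGFUSE_*` environment variables set; `dspy.OllamaLocal` is the existing Ollama client in DSPy and the model name is only illustrative.

```python
import dspy

# Assumes a running local Ollama server and LANGFUSE_* environment variables
# already set; "llama3" is just an example model name.
lm = dspy.OllamaLocal(model="llama3", max_tokens=200)
dspy.settings.configure(lm=lm)

# When the `langfuse` import at the top of ollama.py succeeded (LANGFUSE is
# True), each basic_request() emits an "Ollama request" trace; otherwise the
# client behaves exactly as before.
print(lm("Name one benefit of tracing LLM calls."))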
3 changes: 2 additions & 1 deletion setup.py
@@ -38,7 +38,8 @@
        "google-vertex-ai": ["google-cloud-aiplatform==1.43.0"],
        "myscale": ["clickhouse-connect"],
        "groq": ["groq~=0.8.0"],
    },
        "langfuse": ["langfuse~=2.36.1"]
    },
    classifiers=[
        "Development Status :: 3 - Alpha",
        "Intended Audience :: Science/Research",