
How can Agenta be integrated with the Ollama LLM platform? #1889

Answered by mmabrouk
kmx2 asked this question in Q&A

Hey @kmx2 ,

This should be simple to do. You need to create a custom application that calls your local Ollama endpoint. You can see an example of how to create such a custom application here: https://docs.agenta.ai/guides/tutorials/deploy-mistral-model
In that tutorial we call Mistral on Hugging Face, but you can do exactly the same thing by calling Ollama locally.
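To make the suggestion concrete, here is a minimal sketch of the Ollama-calling part such a custom application would wrap. This is not from the Agenta tutorial: the endpoint URL is Ollama's default local REST endpoint, and the model name `llama3` is an assumption; swap in whichever model you have pulled.

```python
# Hypothetical sketch: calling a local Ollama server with the standard library.
# http://localhost:11434/api/generate is Ollama's default endpoint; "llama3"
# is a placeholder model name. Wrap generate() in your Agenta custom app.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    # stream=False asks Ollama for a single JSON object instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "llama3") -> str:
    """Send the prompt to the local Ollama server and return the completion."""
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

From there, the structure of the linked tutorial applies unchanged: only the function that produces the completion differs, since Ollama runs locally and needs no API key.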

Replies: 3 comments 1 reply

Answer selected by mmabrouk
Category: Q&A
Labels: question (Further information is requested)
2 participants
This discussion was converted from issue #1887 on July 15, 2024 10:45.