StreamDeploy offers this free LLM application template, which uses SvelteKit for the frontend, Python for the backend, and Ollama as the LLM service provider.

StreamDeploy-DevRel/streamdeploy-llm-app-scaffold
StreamDeploy LLM App Scaffold

This repo contains a pre-architected, production-ready LLM application that can be easily deployed to the cloud, thanks to the independent containers used for the frontend, the backend, and the Ollama LLM service.
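For orientation, a three-container setup like this is typically wired together in a compose file along the following lines. This is an illustrative sketch only; the service names, build paths, and ports below are assumptions, and the repo's actual docker-compose.yml is authoritative:

```yaml
# Illustrative sketch — see the repo's docker-compose.yml for the real
# service names, images, and ports.
services:
  frontend:            # SvelteKit app
    build: ./frontend
    ports:
      - "3000:3000"
  backend:             # Python API
    build: ./backend
    ports:
      - "8000:8000"
    depends_on:
      - ollama
  ollama:              # Ollama LLM service
    image: ollama/ollama
    ports:
      - "11434:11434"  # Ollama's default API port
```

Keeping the three services in separate containers is what lets each one be scaled or replaced independently when deploying to the cloud.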

Run the following from the root of the llm-app-scaffold directory:

docker-compose up --build

To use a model from Ollama, run ollama pull. For example, to use Mistral, run the following:

ollama pull mistral

Then run the docker compose command above. (If Ollama runs only inside the compose stack rather than on your host, run the pull command inside the Ollama container instead.)
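Once the stack is up, the backend talks to Ollama over its HTTP API (port 11434 by default); the /api/generate endpoint streams its answer as newline-delimited JSON chunks. A minimal Python sketch of assembling such a stream into one string — the sample chunks below are illustrative, following Ollama's documented streaming format:

```python
import json

def collect_response(ndjson_lines):
    """Concatenate the 'response' fragments from Ollama's streaming
    newline-delimited JSON output into a single string."""
    parts = []
    for line in ndjson_lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):   # final chunk carries done=true
            break
    return "".join(parts)

# Chunks of the shape Ollama streams back from /api/generate:
sample = [
    '{"model":"mistral","response":"Hello","done":false}',
    '{"model":"mistral","response":" world","done":false}',
    '{"model":"mistral","done":true}',
]
print(collect_response(sample))  # Hello world
```

In a real backend the lines would come from iterating over the HTTP response body rather than a list, but the chunk format is the same.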

[Screenshot: LLM Application Scaffold]
