llama-use-chat

Example of using Vercel's useChat hook with Llama.cpp's /completion route. It formats the streamed data to match the OpenAI spec and lets you configure Llama.cpp parameters.

```sh
npm install
LLAMA_SERVER_URL=https://huggingface.co/spaces/matthoffner/ggml-coding-cpu/completion npm run dev
```

`LLAMA_SERVER_URL` defaults to `http://127.0.0.1:8080/completion`.
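The OpenAI-spec formatting mentioned above can be sketched as a small transform: Llama.cpp's /completion route streams SSE lines like `data: {"content":"...","stop":false}`, while useChat-compatible backends emit OpenAI-style `chat.completion.chunk` objects. The function name and exact field mapping below are illustrative assumptions, not this repo's actual implementation.

```typescript
// Shape of a streamed chunk from llama.cpp's /completion endpoint
// (simplified; the real payload carries more fields).
interface LlamaChunk {
  content: string;
  stop: boolean;
}

// Convert one llama.cpp SSE line into an OpenAI-style streaming chunk.
// Returns null for lines that carry no data payload.
// NOTE: toOpenAIChunk is a hypothetical helper name for this sketch.
function toOpenAIChunk(sseLine: string): string | null {
  if (!sseLine.startsWith("data: ")) return null;
  const parsed = JSON.parse(sseLine.slice("data: ".length)) as LlamaChunk;
  const openAIChunk = {
    object: "chat.completion.chunk",
    choices: [
      {
        index: 0,
        // On the final chunk llama.cpp sets stop=true; OpenAI signals
        // the same with an empty delta and finish_reason "stop".
        delta: parsed.stop ? {} : { content: parsed.content },
        finish_reason: parsed.stop ? "stop" : null,
      },
    ],
  };
  return `data: ${JSON.stringify(openAIChunk)}\n\n`;
}
```

A proxy route would apply this line-by-line to the upstream stream before handing it to useChat.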
