
Releases: taketwo/llm-ollama

0.5.0

31 Jul 07:40
bb3e92b
  • Add support for forcing the model to reply with a valid JSON object

0.4.3

02 Jul 04:49
40c6600
  • Fix the type of the stop option. This makes the option usable through the llm Python API; however, it is not clear how to pass it through the CLI.

0.4.2

12 Jun 09:53
e163fc5
  • Ignore KeyError when iterating over response messages in streaming mode

0.4.1

29 May 19:59
55bf578
  • Prevent a failure to communicate with the Ollama server from breaking the entire llm CLI

0.4.0

22 May 07:10
  • Add missing pydantic dependency

0.3.0

07 May 08:19
97359cf

0.2.0

28 Jan 07:36
62ea462
  • Switch to the official Ollama Python library instead of the raw HTTP API
  • Automatically create aliases for identical models with different names

0.1.0

20 Jan 15:11
5e8cea2

Initial release.