Releases: run-llama/llama_index
v0.4.23
LlamaIndex 0.4.23 🦙 brings some exciting new functionality:
- Upgraded the LlamaIndex memory module for use with LangChain agents.
- Added an “empty” Index - more explicitly combine the LLM's prior knowledge with a knowledge corpus.
- 🚢 Steamship file reader (thanks @douglas-reid)
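The "empty" index idea is an index holding no documents, so queries go straight to the LLM's prior knowledge rather than through retrieval. A minimal self-contained sketch of the concept (the `StubLLM`, `EmptyIndex`, and `VectorIndex` names here are illustrative stand-ins, not the library's actual API):

```python
# Illustrative sketch of the "empty index" idea: an index with no
# documents routes queries directly to the LLM, letting you combine
# prior model knowledge with a document-backed index explicitly.

class StubLLM:
    """Stand-in for a real LLM client (hypothetical)."""
    def complete(self, prompt: str) -> str:
        return f"LLM answer to: {prompt}"

class EmptyIndex:
    """Holds no documents; queries use only the LLM's prior knowledge."""
    def __init__(self, llm):
        self.llm = llm

    def query(self, question: str) -> str:
        # No retrieval step -- the question goes directly to the model.
        return self.llm.complete(question)

class VectorIndex:
    """Toy document-backed index: prepends 'retrieved' context."""
    def __init__(self, llm, docs):
        self.llm = llm
        self.docs = docs

    def query(self, question: str) -> str:
        context = " ".join(self.docs)  # trivial stand-in for retrieval
        return self.llm.complete(f"Context: {context}\nQuestion: {question}")

llm = StubLLM()
prior = EmptyIndex(llm).query("What is a llama?")
grounded = VectorIndex(llm, ["Llamas are camelids."]).query("What is a llama?")
print(prior)
print(grounded)
```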
v0.4.22.post1
v0.4.22
LlamaIndex 0.4.22: a LOT of quality-of-life improvements 💪:
- Fixed an empty text split issue
- Reduced required deps in the package (s/o Ajinkya)
- Simplified UX for custom prompts
- Simplified UX for the SQL Index
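Custom prompts in this style generally boil down to a template with named slots that is filled in at query time. A minimal sketch of that pattern (the `PromptTemplate` class here is illustrative, not the library's actual API):

```python
# Minimal custom-prompt pattern: a template string with named
# placeholders, formatted with the retrieved context at query time.

class PromptTemplate:
    """Hypothetical stand-in for a library prompt class."""
    def __init__(self, template: str):
        self.template = template

    def format(self, **kwargs) -> str:
        return self.template.format(**kwargs)

qa_prompt = PromptTemplate(
    "Answer the question using only the context.\n"
    "Context: {context}\nQuestion: {question}\nAnswer:"
)
print(qa_prompt.format(context="Llamas are camelids.",
                       question="What is a llama?"))
```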
v0.4.21
v0.4.20
LlamaIndex v0.4.20
v0.4.19
Adds streaming support!
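Streaming means yielding response tokens as they are produced instead of waiting for the full completion. A minimal generator-based sketch of the idea (self-contained; not the library's actual interface):

```python
# Generator-based streaming: tokens are yielded one at a time so the
# caller can display partial output before the full response is ready.
from typing import Iterator

def stream_response(answer: str) -> Iterator[str]:
    """Simulate an LLM that streams its answer token by token."""
    for token in answer.split():
        yield token + " "

chunks = []
for chunk in stream_response("Llamas are domesticated camelids."):
    chunks.append(chunk)  # in a real app: print(chunk, end="", flush=True)
print("".join(chunks).strip())
```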
v0.4.18
LlamaIndex v0.4.18
v0.4.17
LlamaIndex v0.4.17
v0.4.16
LlamaIndex 0.4.16:
- Elastic data loader + Opensearch based vector index! (Huge s/o @jaylmiller )
- Add child branch factor for tree index embedding queries (thanks @kpister )
v0.4.15
- Biggest update: fixed composability over vector store indices
- Added @weaviate_io multi-threaded batch importing to decrease load time
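Multi-threaded batch importing splits records into fixed-size batches and uploads them concurrently, which is what cuts the load time. A self-contained sketch of the technique using `concurrent.futures` (the `upload_batch` function is a stand-in for a real Weaviate client call, not the library's API):

```python
# Multi-threaded batch import: split records into fixed-size batches
# and upload them concurrently to reduce total load time.
from concurrent.futures import ThreadPoolExecutor

def upload_batch(batch):
    """Stand-in for a real client call (e.g. a Weaviate batch insert)."""
    return len(batch)  # pretend we imported this many records

def batch_import(records, batch_size=100, max_workers=4):
    batches = [records[i:i + batch_size]
               for i in range(0, len(records), batch_size)]
    # Each batch is uploaded on its own worker thread.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return sum(pool.map(upload_batch, batches))

total = batch_import([{"id": i} for i in range(1050)], batch_size=100)
print(total)  # 1050 records imported across 11 batches
```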