
Bee Agent Framework

Open-source framework for building, deploying, and serving powerful agentic workflows at scale.

The Bee framework makes it easy to build agentic workflows with leading proprietary and open-source models. We’re working on bringing model-agnostic support to any LLM to help developers avoid model provider lock-in and embrace the latest open-source LLMs.

Key Features

  • 🤖 AI agents: Use our powerful Bee agent or build your own.
  • 🛠️ Tools: Use our built-in tools or create your own in JavaScript/Python.
  • 👩‍💻 Code interpreter: Run code safely in a sandbox container.
  • 💾 Memory: Multiple strategies to optimize token spend.
  • ⏸️ Serialization: Handle complex agentic workflows and easily pause/resume them without losing state (see the sketch after this list).
  • 🔍 Traceability: Get full visibility of your agent’s inner workings, log all running events, and use our MLflow integration (coming soon) to debug performance.
  • 🎛️ Production-level control with caching and error handling.
  • 🚧 (Coming soon) Evaluation: Run evaluation jobs with your own data source (custom CSV or Airtable).
  • 🚧 (Coming soon) Model-agnostic support: Change model providers in 1 line of code without breaking your agent’s functionality.
  • 🚧 (Coming soon) Chat UI: Serve your agent to users in a delightful GUI with built-in transparency, explainability, and user controls.
  • ... more on our Roadmap
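
For example, pausing and resuming a conversation can be as simple as serializing the agent's memory and restoring it later. The snippet below is only a minimal sketch: it assumes the serializer exposes an instance serialize() method and a static fromSerialized() counterpart, and that messages are created via BaseMessage.of, as described in the memory and serialization docs. Check those docs for the exact API in your version.

import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";
import { TokenMemory } from "bee-agent-framework/memory/tokenMemory";
import { BaseMessage } from "bee-agent-framework/llms/primitives/message";

const llm = new OllamaChatLLM();
const memory = new TokenMemory({ llm });

// Record some conversation state.
await memory.add(BaseMessage.of({ role: "user", text: "What's the weather in Las Vegas?" }));

// Pause: serialize the memory into a string that can be persisted anywhere.
const snapshot = memory.serialize();

// Resume later (even in a different process) without losing state.
// Assumption: fromSerialized() is the static counterpart of serialize().
const restored = TokenMemory.fromSerialized(snapshot);
console.log(`Restored ${restored.messages.length} message(s)`);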

Get started with Bee

Installation

npm install bee-agent-framework

or

yarn add bee-agent-framework

Example

import { BeeAgent } from "bee-agent-framework/agents/bee/agent";
import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";
import { TokenMemory } from "bee-agent-framework/memory/tokenMemory";
import { DuckDuckGoSearchTool } from "bee-agent-framework/tools/search/duckDuckGoSearch";
import { OpenMeteoTool } from "bee-agent-framework/tools/weather/openMeteo";

const llm = new OllamaChatLLM(); // default model is llama3.1 (8B); a 70B model is recommended
const agent = new BeeAgent({
  llm, // for more explore 'bee-agent-framework/adapters'
  memory: new TokenMemory({ llm }), // for more explore 'bee-agent-framework/memory'
  tools: [new DuckDuckGoSearchTool(), new OpenMeteoTool()], // for more explore 'bee-agent-framework/tools'
});

const response = await agent
  .run({ prompt: "What's the current weather in Las Vegas?" })
  .observe((emitter) => {
    emitter.on("update", async ({ data, update, meta }) => {
      console.log(`Agent (${update.key}) 🤖 : `, update.value);
    });
  });

console.log(`Agent 🤖 : `, response.result.text);
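
The comment in the example recommends a larger model. As a hedged sketch (assuming the Ollama adapter accepts a modelId option, as shown in the Ollama adapter example), switching to a 70B model would look like this:

import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";

// Assumption: the adapter accepts a `modelId` option; see the Ollama adapter example for the exact options.
const llm = new OllamaChatLLM({ modelId: "llama3.1:70b" });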

➡️ See a more advanced example.

➡️ All examples can be found in the examples directory.

Local Installation (Python Interpreter + Interactive CLI)

Note: yarn should be installed via Corepack (tutorial)

Note: To make any assets available to the local code interpreter, place them in the following directory: ./examples/tmp/local

Note: A Docker distribution with Compose support is required; the following are supported:

  • Rancher - recommended
  • Docker
  • Podman - requires Compose and a rootful machine (if your current machine is rootless, please create a new one)
  1. Clone the repository git clone git@github.com:i-am-bee/bee-agent-framework.
  2. Install dependencies yarn install.
  3. Create .env (from .env.template) and fill in missing values (if any).
  4. Start the code interpreter yarn run infra:start-code-interpreter.
  5. Start the agent yarn run start:bee (it runs ./examples/agents/bee.ts file).

🛠️ Tools

  • PythonTool: Run arbitrary Python code in the remote environment.
  • WikipediaTool: Search for data on Wikipedia.
  • DuckDuckGoTool: Search for data on DuckDuckGo.
  • LLMTool: Use an LLM to process input data.
  • DynamicTool: Construct for creating dynamic tools (see the sketch below).
  • ArXivTool: Retrieve research articles published on arXiv.
  • WebCrawlerTool: Retrieve the content of an arbitrary website.
  • CustomTool: Run your own Python function in the remote environment.
  • OpenMeteoTool: Retrieve current, past, or upcoming weather for a given destination.

➡️ Request a new tool.
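
To illustrate the DynamicTool entry above, here is a minimal sketch of a custom tool. It assumes DynamicTool and StringToolOutput are exported from bee-agent-framework/tools/base and that the constructor takes a name, description, zod inputSchema, and an async handler; the exact contract may differ, so check the tools documentation and examples.

import { DynamicTool, StringToolOutput } from "bee-agent-framework/tools/base";
import { z } from "zod";

// Assumption: DynamicTool accepts { name, description, inputSchema, handler } and the
// handler returns a StringToolOutput; consult the tools docs for the exact API.
const riddleTool = new DynamicTool({
  name: "RiddleGenerator",
  description: "Returns a short riddle for a given topic.",
  inputSchema: z.object({ topic: z.string().describe("Topic of the riddle") }),
  async handler(input) {
    const riddles: Record<string, string> = {
      time: "What flies without wings? Time.",
    };
    return new StringToolOutput(riddles[input.topic] ?? "No riddle found for that topic.");
  },
});

// The custom tool can then be passed to the agent next to the built-in ones:
// tools: [new DuckDuckGoSearchTool(), new OpenMeteoTool(), riddleTool]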

🔌️ Adapters (LLM inference providers)

  • Ollama: LLM + ChatLLM support (example)
  • LangChain: Use any LLM that LangChain supports (example)
  • WatsonX: LLM + ChatLLM support (example)
  • BAM (IBM Internal): LLM + ChatLLM support (example)

➡️ Request a new adapter.
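
Because the adapters share the framework's common chat interface, swapping providers should come down to changing the constructor line. Below is a minimal sketch, assuming the shared interface exposes generate() over BaseMessage arrays and that the output provides getTextContent() (as in the llms examples); adapter class names and options vary by provider.

import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";
import { BaseMessage } from "bee-agent-framework/llms/primitives/message";

// Swap this single line for another adapter (WatsonX, LangChain, ...) to change providers.
const llm = new OllamaChatLLM();

// Assumption: the common ChatLLM interface exposes `generate()` and its output exposes `getTextContent()`.
const output = await llm.generate([
  BaseMessage.of({ role: "user", text: "Name one desert city in Nevada." }),
]);
console.log(output.getTextContent());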

📦 Modules

The source directory (src) provides numerous modules that one can use.

  • agents: Base classes defining the common interface for agents.
  • llms: Base classes defining the common interface for text inference (standard or chat).
  • template: Prompt templating system based on Mustache with various improvements.
  • memory: Various types of memories to use with an agent.
  • tools: Tools that an agent can use.
  • cache: Preset of different caching approaches that can be used together with tools.
  • errors: Base framework error classes used by each module.
  • adapters: Concrete implementations of the given modules for different environments.
  • logger: Core component for logging all actions within the framework.
  • serializer: Core component for serializing/deserializing modules to and from a serialized format.
  • version: Constants representing the framework (e.g., the latest version).
  • internals: Modules used by other modules within the framework.

For a more in-depth explanation, see the docs.
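
As an example of the errors module in practice, the sketch below assumes FrameworkError is exported from bee-agent-framework/errors and exposes an explain() helper that renders the error with its nested causes; check the errors docs for the actual surface.

import { FrameworkError } from "bee-agent-framework/errors";

try {
  // An agent or tool run would normally go here; we throw directly for illustration.
  throw new FrameworkError("Agent run failed");
} catch (error) {
  if (error instanceof FrameworkError) {
    // Assumption: explain() pretty-prints the error together with its nested causes.
    console.error(error.explain());
  } else {
    throw error;
  }
}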

Tutorials

🚧 Coming soon 🚧

Roadmap

  • Evaluation with MLflow integration
  • JSON encoder/decoder for model-agnostic support
  • Chat Client (GUI)
  • Structured outputs
  • Improvements to base Bee agent
  • Guardrails
  • 🚧 TBD 🚧

Contribution guidelines

The Bee Agent Framework is an open-source project and we ❤️ contributions.

Feature contributions

You can get started with any ticket marked as “good first issue”.

Have an idea for a new feature? We recommend you first talk to a maintainer prior to spending a lot of time making a pull request that may not align with the project roadmap.

Bugs

We are using GitHub Issues to manage our public bugs. We keep a close eye on this, so before filing a new issue, please check to make sure it hasn't already been logged.

Code of conduct

This project and everyone participating in it are governed by the Code of Conduct. By participating, you are expected to uphold this code. Please read the full text so that you understand which actions may or may not be tolerated.

Legal notice

All content in these repositories, including code, has been provided by IBM under the associated open source software license, and IBM is under no obligation to provide enhancements, updates, or support. IBM developers produced this code as an open source project (not as an IBM product), and IBM makes no assertions as to the level of quality or security and will not be maintaining this code going forward.
