AgentFlow is a minimalist, async-first Rust framework for building, orchestrating, and managing AI agents and workflows. It is designed for rapid prototyping and production deployment of agentic, RAG, and multi-agent systems, with a focus on composability, extensibility, and language-agnostic patterns.
```mermaid
flowchart TD
    subgraph AgentFlow
        A1[Agent]
        A2[Agent]
        A3[Agent]
        WF[Workflow/Flow Engine]
        STORE[Shared Store]
    end
    User((User))
    User -- API/CLI --> WF
    WF -- manages --> A1
    WF -- manages --> A2
    WF -- manages --> A3
    A1 -- reads/writes --> STORE
    A2 -- reads/writes --> STORE
    A3 -- reads/writes --> STORE
```
- Agents: Specialized async units (LLM, RAG, tool-calling, etc.) that process tasks.
- Workflow/Flow Engine: Chains agents into flexible, configurable pipelines with conditional routing.
- Shared Store: Central, thread-safe data structure for passing state/results between agents.
Add to your Cargo.toml:
```toml
[dependencies]
agentflow = "0.1"
tokio = { version = "1", features = ["full"] }
serde_json = "1"
```

An Agent is an autonomous async decision-making unit. It wraps a node (an async function) and can be retried on failure.
```rust
use agentflow::prelude::*;
use std::collections::HashMap;

#[tokio::main]
async fn main() {
    let mut store = HashMap::new();
    store.insert("name".to_string(), serde_json::Value::String("Alice".to_string()));

    let hello_agent = create_node(|mut store: SharedStore| {
        Box::pin(async move {
            // Clone the name out of the store so the store can be mutated afterwards
            let name = store.get("name").and_then(|v| v.as_str()).unwrap_or("stranger").to_string();
            store.insert("greeting".to_string(), serde_json::Value::String(format!("Hello, {}!", name)));
            store
        })
    });

    let agent = Agent::new(hello_agent);
    let result = agent.call(store).await;
    println!("{}", result.get("greeting").unwrap());
}
```

```mermaid
flowchart LR
    InputStore[Input SharedStore] -->|call| Agent
    Agent -->|returns| OutputStore[Output SharedStore]
```
A Workflow chains agents (nodes) into a directed graph, supporting conditional routing and branching.
```rust
use agentflow::prelude::*;
use std::collections::HashMap;

#[tokio::main]
async fn main() {
    let mut workflow = Workflow::new();

    workflow.add_step("step1", create_node(|mut store| {
        Box::pin(async move {
            store.insert("step1".to_string(), "Research done".into());
            store.insert("action".to_string(), "default".into());
            store
        })
    }));

    workflow.add_step("step2", create_node(|mut store| {
        Box::pin(async move {
            store.insert("step2".to_string(), "Code generated".into());
            store.insert("action".to_string(), "default".into());
            store
        })
    }));

    workflow.connect("step1", "step2");

    let store = HashMap::new();
    let result = workflow.run(store).await;
    println!("{:?}", result);
}
```

```mermaid
flowchart LR
    Start[Start] --> Step1[Node: step1]
    Step1 --> Step2[Node: step2]
    Step2 --> End[End]
    Step1 -- reads/writes --> Store1[SharedStore]
    Step2 -- reads/writes --> Store2[SharedStore]
```
MultiAgent runs multiple agents in parallel, each responsible for a different part of a software project. Results are merged into a single shared store.
```rust
// Runs multiple agents in parallel, each responsible for a
// different part of a software project.
let mut multi_agent = MultiAgent::new();
multi_agent.add_agent(agent1);
multi_agent.add_agent(agent2);
let result = multi_agent.run(store).await;
```

```mermaid
flowchart LR
    InputStore[Input SharedStore] --> Agent1
    InputStore --> Agent2
    Agent1 --> Merge[Merge Results]
    Agent2 --> Merge
    Merge --> OutputStore[Output SharedStore]
```
RAG composes a retriever node and a generator node, passing the shared store through both.
```rust
use agentflow::prelude::*;
use std::collections::HashMap;

#[tokio::main]
async fn main() {
    let mut store = HashMap::new();
    store.insert("query".to_string(), "Rust web frameworks".into());

    let retriever = create_node(|mut store: SharedStore| {
        Box::pin(async move {
            // Clone the query out of the store so the store can be mutated afterwards
            let query = store.get("query").and_then(|v| v.as_str()).unwrap_or("").to_string();
            store.insert("context".to_string(), format!("Docs for: {}", query).into());
            store
        })
    });

    let generator = create_node(|mut store: SharedStore| {
        Box::pin(async move {
            let context = store.get("context").and_then(|v| v.as_str()).unwrap_or("").to_string();
            store.insert("response".to_string(), format!("Summary: {}", context).into());
            store
        })
    });

    let rag = Rag::new(retriever, generator);
    let result = rag.call(store).await;
    println!("{}", result.get("response").unwrap());
}
```

```mermaid
flowchart LR
    InputStore[Input SharedStore] --> Retriever
    Retriever --> Generator
    Generator --> OutputStore[Output SharedStore]
```
MapReduce batch processes documents, summarizes each, and aggregates results.
```rust
// Batch process documents, summarize each, and aggregate results.
let map_reduce = MapReduce::new(batch_mapper, reducer);
let result = map_reduce.run(inputs).await;
```

```mermaid
flowchart LR
    InputBatch[Vec<SharedStore>] --> Mapper
    Mapper --> MappedBatch[Vec<SharedStore>]
    MappedBatch --> Reducer
    Reducer --> OutputStore[SharedStore]
```
- Add new agent types: implement the `Node` trait.
- Compose custom workflows: use the `Workflow` or `Flow` API to chain, branch, or parallelize agents.
- Integrate external tools: wrap API calls as nodes/agents.
- Composable: Build complex systems from simple, reusable async parts.
- Async-first: Designed for async/await and concurrent execution.
- Minimalist: Focus on core abstractions, not vendor lock-in.
- See `/examples/` for more Rust usage.
- See `/src/patterns/` for built-in patterns.
- See `/src/core/` for core abstractions.
- See `/src/utils/` for integration stubs (LLM, search, embeddings, etc.).
MIT OR Apache-2.0
Happy agentic building!
AgentFlow is a minimalist, async-first Rust framework for building, orchestrating, and managing AI agents and workflows. It is designed for composability, extensibility, and real-world LLM applications.
- Agent: Autonomous async decision-making unit with retry logic.
- Workflow: Chain of agents with conditional routing and branching.
- MultiAgent: Parallel or coordinated agent execution.
- RAG: Retrieval-Augmented Generation (retriever + generator).
- MapReduce: Batch map and reduce over data.
- Composable: Build complex systems from simple, reusable async parts.
- Async-first: Designed for async/await and concurrent execution.
Add to your Cargo.toml:
```toml
[dependencies]
agentflow = "0.1"
rig = "0.1"
```

```rust
use agentflow::prelude::*;
use rig::prelude::*;
use serde_json::Value;
use std::collections::HashMap;
use std::sync::{Arc, Mutex};

#[tokio::main]
async fn main() {
    // Create a simple agent node
    let agent_node = create_node(|store: SharedStore| {
        Box::pin(async move {
            let prompt = store.lock().unwrap()
                .get("prompt")
                .and_then(|v| v.as_str())
                .unwrap_or("")
                .to_string();
            // ... LLM call here, using `prompt` ...
            store.lock().unwrap().insert("response".to_string(), Value::String("Hello!".to_string()));
            store
        })
    });

    // Wrap the node with retry logic (3 retries, 1000 ms delay)
    let agent = Agent::with_retry(agent_node, 3, 1000);

    let store = Arc::new(Mutex::new(HashMap::new()));
    store.lock().unwrap().insert("prompt".to_string(), Value::String("Say hello!".to_string()));

    let result = agent.decide(store).await;
    println!("{:?}", result);
}
```

```mermaid
flowchart LR
    InputStore -->|call| Agent
    Agent -->|decide| OutputStore
```

```mermaid
flowchart LR
    Start[Start] --> Step1[Node 1]
    Step1 --> Step2[Node 2]
    Step2 --> End[End]
```
```rust
// Runs multiple agents in parallel, each responsible for a
// different part of a software project.
let mut multi_agent = MultiAgent::new();
multi_agent.add_agent(agent1);
multi_agent.add_agent(agent2);
let result = multi_agent.run(store).await;
```

```rust
// Retrieval-Augmented Generation pipeline.
let rag = Rag::new(retriever, generator);
let result = rag.call(store).await;
```

```rust
// Batch process documents, summarize each, and aggregate results.
let map_reduce = MapReduce::new(batch_mapper, reducer);
let result = map_reduce.run(inputs).await;
```

All examples are in the `examples/` directory.
You must have Rust and Cargo installed.
Some examples require API keys for LLM providers (e.g., OpenAI, Gemini) and the rig-core crate.
Set your environment variables as needed for your LLM provider(s):
```shell
export OPENAI_API_KEY=sk-...
export GEMINI_API_KEY=...
```

From the project root:

```shell
cargo build --all
```

If you want to run examples that use the rig-core crate, ensure you have network access and the correct API keys.
You can run any example using Cargo. From the project root, use:

```shell
cargo run --example <example-name>
```

For instance:

```shell
cargo run --example agent
cargo run --example async-agent
cargo run --example workflow
cargo run --example rag
cargo run --example multi-agent
cargo run --example mapreduce
cargo run --example orchestrator-multi-agent
cargo run --example structured-output
```

Or, to see a list of all available examples, run `cargo run --example` with no example name.

- `agent`: Run a single LLM-powered agent with retry logic.
- `async-agent`: Run two agents concurrently (async/parallel).
- `workflow`: Multi-step workflow with human-in-the-loop (HITL) at each step.
- `rag`: Retrieval-Augmented Generation: retrieve context, then generate an answer.
- `multi-agent`: Run multiple agents in parallel, merging results into a shared store.
- `mapreduce`: Batch process documents, summarize each, and aggregate results.
- `orchestrator-multi-agent`: Orchestrator agent coordinates a multi-phase, multi-role workflow.
- `structured-output`: Multi-agent, interactive TUI pipeline for research, summarization, and critique, with structured output.

MIT OR Apache-2.0