Unite and orchestrate AI agents - A production-ready ADK for building agentic AI applications in Node.js.
AgentSea ADK unites AI agents and services to create powerful, intelligent applications and integrations.
- Multi-Provider Support - Anthropic Claude, OpenAI GPT, Google Gemini
- Per-Model Type Safety - Compile-time validation of model-specific options
- Local & Open Source Models - Ollama, LM Studio, LocalAI, Text Generation WebUI, vLLM
- Voice Support (TTS/STT) - OpenAI Whisper, ElevenLabs, Piper TTS, Local Whisper
- MCP Protocol - First-class Model Context Protocol integration
- ACP Protocol - Agentic Commerce Protocol for e-commerce integration
- Multi-Agent Crews - Role-based coordination with delegation strategies
- Conversation Schemas - Structured conversational experiences with validation
- Advanced Memory - Episodic, semantic, and working memory with multi-agent sharing
- Built-in Tools - 8 production-ready tools + custom tool support
- Guardrails - Content safety, prompt injection detection, PII filtering, and validation
- LLM Evaluation - Metrics, LLM-as-Judge, human feedback, and continuous monitoring
- LLM Gateway - OpenAI-compatible API with intelligent routing, caching, and cost optimization
- Embeddings - Multi-provider embeddings with caching and quality metrics
- Structured Output - TypeScript-native Zod schema enforcement for LLM responses
- Document Ingestion - Flexible pipeline with parsers, chunkers, and transformers
- Browser Automation - Web agents with Playwright, Puppeteer, and native backends
- Full Observability - Logging, metrics, distributed tracing, and cost tracking
- NestJS Integration - Decorators, modules, and dependency injection
- REST API & Streaming - HTTP endpoints, SSE streaming, WebSocket support
- Production Ready - Rate limiting, caching, error handling, retries
- TypeScript - Fully typed with comprehensive definitions
```bash
# Core package (framework-agnostic)
pnpm add @lov3kaizen/agentsea-core

# NestJS integration
pnpm add @lov3kaizen/agentsea-nestjs
```

```typescript
import {
  Agent,
  AnthropicProvider,
  ToolRegistry,
  BufferMemory,
  calculatorTool,
} from '@lov3kaizen/agentsea-core';

// Create agent
const agent = new Agent(
  {
    name: 'assistant',
    model: 'claude-sonnet-4-20250514',
    provider: 'anthropic',
    systemPrompt: 'You are a helpful assistant.',
    tools: [calculatorTool],
  },
  new AnthropicProvider(process.env.ANTHROPIC_API_KEY),
  new ToolRegistry(),
  new BufferMemory(50),
);

// Execute
const response = await agent.execute('What is 42 * 58?', {
  conversationId: 'user-123',
  sessionData: {},
  history: [],
});
console.log(response.content);
```

```typescript
import {
  Agent,
  GeminiProvider,
  OpenAIProvider,
  AnthropicProvider,
  OllamaProvider,
  LMStudioProvider,
  LocalAIProvider,
} from '@lov3kaizen/agentsea-core';

// Use Gemini
const geminiAgent = new Agent(
  { model: 'gemini-pro', provider: 'gemini' },
  new GeminiProvider(process.env.GEMINI_API_KEY),
  toolRegistry,
);

// Use OpenAI
const openaiAgent = new Agent(
  { model: 'gpt-4-turbo-preview', provider: 'openai' },
  new OpenAIProvider(process.env.OPENAI_API_KEY),
  toolRegistry,
);

// Use Anthropic
const claudeAgent = new Agent(
  { model: 'claude-sonnet-4-20250514', provider: 'anthropic' },
  new AnthropicProvider(process.env.ANTHROPIC_API_KEY),
  toolRegistry,
);

// Use Ollama (local)
const ollamaAgent = new Agent(
  { model: 'llama2', provider: 'ollama' },
  new OllamaProvider(),
  toolRegistry,
);

// Use LM Studio (local)
const lmstudioAgent = new Agent(
  { model: 'local-model', provider: 'openai-compatible' },
  new LMStudioProvider(),
  toolRegistry,
);

// Use LocalAI (local)
const localaiAgent = new Agent(
  { model: 'gpt-3.5-turbo', provider: 'openai-compatible' },
  new LocalAIProvider(),
  toolRegistry,
);
```

Get compile-time validation for model-specific options. Inspired by TanStack AI:
```typescript
import { anthropic, openai, createProvider } from '@lov3kaizen/agentsea-core';

// ✅ Valid: Claude 3.5 Sonnet supports tools, system prompts, and extended thinking
const claudeConfig = anthropic('claude-3-5-sonnet-20241022', {
  tools: [myTool],
  systemPrompt: 'You are a helpful assistant',
  thinking: { type: 'enabled', budgetTokens: 10000 },
  temperature: 0.7,
});

// ✅ Valid: o1 supports tools but NOT system prompts
const o1Config = openai('o1', {
  tools: [myTool],
  reasoningEffort: 'high',
  // systemPrompt: '...' // ❌ TypeScript error - o1 doesn't support system prompts
});

// ❌ TypeScript error: o1-mini doesn't support tools
const o1MiniConfig = openai('o1-mini', {
  // tools: [myTool], // Error: 'tools' does not exist in type
  reasoningEffort: 'medium',
});

// Create type-safe providers
const provider = createProvider(claudeConfig);
console.log('Supports vision:', provider.supportsCapability('vision')); // true
```

Key Benefits:
- Zero runtime overhead - All validation at compile time
- IDE autocomplete - Only valid options appear per model
- Model capability registry - Query what each model supports
See full per-model type safety documentation →
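The guarantees above come from encoding each model's capabilities at the type level and mirroring them in a runtime registry. A dependency-free sketch of the idea (the names and the capability table are illustrative, not the ADK's actual registry):

```typescript
// Illustrative model set -- the real registry covers many more models.
type ModelId = 'claude-3-5-sonnet-20241022' | 'o1' | 'o1-mini';

// Type-level capability table: literal true/false per option.
interface ModelCapabilities {
  'claude-3-5-sonnet-20241022': { tools: true; systemPrompt: true; vision: true };
  'o1': { tools: true; systemPrompt: false; vision: false };
  'o1-mini': { tools: false; systemPrompt: false; vision: false };
}

// Keep only the option keys a model actually supports.
type SupportedOptions<M extends ModelId> = {
  [K in keyof ModelCapabilities[M] as ModelCapabilities[M][K] extends true
    ? K
    : never]?: unknown;
};

const o1Opts: SupportedOptions<'o1'> = { tools: [] };
// const bad: SupportedOptions<'o1'> = { systemPrompt: 'x' }; // compile error

// Runtime mirror of the same table, usable for capability queries.
const capabilities: Record<ModelId, Record<string, boolean>> = {
  'claude-3-5-sonnet-20241022': { tools: true, systemPrompt: true, vision: true },
  'o1': { tools: true, systemPrompt: false, vision: false },
  'o1-mini': { tools: false, systemPrompt: false, vision: false },
};

function supportsCapability(model: ModelId, cap: string): boolean {
  return capabilities[model]?.[cap] ?? false;
}

console.log(supportsCapability('o1', 'tools'));      // true
console.log(supportsCapability('o1-mini', 'tools')); // false
```

Because the table lives in types, invalid options fail at compile time with zero runtime cost; the runtime mirror only exists for dynamic queries like `supportsCapability`.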
Run AI models on your own hardware with complete privacy:
```typescript
import { Agent, OllamaProvider } from '@lov3kaizen/agentsea-core';

// Create Ollama provider
const provider = new OllamaProvider({
  baseUrl: 'http://localhost:11434',
});

// Pull a model (if not already available)
await provider.pullModel('llama2');

// List available models
const models = await provider.listModels();
console.log('Available models:', models);

// Create agent with local model
const agent = new Agent({
  name: 'local-assistant',
  description: 'AI assistant running locally',
  model: 'llama2',
  provider: 'ollama',
  systemPrompt: 'You are a helpful assistant.',
});
agent.registerProvider('ollama', provider);

// Use the agent
const response = await agent.execute('Hello!', {
  conversationId: 'conv-1',
  sessionData: {},
  history: [],
});
```

Supported local providers:
- Ollama - Easy local LLM execution
- LM Studio - User-friendly GUI for local models
- LocalAI - OpenAI-compatible local API
- Text Generation WebUI - Feature-rich web interface
- vLLM - High-performance inference engine
- Any OpenAI-compatible endpoint
See full local models documentation →
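What makes the "any OpenAI-compatible endpoint" option work is that all of these servers accept the same chat-completions wire format, so one request builder covers them all. A sketch, assuming Ollama's default port (adjust host and port for your server):

```typescript
// Minimal chat-completions payload builder for OpenAI-compatible local
// servers. The base URL is an assumption: Ollama defaults to 11434,
// LM Studio to 1234, LocalAI and vLLM to whatever you configure.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

function buildChatRequest(model: string, messages: ChatMessage[], baseUrl: string) {
  return {
    url: `${baseUrl}/v1/chat/completions`, // standard OpenAI-compatible path
    method: 'POST' as const,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model, messages, stream: false }),
  };
}

const req = buildChatRequest(
  'llama2',
  [{ role: 'user', content: 'Hello!' }],
  'http://localhost:11434',
);
console.log(req.url); // http://localhost:11434/v1/chat/completions
```

Pointing the same builder at a different `baseUrl` is all it takes to switch between local backends, which is why the ADK can treat them as one `openai-compatible` provider family.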
Add voice interaction with Text-to-Speech and Speech-to-Text:
```typescript
import {
  Agent,
  AnthropicProvider,
  ToolRegistry,
  VoiceAgent,
  OpenAIWhisperProvider,
  OpenAITTSProvider,
} from '@lov3kaizen/agentsea-core';

// Create base agent
const provider = new AnthropicProvider(process.env.ANTHROPIC_API_KEY);
const toolRegistry = new ToolRegistry();
const agent = new Agent(
  {
    name: 'voice-assistant',
    model: 'claude-sonnet-4-20250514',
    provider: 'anthropic',
    systemPrompt: 'You are a helpful voice assistant.',
    description: 'Voice assistant',
  },
  provider,
  toolRegistry,
);

// Create voice agent with STT and TTS
const sttProvider = new OpenAIWhisperProvider(process.env.OPENAI_API_KEY);
const ttsProvider = new OpenAITTSProvider(process.env.OPENAI_API_KEY);
const voiceAgent = new VoiceAgent(agent, {
  sttProvider,
  ttsProvider,
  ttsConfig: { voice: 'nova' },
});

// Process voice input
const result = await voiceAgent.processVoice(audioBuffer, context);
console.log('User said:', result.text);
console.log('Assistant response:', result.response.content);

// Save audio response
fs.writeFileSync('./response.mp3', result.audio!);
```

Supported providers:
- STT: OpenAI Whisper, Local Whisper
- TTS: OpenAI TTS, ElevenLabs, Piper TTS
See full voice documentation →
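Conceptually, a voice round trip is three composed async steps: transcribe, respond, synthesize. A sketch with mock providers (the interfaces here are illustrative stand-ins, not the ADK's `VoiceAgent` types):

```typescript
// Illustrative provider interfaces -- stand-ins, not the ADK's actual types.
interface STT {
  transcribe(audio: Uint8Array): Promise<string>;
}
interface TTS {
  synthesize(text: string): Promise<Uint8Array>;
}
type Respond = (text: string) => Promise<string>;

// The round trip: speech -> text -> agent reply -> speech.
async function processVoice(audio: Uint8Array, stt: STT, respond: Respond, tts: TTS) {
  const text = await stt.transcribe(audio);   // speech to text
  const reply = await respond(text);          // agent produces a reply
  const speech = await tts.synthesize(reply); // reply back to audio
  return { text, reply, speech };
}

// Mock providers so the flow runs without audio hardware or API keys.
const stt: STT = { transcribe: async () => 'What time is it?' };
const tts: TTS = { synthesize: async (t) => new TextEncoder().encode(t) };
const respond: Respond = async (t) => `You asked: "${t}"`;

processVoice(new Uint8Array(), stt, respond, tts).then((r) => {
  console.log(r.text, '->', r.reply);
});
```

Swapping in real Whisper/ElevenLabs providers changes only the two interface implementations; the pipeline shape stays the same.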
```typescript
import { MCPRegistry } from '@lov3kaizen/agentsea-core';

// Connect to MCP servers
const mcpRegistry = new MCPRegistry();
await mcpRegistry.addServer({
  name: 'filesystem',
  command: 'npx',
  args: ['-y', '@modelcontextprotocol/server-filesystem', '/tmp'],
  transport: 'stdio',
});

// Get MCP tools (automatically converted)
const mcpTools = mcpRegistry.getTools();

// Use with agent
const agent = new Agent({ tools: mcpTools }, provider, toolRegistry);
```

Add e-commerce capabilities to your agents with the Agentic Commerce Protocol:
```typescript
import { ACPClient, createACPTools, Agent } from '@lov3kaizen/agentsea-core';

// Setup ACP client
const acpClient = new ACPClient({
  baseUrl: 'https://api.yourcommerce.com/v1',
  apiKey: process.env.ACP_API_KEY,
  merchantId: process.env.ACP_MERCHANT_ID,
});

// Create commerce tools
const acpTools = createACPTools(acpClient);

// Create shopping agent
const shoppingAgent = new Agent(
  {
    name: 'shopping-assistant',
    model: 'claude-sonnet-4-20250514',
    provider: 'anthropic',
    systemPrompt: 'You are a helpful shopping assistant.',
    tools: acpTools, // Includes 14 commerce tools
  },
  provider,
  toolRegistry,
);

// Start shopping
const response = await shoppingAgent.execute(
  'I need wireless headphones under $100',
  context,
);
```

Available Commerce Operations:
- Product search and discovery
- Shopping cart management
- Checkout and payment processing
- Delegated payments (Stripe, PayPal, etc.)
- Order tracking and management
See full ACP documentation →
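To make the tool-based design concrete, here is a hypothetical sketch of what one ACP-style commerce tool could look like to an agent: a name, a parameter spec, and an execute function that delegates to the client. The shapes and the in-memory client are illustrative, not the actual `createACPTools` output:

```typescript
// Illustrative client interface -- a stand-in for a real commerce backend.
interface CommerceClient {
  searchProducts(
    query: string,
    maxPrice?: number,
  ): Promise<{ name: string; price: number }[]>;
}

// One hypothetical tool in the shape an LLM tool-calling loop expects:
// name + JSON-schema-style parameters + an execute function.
function createSearchTool(client: CommerceClient) {
  return {
    name: 'search_products',
    description: 'Search the catalog by keyword with an optional price cap',
    parameters: {
      type: 'object',
      properties: {
        query: { type: 'string' },
        maxPrice: { type: 'number' },
      },
      required: ['query'],
    },
    execute: async (args: { query: string; maxPrice?: number }) =>
      client.searchProducts(args.query, args.maxPrice),
  };
}

// In-memory client so the sketch runs without a commerce backend.
const client: CommerceClient = {
  searchProducts: async (q, max) =>
    [{ name: 'Wireless headphones', price: 79 }].filter(
      (p) =>
        p.name.toLowerCase().includes(q.toLowerCase()) &&
        (max === undefined || p.price <= max),
    ),
};

createSearchTool(client)
  .execute({ query: 'headphones', maxPrice: 100 })
  .then((r) => console.log(r));
```

The agent never sees the HTTP layer: it only sees the tool's name, description, and parameter schema, and the execute function does the rest.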
```typescript
import { ConversationSchema } from '@lov3kaizen/agentsea-core';
import { z } from 'zod';

const schema = new ConversationSchema({
  name: 'booking',
  startStep: 'destination',
  steps: [
    {
      id: 'destination',
      prompt: 'Where would you like to go?',
      schema: z.object({ city: z.string() }),
      next: 'dates',
    },
    {
      id: 'dates',
      prompt: 'What dates?',
      schema: z.object({
        checkIn: z.string(),
        checkOut: z.string(),
      }),
      next: 'confirm',
    },
  ],
});
```

```bash
# Install CLI globally
npm install -g @lov3kaizen/agentsea-cli

# Initialize configuration
sea init

# Start chatting
sea chat

# Run an agent
sea agent run default "What is the capital of France?"

# Manage models (Ollama)
sea model pull llama2
sea model list
```

```typescript
import { Module } from '@nestjs/common';
import { AgenticModule } from '@lov3kaizen/agentsea-nestjs';
import { AnthropicProvider } from '@lov3kaizen/agentsea-core';

@Module({
  imports: [
    AgenticModule.forRoot({
      provider: new AnthropicProvider(),
      defaultConfig: {
        model: 'claude-sonnet-4-20250514',
        provider: 'anthropic',
      },
      enableRestApi: true, // Enable REST API endpoints
      enableWebSocket: true, // Enable WebSocket gateway
    }),
  ],
})
export class AppModule {}
```

REST API Endpoints:
- `GET /agents` - List all agents
- `GET /agents/:name` - Get agent details
- `POST /agents/:name/execute` - Execute agent
- `POST /agents/:name/stream` - Stream agent response (SSE)
WebSocket Events:
- `execute` - Execute an agent
- `stream` - Real-time streaming events
- `listAgents` - Get available agents
- `getAgent` - Get agent info
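The streaming endpoint emits Server-Sent Events, i.e. `data:` lines with events separated by blank lines. A minimal client-side parser sketch (the `delta` payload shape is an assumption about the stream format, not a documented contract):

```typescript
// Parse an SSE payload into its data fields. Events are separated by a
// blank line; each event's payload is carried on "data:" lines.
function parseSSE(payload: string): string[] {
  return payload
    .split('\n\n') // one chunk per event
    .map((block) =>
      block
        .split('\n')
        .filter((line) => line.startsWith('data:'))
        .map((line) => line.slice(5).trim()) // drop the "data:" prefix
        .join('\n'),
    )
    .filter((data) => data.length > 0);
}

const chunks = parseSSE('data: {"delta":"Hel"}\n\ndata: {"delta":"lo"}\n\n');
console.log(chunks.length); // 2
```

In the browser, `EventSource` does this parsing for you; the sketch is mainly useful when consuming the stream with `fetch` and a reader.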
- @lov3kaizen/agentsea-core - Framework-agnostic core library
- @lov3kaizen/agentsea-types - Shared TypeScript type definitions
- @lov3kaizen/agentsea-nestjs - NestJS integration with decorators
- @lov3kaizen/agentsea-cli - Command-line interface
- @lov3kaizen/agentsea-crews - Multi-agent orchestration with role-based coordination
- @lov3kaizen/agentsea-gateway - High-performance LLM gateway with routing, caching, and cost optimization
- @lov3kaizen/agentsea-memory - Advanced memory with semantic retrieval and multi-agent support
- @lov3kaizen/agentsea-embeddings - Embedding providers with caching and quality metrics
- @lov3kaizen/agentsea-cache - Intelligent caching strategies for LLM responses
- @lov3kaizen/agentsea-structured - TypeScript-native structured output with Zod schema enforcement
- @lov3kaizen/agentsea-ingest - Document ingestion pipeline with parsers and chunkers
- @lov3kaizen/agentsea-prompts - Prompt management and templating
- @lov3kaizen/agentsea-guardrails - Content safety, prompt injection detection, and validation
- @lov3kaizen/agentsea-evaluate - LLM evaluation, human feedback, and continuous monitoring
- @lov3kaizen/agentsea-redteam - Red teaming and adversarial testing for AI systems
- @lov3kaizen/agentsea-analytics - Usage analytics and insights
- @lov3kaizen/agentsea-costs - Cost tracking and optimization
- @lov3kaizen/agentsea-debugger - Debugging and tracing tools
- @lov3kaizen/agentsea-surf - Browser automation for web agents (Puppeteer/Playwright)
- @lov3kaizen/agentsea-react - React components for agent interfaces
- @lov3kaizen/agentsea-admin-ui - Admin dashboard for monitoring agents
- examples - Example applications
AgentSea follows a clean, layered architecture:
```
┌─────────────────────────────────────────┐
│           Application Layer             │
│   (Your NestJS/Node.js Application)     │
└─────────────────────────────────────────┘
                    │
┌─────────────────────────────────────────┐
│           AgentSea ADK Layer            │
│  ┌───────────────────────────────────┐  │
│  │    Multi-Agent Orchestration      │  │
│  └───────────────────────────────────┘  │
│  ┌───────────────────────────────────┐  │
│  │    Conversation Management        │  │
│  └───────────────────────────────────┘  │
│  ┌───────────────────────────────────┐  │
│  │    Agent Runtime & Tools          │  │
│  └───────────────────────────────────┘  │
│  ┌───────────────────────────────────┐  │
│  │    Multi-Provider Adapters        │  │
│  │    (Claude, GPT, Gemini, MCP)     │  │
│  └───────────────────────────────────┘  │
│  ┌───────────────────────────────────┐  │
│  │    Observability & Utils          │  │
│  └───────────────────────────────────┘  │
└─────────────────────────────────────────┘
                    │
┌─────────────────────────────────────────┐
│          Infrastructure Layer           │
│    (LLM APIs, Storage, Monitoring)      │
└─────────────────────────────────────────┘
```
- Agents - Autonomous AI entities that can reason, use tools, and maintain conversation context.
- Crews - Multi-agent teams with defined roles, delegation strategies, and coordinated task execution.
- Tools - Functions that agents can call to perform specific tasks (API calls, calculations, etc.).
- Memory - Hierarchical memory system with episodic, semantic, and working memory structures; supports multi-agent sharing with access control.
- Guardrails - Input validation, output filtering, and safety checks to ensure responsible AI behavior.
- Evaluation - LLM-as-Judge, human feedback collection, and continuous monitoring for quality assurance.
- Gateway - OpenAI-compatible API gateway with intelligent routing, load balancing, and fallback handling.
- MCP - Model Context Protocol integration for seamless tool and resource access.
- Conversation Schemas - Structured conversation flows with validation and dynamic routing.
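The simplest of these memory tiers is a bounded conversation buffer, the role `BufferMemory(50)` plays in the quickstart. An illustrative sketch of that behavior (not the ADK's implementation):

```typescript
// A bounded buffer: keeps at most `capacity` recent turns, evicting the
// oldest. Illustrative stand-in for a BufferMemory-style store.
interface Turn {
  role: 'user' | 'assistant';
  content: string;
}

class BoundedBuffer {
  private turns: Turn[] = [];
  constructor(private readonly capacity: number) {}

  add(turn: Turn): void {
    this.turns.push(turn);
    if (this.turns.length > this.capacity) this.turns.shift(); // evict oldest
  }

  history(): Turn[] {
    return [...this.turns]; // defensive copy
  }
}

const mem = new BoundedBuffer(2);
mem.add({ role: 'user', content: 'a' });
mem.add({ role: 'assistant', content: 'b' });
mem.add({ role: 'user', content: 'c' });
console.log(mem.history().map((t) => t.content)); // ['b', 'c']
```

Episodic and semantic tiers build on the same idea but replace "most recent" with time- or similarity-based retrieval over a persistent store.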
Full documentation available at agentsea.dev
- Structured - Zod Schema Enforcement
- Guardrails - Safety & Validation
- Gateway - LLM Gateway
- Ingest - Document Ingestion
- Crews - Multi-Agent Orchestration
- Memory - Advanced Memory Systems
- Embeddings - Embedding Providers
- Evaluate - LLM Evaluation
- Surf - Browser Automation
- React - UI Components
- MCP Integration
- Local Models & Open Source
- Voice Features (TTS/STT)
- Provider Reference
- NestJS Integration
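The structured-output flow referenced above boils down to: request JSON from the model, parse it, validate against a schema, and reject on mismatch. The real packages enforce this with Zod; this dependency-free sketch uses a hand-written type guard to show the same flow:

```typescript
// Target shape for the model's output. Illustrative example type.
interface Booking {
  city: string;
  nights: number;
}

// Runtime type guard standing in for a Zod schema's .parse().
function isBooking(v: unknown): v is Booking {
  return (
    typeof v === 'object' &&
    v !== null &&
    typeof (v as Booking).city === 'string' &&
    typeof (v as Booking).nights === 'number'
  );
}

// Parse raw LLM text and enforce the schema; callers can catch and retry.
function parseStructured(raw: string): Booking {
  const parsed: unknown = JSON.parse(raw);
  if (!isBooking(parsed)) throw new Error('LLM output failed schema validation');
  return parsed;
}

console.log(parseStructured('{"city":"Paris","nights":3}').city); // Paris
```

With Zod the guard and the error messages come for free, and the inferred TypeScript type stays in sync with the schema, which is the point of the `agentsea-structured` package.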
```bash
# Install dependencies
pnpm install

# Build all packages
pnpm build

# Run tests
pnpm test

# Run tests with coverage
pnpm test:cov

# Development mode (watch)
pnpm dev

# Lint
pnpm lint

# Type check
pnpm type-check
```

- Multi-provider support (Claude, GPT, Gemini)
- Local & open source model support (Ollama, LM Studio, LocalAI, etc.)
- Voice support (TTS/STT) with multiple providers
- Command-line interface (CLI)
- MCP protocol integration
- Multi-agent crews with role-based coordination
- Delegation strategies (round-robin, best-match, auction, hierarchical)
- Conversation schema system
- Advanced memory stores (Buffer, Redis, PostgreSQL, SQLite, Pinecone)
- Memory structures (Episodic, Semantic, Working)
- Multi-agent memory sharing with access control
- LLM Gateway with OpenAI-compatible API, caching, and cost optimization
- Intelligent routing (round-robin, least-latency, cost-based)
- Structured output with Zod schema enforcement
- Document ingestion pipeline with parsers and chunkers
- Guardrails for content safety, prompt injection, and PII detection
- Content filtering and validation
- LLM evaluation metrics and LLM-as-Judge
- Human feedback collection and preference learning
- Continuous evaluation monitoring
- Red teaming and adversarial testing
- Embeddings with multi-provider support
- Browser automation (Playwright, Puppeteer, native)
- Built-in tools (8 tools + custom support)
- Observability (logging, metrics, tracing)
- Cost tracking and analytics
- NestJS integration for all packages
- React components for agent interfaces
- Rate limiting and caching
- Comprehensive test suite
- TypeScript definitions with strict type safety
- CI/CD workflows with automated releases
- Admin UI dashboard improvements
- Additional MCP tools/servers
- Enhanced computer use agent capabilities
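The routing strategies listed above (round-robin, least-latency, cost-based) reduce to selection functions over per-provider statistics. An illustrative sketch (the shapes and numbers are assumptions, not the gateway's API):

```typescript
// Per-provider stats the router selects over. Illustrative shape.
interface ProviderStats {
  name: string;
  avgLatencyMs: number;
  costPer1kTokens: number;
}

// Least-latency strategy: pick the provider with the lowest average latency.
function leastLatency(providers: ProviderStats[]): ProviderStats {
  return providers.reduce((best, p) => (p.avgLatencyMs < best.avgLatencyMs ? p : best));
}

// Cost-based strategy: pick the cheapest provider per 1k tokens.
function cheapest(providers: ProviderStats[]): ProviderStats {
  return providers.reduce((best, p) => (p.costPer1kTokens < best.costPer1kTokens ? p : best));
}

// Hypothetical numbers for illustration only.
const stats: ProviderStats[] = [
  { name: 'anthropic', avgLatencyMs: 850, costPer1kTokens: 3.0 },
  { name: 'openai', avgLatencyMs: 700, costPer1kTokens: 2.5 },
  { name: 'ollama', avgLatencyMs: 1200, costPer1kTokens: 0 },
];

console.log(leastLatency(stats).name); // openai
console.log(cheapest(stats).name); // ollama
```

A real gateway layers fallback on top: if the selected provider errors or times out, re-run the selection over the remaining candidates.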
We welcome contributions! Please see CONTRIBUTING.md for guidelines.
- Discussions - Ask questions and share ideas
- Issues - Report bugs and request features
- Documentation - Read the full documentation
MIT License - see LICENSE for details
Built with ❤️ by lovekaizen
Special thanks to:
Website • Documentation • Examples • API Reference
Made with TypeScript and AI