A lightweight AI agent framework inspired by openclaw & nanobot, implemented with TypeScript + Node.js
- Lightweight: Minimal dependencies, fast startup
- TypeScript: Full type safety
- Modular: Easy to extend and customize
- Session Management: Built-in session handling
- Tool Integration: Simple tool registry system
- WebSocket Support: Real-time communication
- Multi-Model Support: Support for Ollama, MiniMax, and OpenAI-compatible APIs
- Memory Persistence: Context-aware conversations with persistent memory
- Feishu Integration: Real-time communication with Feishu (Lark) messaging platform using official SDK
- Node.js >= 20.0.0
- Ollama (optional, required only for running local models)
```bash
# Using pnpm
pnpm add microbot

# Using npm
npm install microbot

# Using yarn
yarn add microbot
```

To use the `microbot` CLI command globally:

```bash
npm install -g .
# or
pnpm link --global
```

```bash
# Start MicroBot
microbot start

# Check status
microbot status

# Help
microbot --help
```

Copy the default configuration:

```bash
cp .env.example .env
```

MicroBot supports multiple AI model providers through a unified interface:
```env
# Model Type: ollama, minimax, or openai
MODEL_TYPE=ollama

# For Ollama (Local Models)
OLLAMA_HOST=localhost
OLLAMA_PORT=11434
OLLAMA_PROTOCOL=http
OLLAMA_MODEL=qwen3-vl:8b

# For MiniMax (Cloud Models)
MODEL_NAME=MiniMax-M2.1
MODEL_API_KEY=your_minimax_api_key
MODEL_BASE_URL=https://api.minimaxi.com/v1

# For OpenAI-Compatible APIs
MODEL_NAME=gpt-4
MODEL_API_KEY=your_openai_api_key
MODEL_BASE_URL=https://api.openai.com/v1
```

| Model Type | Description | Required Config |
|---|---|---|
| `ollama` | Local models via Ollama | `OLLAMA_HOST`, `OLLAMA_PORT`, `OLLAMA_MODEL` |
| `minimax` | MiniMax cloud models | `MODEL_NAME`, `MODEL_API_KEY`, `MODEL_BASE_URL` |
| `openai` | OpenAI-compatible APIs | `MODEL_NAME`, `MODEL_API_KEY`, `MODEL_BASE_URL` |
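To make the mapping concrete, here is an illustrative sketch (not part of the microbot API) of how the environment variables above could be turned into a client configuration. The `ModelConfig` shape and `configFromEnv` helper are assumptions for illustration; the option names mirror the `ModelFactory` examples later in this README.

```typescript
// Illustrative only: map MODEL_TYPE and friends to a client config.
// ModelConfig and configFromEnv are hypothetical helpers, not microbot exports.
type ModelConfig =
  | { type: 'ollama'; host: string; port: number; model: string }
  | { type: 'minimax' | 'openai'; model: string; apiKey: string; baseUrl: string };

function configFromEnv(env: Record<string, string | undefined>): ModelConfig {
  const type = env.MODEL_TYPE;
  switch (type) {
    case 'ollama':
      // Local model: host/port/model, with the defaults shown above
      return {
        type,
        host: env.OLLAMA_HOST ?? 'localhost',
        port: Number(env.OLLAMA_PORT ?? 11434),
        model: env.OLLAMA_MODEL ?? 'qwen3-vl:8b',
      };
    case 'minimax':
    case 'openai':
      // Cloud providers share the same three variables
      return {
        type,
        model: env.MODEL_NAME ?? '',
        apiKey: env.MODEL_API_KEY ?? '',
        baseUrl: env.MODEL_BASE_URL ?? '',
      };
    default:
      throw new Error(`Unknown MODEL_TYPE: ${type}`);
  }
}
```

Switching providers is then a matter of changing `MODEL_TYPE` in `.env`, with no code changes.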
- Feishu Developer Account: You need to be a Feishu (Lark) developer
- Feishu App: Create an app in the Feishu Developer Console
- Internet Access: Your server needs outbound internet access to connect to Feishu WebSocket
1. Create Feishu App
   - Go to the Feishu Developer Console
   - Create a new app ("From Scratch" or "Enterprise Self-built App")
   - Get your `App ID` and `App Secret` from the "Credentials" section
2. Configure App Features
   - Enable the "Bot" feature in the app settings
   - Enable "Event Subscriptions" for message events
   - Add the required scopes: `im:message`, `im:message:read`, `im:message:send`
3. Update Environment Variables
   - Add these variables to your `.env` file:
     ```env
     FEISHU_APP_ID=your_app_id
     FEISHU_APP_SECRET=your_app_secret
     ```
4. Start MicroBot
   - Run MicroBot with Feishu integration: `microbot start`
   - The Feishu WebSocket connection will be established automatically using the official SDK
5. Test the Integration
   - Add the bot to a Feishu group or chat
   - Send a message to the bot
   - The bot should respond using the configured AI model
- SDK: Uses the official `@larksuiteoapi/node-sdk` package
- Connection Type: Client-initiated WebSocket connection
- Authentication: Uses Feishu API access tokens
- Heartbeat: Automatic heartbeat to maintain the connection
- Reconnection: Automatic reconnection on failure
- Supported Events: `im.message.receive_v1` (message events)
- Message Processing: All Feishu messages are automatically processed by the same agent logic
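As a rough sketch of the kind of handling involved (not the actual `feishu.ts` implementation), the snippet below extracts plain text from an `im.message.receive_v1` event. The field names follow Feishu's published event schema, where a text message delivers `message.content` as a JSON-encoded string such as `{"text":"hello"}`; the `FeishuMessageEvent` type and `extractText` helper are illustrative assumptions.

```typescript
// Hypothetical helper, for illustration only.
interface FeishuMessageEvent {
  message: {
    message_id: string;
    chat_id: string;
    message_type: string; // e.g. 'text', 'image'
    content: string;      // JSON-encoded payload
  };
}

function extractText(event: FeishuMessageEvent): string | null {
  // Only text messages carry a {"text": ...} payload
  if (event.message.message_type !== 'text') return null;
  try {
    const payload = JSON.parse(event.message.content) as { text?: string };
    return payload.text ?? null;
  } catch {
    return null; // malformed content
  }
}
```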
- WebSocket Connection Failed: Check your App ID and App Secret
- No Response from Bot: Verify your AI model is configured correctly and accessible
- Access Token Error: Check if your app has the correct scopes
- Network Issues: Ensure your server has outbound internet access
```typescript
import { MicroBot } from 'microbot';

// Create a bot instance
const bot = new MicroBot({
  name: 'MyBot',
  version: '1.0.0'
});

// Register custom tools
bot.registerTool('greet', {
  description: 'Greet the user',
  execute: (args) => {
    return `Hello, ${args.name}!`;
  }
});

// Start the bot
await bot.start();
```

```typescript
import { ModelFactory } from 'microbot';

// Create an Ollama client
const ollamaClient = ModelFactory.createClient({
  type: 'ollama',
  host: 'localhost',
  port: 11434,
  model: 'qwen3-vl:8b'
});

// Create a MiniMax client
const minimaxClient = ModelFactory.createClient({
  type: 'minimax',
  model: 'abab6.5-chat',
  apiKey: 'your_api_key',
  baseUrl: 'https://api.minimaxi.com/v1'
});

// Create an OpenAI-compatible client
const openaiClient = ModelFactory.createClient({
  type: 'openai',
  model: 'gpt-4',
  apiKey: 'your_api_key',
  baseUrl: 'https://api.openai.com/v1'
});

// Use the client
const response = await ollamaClient.chat({
  messages: [{ role: 'user', content: 'Hello!' }]
});
```

```
microbot/
├── src/                      # Source code
│   ├── agent/                # AI agent core
│   │   ├── tools/            # Tool registry
│   │   ├── context.ts        # Execution context
│   │   ├── loop.ts           # Agent loop with model integration
│   │   ├── memory.ts         # Memory management
│   │   └── skills.ts         # Agent skills
│   ├── api/                  # API clients
│   │   ├── model.ts          # Model interface definition
│   │   ├── model-factory.ts  # Model factory for creating clients
│   │   ├── ollama-adapter.ts # Ollama client adapter
│   │   ├── minimax.ts        # MiniMax API client
│   │   ├── openai-compatible.ts # OpenAI-compatible API client
│   │   ├── feishu.ts         # Feishu (Lark) integration with SDK
│   │   └── websocket.ts      # WebSocket server
│   ├── session/              # Session management
│   │   └── manager.ts        # Session manager
│   ├── utils/                # Utilities
│   │   └── logger.ts         # Logger
│   └── index.ts              # Main entry
├── dist/                     # Build output
├── sessions/                 # Session storage
├── .env                      # Environment configuration
├── microbot.mjs              # CLI entry
├── package.json              # Project config
├── tsconfig.json             # TypeScript config
└── README.md                 # This file
```
```bash
# Clone the repository
git clone <repository-url>
cd microbot

# Install dependencies
pnpm install
```

```bash
# Development mode
pnpm dev

# Build for production
pnpm build

# Start production build
pnpm start

# Run linting
pnpm lint

# Format code
pnpm format

# Run tests
pnpm test
```

- Agent: The main AI agent that processes requests using configurable AI models
- Session: Manages conversation state and history
- Tool: Functions that the agent can call
- Memory: Stores and retrieves information
- Skill: Pre-defined capabilities of the agent
- Model Client: Unified interface for communicating with different AI providers
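To make the Tool concept concrete, here is a small sketch of tools conforming to the `Tool` interface from the API reference below; `execute` may be synchronous or asynchronous. The `timeTool` and `addTool` names are illustrative, not built-ins.

```typescript
// Matches the Tool interface from the API reference; the tools themselves
// are hypothetical examples.
interface Tool {
  description: string;
  execute: (args: Record<string, any>) => Promise<any> | any;
}

// A synchronous tool
const timeTool: Tool = {
  description: 'Return the current time as an ISO string',
  execute: () => new Date().toISOString(),
};

// An asynchronous tool
const addTool: Tool = {
  description: 'Add two numbers',
  execute: async (args) => Number(args.a) + Number(args.b),
};
```

Either form can be passed to `bot.registerTool(name, tool)` as shown in the usage example earlier.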
```typescript
class MicroBot {
  constructor(options: MicroBotOptions);
  registerTool(name: string, tool: Tool);
  start(): Promise<void>;
  stop(): Promise<void>;
}
```

```typescript
interface Tool {
  description: string;
  execute: (args: Record<string, any>) => Promise<any> | any;
}
```

```typescript
interface ModelClient {
  chat(request: ModelRequest): Promise<ModelResponse>;
  stream(request: ModelRequest): AsyncIterable<StreamChunk>;
}
```

```typescript
interface ModelRequest {
  messages: Message[];
  model?: string;
  stream?: boolean;
  options?: {
    temperature?: number;
    max_tokens?: number;
  };
}
```

```typescript
interface ModelResponse {
  content: string;
  model: string;
  usage?: {
    prompt_tokens: number;
    completion_tokens: number;
    total_tokens: number;
  };
}
```

The framework provides a unified interface for multiple AI providers:
- Ollama: Local model inference with support for various models
- MiniMax: Cloud-based AI models with high performance
- OpenAI-Compatible: Any API that follows OpenAI's format
All model clients implement the same ModelClient interface, making it easy to switch between providers.
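The sketch below shows why this matters for calling code: anything written against `ModelClient` works with every provider. `EchoClient` is a stand-in used for illustration, not a real provider; the interfaces are restated inline so the example is self-contained.

```typescript
// Interfaces as defined in the API reference above (simplified inline).
interface Message { role: string; content: string; }
interface ModelRequest { messages: Message[]; model?: string; }
interface ModelResponse { content: string; model: string; }
interface StreamChunk { content: string; }

interface ModelClient {
  chat(request: ModelRequest): Promise<ModelResponse>;
  stream(request: ModelRequest): AsyncIterable<StreamChunk>;
}

// A fake provider that just echoes the last user message.
class EchoClient implements ModelClient {
  async chat(request: ModelRequest): Promise<ModelResponse> {
    const last = request.messages[request.messages.length - 1];
    return { content: `echo: ${last.content}`, model: 'echo' };
  }
  async *stream(request: ModelRequest): AsyncIterable<StreamChunk> {
    const { content } = await this.chat(request);
    for (const word of content.split(' ')) yield { content: word };
  }
}

// Provider-agnostic caller: accepts any ModelClient (Ollama, MiniMax, ...).
async function ask(client: ModelClient, prompt: string): Promise<string> {
  const res = await client.chat({ messages: [{ role: 'user', content: prompt }] });
  return res.content;
}
```

Swapping `EchoClient` for a client from `ModelFactory.createClient(...)` requires no change to `ask`.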
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create a new branch (`git checkout -b feature/AmazingFeature`)
- Make your changes
- Run linting and tests (`pnpm lint && pnpm test`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Based on nanobot architecture
- Built with TypeScript and Node.js
- Powered by Ollama, MiniMax, and OpenAI-compatible APIs for AI inference
- Feishu SDK integration for seamless messaging
- Open-source technologies
Happy Bot Building! 🤖