Works with any model, BYOK, written in Rust
Use your own API keys or even bring your own local models for ultimate privacy control.
Agent fully written in Rust with zero external dependencies - blazingly fast and memory-safe.
True native performance on macOS, Windows, and Linux.
Uses Docker containers for secure command execution and complete isolation.
Support for custom skills to extend agent capabilities. Default skills are: docx, pdf, pptx, xlsx.
Full support for Model Context Protocol (MCP) for seamless tool integration.
- Local & Private: Runs entirely on your machine; API calls go directly to your chosen provider
- BYOK Support: Use your own Anthropic, OpenAI, or local model APIs
- Model Agnostic: Works with Claude, GPT, local models, and more
- Cross-Platform: macOS (ARM & Intel), Windows, and Linux
- Lightweight: ~10 MB app size using Tauri
- Containerized: Docker isolation for enhanced security
- Skills: Extensible skill system for custom capabilities
- MCP: Model Context Protocol support for tool integration
This is still an early project, so please be careful when giving it access to your local folders.
Get up and running in minutes:
A clean release build will be published soon.
- Open Settings (gear icon in sidebar)
- Choose your AI provider:
- Anthropic Claude - Enter your Claude API key
- OpenAI GPT - Enter your OpenAI API key
- Local Models - Configure your Ollama/LM Studio endpoint (see the connectivity check below)
- Select your preferred model (Claude 3.5 Sonnet, GPT-4, etc.)
- Add your API key in the settings
- Keys are stored locally and never shared
- Click "Select Project Path" when creating a new task
- Choose your project folder or workspace directory
- The agent will work within this folder context
- Click "New Task"
- Describe what you want to accomplish
- Watch the AI agent work on your project
- Review the plan and implementation steps
Example tasks:
- "Organize my folders"
- "Read all the receipts and make an expense reports"
- "Summarize the meeting notes and give me all the TODOs."
- Node.js 18+
- Rust (for Tauri)
- Docker Desktop (required for container isolation)
- Tauri Prerequisites
Note: Docker Desktop must be installed and running for container isolation features. Without Docker, the app will still work but commands will run without isolation.
```bash
# Clone the repo
git clone https://github.com/kuse-ai/kuse-cowork.git
cd kuse-cowork

# Install dependencies
npm install

# Run in development mode
npm run tauri dev

# Build for production
npm run tauri build
```
```
kuse-cowork/
├── src/                   # Frontend (SolidJS + TypeScript)
│   ├── components/        # UI components
│   ├── lib/               # Utilities (API clients, MCP)
│   └── stores/            # State management
├── src-tauri/             # Backend (Rust + Tauri)
│   ├── src/               # Rust source code
│   │   ├── agent/         # Agent implementation
│   │   ├── tools/         # Built-in tools
│   │   ├── skills/        # Skills system
│   │   ├── mcp/           # MCP protocol support
│   │   └── database.rs    # Local data storage
│   ├── Cargo.toml         # Rust dependencies
│   └── tauri.conf.json    # Tauri configuration
├── .github/workflows/     # CI/CD for cross-platform builds
└── docs/                  # Documentation and assets
```
Kuse Cowork supports multiple AI providers:
- Anthropic Claude: Direct API integration
- OpenAI GPT: Full GPT model support
- Local Models: Ollama, LM Studio, or any OpenAI-compatible endpoint
- Custom APIs: Configure any compatible endpoint
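
OpenAI, Ollama, LM Studio, and most custom servers expose the same OpenAI-compatible chat API, so switching providers mostly means changing the base URL, key, and model name (Anthropic uses its own API format). A rough sketch of such a call, with placeholder values; this is not the app's internal client:

```ts
// Minimal chat completion against any OpenAI-compatible endpoint.
const baseUrl = "http://localhost:11434/v1"; // e.g. "https://api.openai.com/v1"
const apiKey = "YOUR_API_KEY"; // local servers usually ignore the key
const model = "llama3.1";      // placeholder model name

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

chat("Summarize the meeting notes and list all the TODOs.").then(console.log);
```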
All settings are stored locally and never shared:
- API Configuration: Keys and endpoints for your chosen provider
- Model Selection: Choose from available models
- Agent Behavior: Temperature, max tokens, system prompts
- Security: Container isolation settings
- Skills: Enable/disable custom skills
- MCP Servers: Configure external tool providers
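
The exact MCP server configuration format isn't documented in this README, but MCP servers are typically described by a launch command, arguments, and optional environment variables. The shape below is purely illustrative (hypothetical field names, not the app's actual schema):

```ts
// Hypothetical MCP server entry -- illustrative shape only.
// An MCP server is usually a local process the agent talks to over stdio.
interface McpServerConfig {
  name: string;                  // display name in Settings
  command: string;               // executable that starts the server
  args: string[];                // arguments passed to it
  env?: Record<string, string>;  // optional extra environment variables
}

const filesystemServer: McpServerConfig = {
  name: "filesystem",
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"],
};
```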
Kuse Cowork uses Docker containers to isolate all external command execution:
- Complete isolation from your host system
- Secure networking with controlled access
- Resource limits to prevent abuse
- Clean environments for each execution
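
The containerized execution itself lives in the Rust backend, but conceptually each command runs in a throwaway container with its own workspace mount, restricted networking, and resource caps. The sketch below (Node/TypeScript, for illustration only) shows the kind of `docker run` flags involved; the image name and limits are assumptions, not the app's actual configuration:

```ts
// Conceptual sketch of isolated command execution (the real implementation is in src-tauri).
import { spawn } from "node:child_process";

function runIsolated(workspace: string, cmd: string) {
  const args = [
    "run", "--rm",              // throwaway container, removed when the command finishes
    "--network", "none",        // network disabled in this sketch; controlled access is also possible
    "--memory", "512m",         // resource limits to prevent abuse
    "--cpus", "1",
    "-v", `${workspace}:/work`, // only the selected project folder is mounted
    "-w", "/work",
    "alpine:3",                 // assumed base image
    "sh", "-c", cmd,
  ];
  return spawn("docker", args, { stdio: "inherit" });
}

runIsolated("/path/to/project", "ls -la");
```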
- No telemetry - nothing is sent to our servers
- Local storage - all data stays on your machine
- Direct API calls - communications only with your chosen AI provider
- Open source - full transparency of all code
MIT License - see LICENSE for details.
- Streamlined Release Pipeline - Automated builds and easier distribution
- Simplified Setup - One-click installation for non-developers
- Lightweight Sandbox - Migrate to a lighter-weight sandbox
- Context Engineering - Enhanced support for better context management
- Auto-configuration - Intelligent setup for common development environments
- Mobile Support - Cross-platform mobile app support
- Docker Desktop required for full isolation features
- Manual setup process for development environment
Inspired by:
- Claude Cowork - The original inspiration
⭐ Star this repo if you find it useful!
