```
 ____  _     _       _        _    _ 
|  _ \(_)___| | ____| |      / \  (_)
| | | | / __| |/ / _` |     / _ \ | |
| |_| | \__ \   <| (_| |   / ___ \ | |
|____/|_|___/_|\_\__,_|   /_/   \_\|_|
```
THE MISSING DRIVE FOR AI
Persistent storage infrastructure for AI agents and LLMs.
LLMs are stateless. They forget everything between sessions. Your agent artifacts vanish. Your workflows disappear. You're stuck copy-pasting documents into every chat.
diskd.ai fixes this. Think of it as the D: drive for your AI - the persistent layer where knowledge survives model crashes and context resets.
| Feature | Description |
|---|---|
| MCP Server | Model Context Protocol server for Claude, GPT, and any MCP-compatible agent |
| Unix-Style Tools | `ls`, `glob`, `grep`, `cat`, `vsearch`, `biquery` - pipe-friendly and composable |
| Agent-Native CLI | Works with Claude Code, Cursor, Cline - no SDK required |
| Auto-Indexing | Automatic vector embeddings + full-text indexing on upload |
| 3-Way Hybrid Search | Semantic (vsearch) + keyword (grep) + BI queries (biquery) |
| 20+ File Processors | PDF, DOCX, XLSX, images, audio, video, YouTube, GitHub repos |
| DriveDB | Create queryable SQL databases directly in the drive |
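To see what "3-way hybrid search" can mean in practice, here is a minimal sketch of merging ranked results from semantic, keyword, and SQL-style queries using reciprocal rank fusion (RRF) - a common fusion technique, shown here for illustration only; it is not necessarily how diskd combines results, and the file names are made up.

```python
def rrf_merge(result_lists, k=60):
    """Fuse ranked lists of document IDs into a single ranking.

    Each document scores sum(1 / (k + rank)) over every list it
    appears in; documents ranked well in several lists rise to the top.
    """
    scores = {}
    for results in result_lists:
        for rank, doc_id in enumerate(results, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical ranked results from the three search modes:
vsearch = ["contract_a.pdf", "contract_b.pdf", "invoice_7.pdf"]
grep    = ["contract_b.pdf", "contract_a.pdf"]
biquery = ["invoice_7.pdf", "contract_b.pdf"]

print(rrf_merge([vsearch, grep, biquery]))
# → ['contract_b.pdf', 'contract_a.pdf', 'invoice_7.pdf']
```

`contract_b.pdf` wins because it appears in all three lists, even though it tops only one of them - the intuition behind combining the three search modes.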
```bash
# Upload your files
diskd upload ./contracts/*.pdf

# Start the MCP server
diskd mcp serve --port 3001

# Query from any agent
diskd vsearch "payment terms" /contracts
diskd grep "NET 30" /contracts
diskd biquery "SELECT * FROM data WHERE amount > 1000"
```

- Model-Agnostic - Works with Claude, GPT, Llama, or any LLM
- Multi-Agent Collaboration - Multiple agents share projects without conflicts
- S3-Native - AWS, MinIO, OVH, Backblaze - any S3-compatible storage
- On-Prem or Cloud - Docker Compose, Kubernetes, or one-click AWS deploy
- Zero Vendor Lock-in - Your data stays yours
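One possible self-hosted layout, as a Docker Compose sketch: diskd alongside MinIO as the S3-compatible backing store. The `diskd/diskd` image name and the environment variable names are assumptions for illustration, not taken from diskd's documentation; only the port comes from the quick start above.

```yaml
# Hypothetical docker-compose.yml -- image and variable names are illustrative.
services:
  diskd:
    image: diskd/diskd:latest      # assumed image name
    ports:
      - "3001:3001"                # MCP server port from the quick start
    environment:
      S3_ENDPOINT: http://minio:9000   # assumed variable names
      S3_BUCKET: diskd-data
    depends_on:
      - minio
  minio:
    image: minio/minio
    command: server /data
    volumes:
      - minio-data:/data
volumes:
  minio-data:
```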
| Metric | Value |
|---|---|
| Storage cost | $0.023/GB/month |
| Search latency | 10ms p99 |
| Ingest rate | 100K docs/second |
| Vendor lock-in | 0 |
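Many MCP-compatible clients register servers through a JSON config file. The entry below is a sketch that reuses the quick-start command verbatim; the exact registration format for your client, and whether diskd needs different arguments when launched this way, should be checked against its docs.

```json
{
  "mcpServers": {
    "diskd": {
      "command": "diskd",
      "args": ["mcp", "serve", "--port", "3001"]
    }
  }
}
```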
Like Dropbox, but for your AI agents.