A terminal-based AI coding assistant that actually works with your files and shell.
Deputy gives you an AI assistant that can:
- Read and write files in your project
- Run shell commands
- Navigate your codebase intelligently
- Remember what you've approved it to do
No copying and pasting code snippets. No switching between terminal and browser. Just tell it what you want and it gets on with it.
Install with cargo:

```bash
cargo install deputy
```

Set your API key:

```bash
export ANTHROPIC_API_KEY=your_key_here
# or
export OPENAI_API_KEY=your_key_here
```

Then run it from your project:

```bash
cd your-project
deputy
```

That's it. Deputy will scan your project and you can start chatting.
```bash
deputy --provider open-ai --model gpt-4o     # Use OpenAI instead
deputy --yolo                                # Skip permission prompts
deputy --base-url http://localhost:8080/v1   # Custom API endpoint
deputy --config ./my-config.md               # Use a custom configuration file

# For Ollama, you need to set OPENAI_API_KEY to some fake value (not an empty string)
deputy --provider open-ai --base-url http://localhost:11434/v1 --model gpt-oss:20b

# The ollama provider respects the OLLAMA_HOST env var and doesn't need OPENAI_API_KEY set
deputy --provider ollama
```

Deputy asks before doing potentially destructive things. You can:
- Approve once
- Remember your choice for similar operations
- Use `--yolo` mode to skip prompts entirely
You can specify a custom configuration file using the `--config` option:

```bash
deputy --config ./path/to/my-config.md
```

When using `--config`, Deputy will read ONLY that file and ignore the default search locations.
If no custom config is specified, Deputy loads configuration files in priority order (first found wins):
1. `DEPUTY.md` in your project root
2. `~/.deputy/DEPUTY.md` for global config
3. `AGENTS.md` in your project root
4. `CLAUDE.md` in your project root
5. `~/.claude/CLAUDE.md` for global config
These files contain instructions that Deputy will follow during your session.
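As an illustration, a project-level `DEPUTY.md` might look like the hypothetical sketch below; the filename and search locations come from the list above, but the contents are just free-form instructions you write yourself:

```markdown
<!-- DEPUTY.md (hypothetical example) -->
# Notes for Deputy

- This is a Rust workspace; run `cargo test` after modifying any crate.
- Do not touch anything under `target/`.
- Ask before running commands that delete files or rewrite git history.
```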
Issues and PRs welcome.
License: MIT


