AI-powered CLI: explain code, generate shell commands, use any LLM provider.
Built in Rust. Works on macOS, Linux, and Windows.
```sh
$ niko cmd "find all files larger than 100MB"
find . -type f -size +100M
Copied to clipboard

$ cat main.rs | niko explain
42 lines analyzed · completed in 2.1s

## Overview
...
```

- Three Modes – `cmd`, `explain`, `settings`
- Dynamic LLM Providers – Any OpenAI-compatible API, Claude, or local Ollama
- Dynamic Model Selection – Fetches available models from the API, no hardcoded lists
- RAM-Based Restrictions – Prevents selecting models too large for your hardware
- Auto-Install Ollama – Installs Ollama automatically if not present
- Smart Code Chunking – Splits large files at function boundaries with context memory between chunks
- Automatic Retry – Exponential backoff for transient failures (timeouts, rate limits, 5xx errors)
- Connection Pooling – Keep-alive HTTP connections for fast sequential LLM calls
- Command Generation – Natural language → shell commands, auto-copied to clipboard
- Safety Warnings – Flags dangerous commands before execution
- Cross-Platform – macOS, Linux (Ubuntu/Debian/etc.), Windows
```sh
# macOS / Linux
curl -fsSL https://raw.githubusercontent.com/rgcsekaraa/niko-cli/main/install.sh | sh
```

```powershell
# Windows (PowerShell)
iwr -useb https://raw.githubusercontent.com/rgcsekaraa/niko-cli/main/install.ps1 | iex
```

```sh
# Install latest version from git
cargo install --git https://github.com/rgcsekaraa/niko-cli

# Or install from local source
cargo install --path .
```

```sh
# First run → interactive setup wizard
niko settings configure
```

This will:
- Show available providers (Ollama, OpenAI, Claude, DeepSeek, Grok, Groq, Mistral, Together, OpenRouter, or custom)
- For Ollama: auto-install if needed → list local models → show downloadable models filtered by your RAM → let you pick
- For API providers: ask for API key → fetch available models from the API → let you pick
- Save everything to `~/.niko/config.yaml`
```sh
$ niko cmd "find python files modified today"
find . -name "*.py" -mtime 0
Copied to clipboard

$ niko cmd "kill process on port 3000"
$ niko cmd "compress logs folder to tar.gz"
$ niko cmd "git commits from last week"
$ niko cmd "show disk usage by directory"
```

```sh
# From a file
niko explain -f src/main.rs

# Pipe code in
cat complex_module.py | niko explain

# Paste interactively (live line counter, Ctrl-D or two empty lines to finish)
niko explain
```

For large files, Niko:
- Chunks code at function/block boundaries (max 200 lines/chunk)
- Carries context – each chunk includes overlapping lines and a running summary from previous chunks
- Retries failed LLM calls with exponential backoff (3 attempts, 500ms → 4s delay)
- Synthesises chunk analyses into an overall summary with follow-up questions
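The boundary-aware splitting described above can be sketched as follows. This is an illustrative reconstruction, not Niko's actual code: the function name `chunk_lines` and the specific boundary keywords are assumptions.

```rust
/// Split source lines into chunks of at most `max_lines`, preferring to cut
/// just before a line that looks like a top-level boundary (`fn`, `def`,
/// `class`), and prepending `overlap` lines from the previous chunk so the
/// LLM sees some context across the boundary.
fn chunk_lines(lines: &[&str], max_lines: usize, overlap: usize) -> Vec<Vec<String>> {
    let mut chunks = Vec::new();
    let mut start = 0;
    while start < lines.len() {
        let hard_end = (start + max_lines).min(lines.len());
        let mut end = hard_end;
        // Only search for a nicer cut point if we are not at the end of file.
        if hard_end < lines.len() {
            for i in (start + 1..hard_end).rev() {
                let t = lines[i].trim_start();
                if t.starts_with("fn ") || t.starts_with("def ") || t.starts_with("class ") {
                    end = i; // cut so the next chunk starts at this function
                    break;
                }
            }
        }
        // Carry the tail of the previous chunk into this one for continuity.
        let ctx_start = if start == 0 { 0 } else { start.saturating_sub(overlap) };
        chunks.push(lines[ctx_start..end].iter().map(|s| s.to_string()).collect());
        start = end;
    }
    chunks
}
```

The running-summary part (feeding each chunk's analysis into the next prompt) would sit on top of this, outside the splitter itself.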
```sh
# Interactive setup wizard
niko settings configure

# Show current config
niko settings show

# Set a value directly
niko settings set openai.api_key sk-xxx
niko settings set openai.model gpt-4o
niko settings set active_provider openai

# Reset to defaults
niko settings init

# Print config path
niko settings path
```

```sh
niko cmd "list files" --provider openai
niko explain -f main.rs --provider claude
```

Niko is designed for production use with reliability and speed:
| Feature | Details |
|---|---|
| Streaming | Tokens appear immediately as the LLM generates them (all providers) |
| Retry | 3 attempts with exponential backoff (500ms β 2s + jitter) |
| Retryable errors | Timeouts, connection resets, 429/5xx, rate limits, model loading |
| Connection pooling | HTTP keep-alive, 4 idle connections/host, TCP keepalive 30s |
| Model keep-alive | Ollama keeps model in VRAM for 30 min (no reload between calls) |
| Flash attention | Enabled by default for Ollama (faster on Apple Silicon / GPU) |
| Adaptive tokens | cmd mode uses 512 max tokens, explain uses 4096 – less KV cache for short tasks |
| Adaptive context | Ollama context window scales with prompt size (4K → 16K) |
| Empty response guard | Detects and retries empty/null LLM responses |
| Truncation detection | Warns when response hits max_tokens (Claude, OpenAI) |
| Context memory | Multi-chunk explanations carry 10-line code overlap for boundary continuity |
| Structured errors | Parses API error responses for clear, actionable messages |
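The retry row in the table (3 attempts, 500ms base, 2s cap) can be sketched as a simple backoff schedule. This is illustrative only; the function names and the exact constants beyond what the table states are assumptions, and jitter is left to the caller.

```rust
use std::time::Duration;

/// Delay before retry attempt `attempt` (0-based): 500ms base, doubled each
/// attempt, capped at 2s. Jitter would be added on top by the caller.
fn backoff_delay(attempt: u32) -> Duration {
    const BASE_MS: u64 = 500;
    const CAP_MS: u64 = 2_000;
    let ms = BASE_MS.saturating_mul(1u64 << attempt.min(8));
    Duration::from_millis(ms.min(CAP_MS))
}

/// Whether an HTTP status is worth retrying: rate limits and server errors,
/// matching the "Retryable errors" row above.
fn is_retryable(status: u16) -> bool {
    status == 429 || (500..=599).contains(&status)
}
```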
| Provider | Type | How to set up |
|---|---|---|
| Ollama | Local (free) | Auto-installed, models downloaded on demand |
| OpenAI | API | niko settings configure → select OpenAI → enter key |
| Claude | API | niko settings configure → select Claude → enter key |
| DeepSeek | API | niko settings configure → select DeepSeek → enter key |
| Grok | API | niko settings configure → select Grok → enter key |
| Groq | API | niko settings configure → select Groq → enter key |
| Mistral | API | niko settings configure → select Mistral → enter key |
| Together | API | niko settings configure → select Together → enter key |
| OpenRouter | API | niko settings configure → select OpenRouter → enter key |
| Custom | API | niko settings configure → choose "Custom" → enter URL + key |
All API providers fetch models dynamically from their /models endpoint – nothing is hardcoded.
API keys can also be set via environment variables:
```sh
export OPENAI_API_KEY=sk-xxx
export ANTHROPIC_API_KEY=sk-ant-xxx
export DEEPSEEK_API_KEY=xxx
export GROK_API_KEY=xxx
export GROQ_API_KEY=xxx
export TOGETHER_API_KEY=xxx
export MISTRAL_API_KEY=xxx
export OPENROUTER_API_KEY=xxx
```

For local models (Ollama), Niko estimates the maximum model size your system can handle:
| System RAM | Max Model Size |
|---|---|
| 8 GB | ~4B parameters |
| 16 GB | ~12B parameters |
| 32 GB | ~28B parameters |
| 64 GB | ~60B parameters |
Models exceeding your RAM limit are hidden from the selection list. You can still force-select them with a confirmation prompt.
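The table is consistent with reserving roughly 4 GB for the OS and allowing about one billion parameters per remaining gigabyte. This is a reconstruction from the numbers above, not Niko's actual heuristic:

```rust
/// Rough cap on model size (billions of parameters) for a given amount of
/// system RAM, reserving ~4 GB for the OS and other processes.
/// 8 GB -> ~4B, 16 GB -> ~12B, 32 GB -> ~28B, 64 GB -> ~60B.
fn max_model_params_b(ram_gb: u64) -> u64 {
    ram_gb.saturating_sub(4)
}
```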
All settings are stored in `~/.niko/config.yaml`. The file uses a dynamic structure – providers are a map, so you can add as many as you want:
```yaml
active_provider: openai
providers:
  ollama:
    kind: ollama
    base_url: http://127.0.0.1:11434
    model: qwen2.5-coder:7b
  openai:
    kind: openai_compat
    api_key: sk-xxx
    base_url: https://api.openai.com/v1
    model: gpt-4o
  claude:
    kind: anthropic
    api_key: sk-ant-xxx
    model: claude-sonnet-4-20250514
```

```sh
# Uninstall
rm $(which niko)
rm -rf ~/.niko
```

MIT