# multi-provider

Here are 41 public repositories matching this topic...

Versatile Python Telegram chatbot providing a unified interface to multiple LLMs (OpenAI, Gemini, Claude, Grok, Qwen, Mistral, DeepSeek) and image-generation models (DALL-E, Gemini, Grok). Features context memory, public/private modes, easy Docker deployment (amd64/arm64), and low resource usage.

  • Updated Dec 27, 2025
  • Python
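The entry above describes a unified interface over several LLM providers. A minimal sketch of how such routing could look (all class and method names here are illustrative, not the repo's actual API):

```python
# Sketch of a unified chat interface over multiple LLM providers.
# Provider/class names are hypothetical; real backends would call
# each vendor's API instead of echoing.

class Provider:
    """Minimal common interface each backend implements."""
    name = "base"

    def chat(self, prompt: str) -> str:
        raise NotImplementedError


class EchoOpenAI(Provider):
    name = "openai"

    def chat(self, prompt: str) -> str:
        # A real implementation would call the OpenAI API here.
        return f"[openai] {prompt}"


class EchoClaude(Provider):
    name = "claude"

    def chat(self, prompt: str) -> str:
        # A real implementation would call the Anthropic API here.
        return f"[claude] {prompt}"


class MultiLLM:
    """Routes each request to the provider the user selected."""

    def __init__(self, providers):
        self._providers = {p.name: p for p in providers}

    def chat(self, provider: str, prompt: str) -> str:
        return self._providers[provider].chat(prompt)


bot = MultiLLM([EchoOpenAI(), EchoClaude()])
print(bot.chat("claude", "hello"))  # → [claude] hello
```

The single `MultiLLM.chat` entry point is what lets features like context memory and mode switching stay provider-agnostic.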

🚀 Intelligent Claude Code status line with multi-provider AI support, real-time token counting, and universal model compatibility. Supports Claude (Sonnet 4: 1M, 3.5: 200K), OpenAI (GPT-4.1: 1M, 4o: 128K), Gemini (1.5 Pro: 2M, 2.x: 1M), and xAI Grok (3: 1M, 4: 256K) with verified 2025 context limits.

  • Updated Sep 17, 2025
  • Shell
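The status-line entry above pairs real-time token counting with per-model context limits. A minimal sketch of that lookup, using the limits quoted in the entry (the helper name is hypothetical, not the tool's actual interface):

```python
# Context-window limits (tokens) quoted in the entry above.
CONTEXT_LIMITS = {
    "claude-sonnet-4": 1_000_000,
    "claude-3.5": 200_000,
    "gpt-4.1": 1_000_000,
    "gpt-4o": 128_000,
    "gemini-1.5-pro": 2_000_000,
    "grok-4": 256_000,
}

def usage_pct(model: str, tokens_used: int) -> float:
    """Percent of the model's context window consumed."""
    return 100.0 * tokens_used / CONTEXT_LIMITS[model]

print(f"{usage_pct('gpt-4o', 32_000):.1f}%")  # → 25.0%
```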

AI-Worker-Proxy

OpenAI-compatible AI proxy: Anthropic Claude, Google Gemini, GPT-5, Cloudflare AI. Free hosting, automatic failover, token rotation. Deploy in 1 minute.

  • Updated Dec 24, 2025
  • TypeScript
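The proxy above advertises automatic failover across backends. A hedged sketch of that pattern (the backend callables are placeholders, not the project's actual code):

```python
# Sketch of automatic failover across interchangeable backends,
# as described above; the backends here are stand-in callables.

def chat_with_failover(backends, prompt):
    """Try each backend in order; return the first successful reply."""
    errors = []
    for call in backends:
        try:
            return call(prompt)
        except Exception as exc:  # a real proxy would catch HTTP/timeout errors
            errors.append(exc)
    raise RuntimeError(f"all backends failed: {errors}")


def flaky_backend(prompt):
    raise ConnectionError("primary down")


def stable_backend(prompt):
    return f"ok: {prompt}"


print(chat_with_failover([flaky_backend, stable_backend], "ping"))  # → ok: ping
```

Token rotation follows the same shape: cycle through credentials instead of endpoints when a call is rejected.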
