Multi-provider framework in Elixir
Updated Jul 9, 2025 - Elixir
AI Gateway: Claude Pro, Copilot, Gemini subscriptions → OpenAI/Anthropic/Gemini APIs. No API keys needed.
Terraform module to provision a VPC peering across multiple VPCs in different accounts by using multiple providers
A powerful AI Gateway designed from scratch for AI workloads.
Production-ready Python library for multi-provider LLM orchestration
Multi-cloud control of VM instances across AWS, Azure, GCP, and AliCloud - unified instance management
Easy-to-use multi-provider ASR/speech-to-text and NLP engine
Versatile Python Telegram chatbot providing a unified interface to multiple LLMs (OpenAI, Gemini, Claude, Grok, Qwen, Mistral, Deepseek) & Image Generation models (DALL-E, Gemini, Grok). Features context memory, public/private modes, easy Docker deployment (amd64/arm64), and low resource usage.
Automate DNS updates and rollbacks across multiple providers using DNSControl and GitHub Actions
Python for logic. English for intelligence.
🚀 Intelligent Claude Code status line with multi-provider AI support, real-time token counting, and universal model compatibility. Supports Claude (Sonnet 4: 1M, 3.5: 200K), OpenAI (GPT-4.1: 1M, 4o: 128K), Gemini (1.5 Pro: 2M, 2.x: 1M), and xAI Grok (3: 1M, 4: 256K) with verified 2025 context limits.
One API, every AI model, instant switching. Change from GPT-4 to Gemini to local models with a single config update. LLMForge is the lightweight, TypeScript-first solution for multi-provider AI applications with zero vendor lock-in.
Your Universal AI Coding Agent
Qurio is a fast, polished LLM workspace for multi-provider setups (Gemini, SiliconFlow, OpenAI-compatible, and more to come). Manage your threads and knowledge like a master.
Simple file management via a provider like S3
OpenAI-compatible AI proxy: Anthropic Claude, Google Gemini, GPT-5, Cloudflare AI. Free hosting, automatic failover, token rotation. Deploy in 1 minute.
A flexible agent framework for building AI agents with MCP (Model Context Protocol) integration. Provides core abstractions for LLM and embedding models built on the MCP architecture, specifically to make AI agents easier to build.
Streaming-first multi-provider LLM client in TypeScript with home-made tool calling
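Most of the projects above share one core pattern: a single client interface backed by interchangeable providers, where a one-line config change reroutes every call. A minimal TypeScript sketch of that pattern is below; all names here (`Provider`, the `providers` registry, the mock backends) are illustrative and do not belong to any specific repo listed on this page.

```typescript
// One interface, many interchangeable backends, selected by a config value.
interface Provider {
  name: string;
  complete(prompt: string): string;
}

// Stand-in backends; a real gateway would call each vendor's HTTP API here.
const providers: Record<string, Provider> = {
  openai: { name: "openai", complete: (p) => `[openai] ${p}` },
  gemini: { name: "gemini", complete: (p) => `[gemini] ${p}` },
};

// The "single config update": changing this one value reroutes all traffic.
let config = { provider: "openai" };

function complete(prompt: string): string {
  const backend = providers[config.provider];
  if (!backend) throw new Error(`unknown provider: ${config.provider}`);
  return backend.complete(prompt);
}

console.log(complete("hello")); // routed to the openai backend
config = { provider: "gemini" };
console.log(complete("hello")); // same call, now routed to gemini
```

Failover (as in the proxy projects above) falls out of the same shape: catch the error from one backend and retry the call against the next entry in the registry.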