# Bidirectional LLM API Adapter

A unified infrastructure for converting between different LLM provider APIs.

## Features
- 🔄 Bidirectional Conversion: Convert between any LLM provider API formats
- 🎯 Type-Safe: Full TypeScript support with comprehensive type definitions
- 🔌 Extensible: Easy to add custom adapters for new providers
- ⚡ Zero Dependencies: Core package has zero runtime dependencies
- 🧪 Well-Tested: High test coverage with comprehensive test suites
- 📦 Tree-Shakable: Optimized for modern bundlers
- 🚀 8 Official Adapters: OpenAI, Anthropic, DeepSeek, Moonshot, Zhipu, Qwen, Gemini, MiniMax
## Installation

```bash
# Install the core package and the adapters you need
pnpm add @amux.ai/llm-bridge @amux.ai/adapter-openai @amux.ai/adapter-anthropic
```

## Quick Start

```typescript
import { createBridge } from '@amux.ai/llm-bridge'
import { openaiAdapter } from '@amux.ai/adapter-openai'
import { anthropicAdapter } from '@amux.ai/adapter-anthropic'

// Create a bridge: OpenAI format in → Anthropic API out
const bridge = createBridge({
  inbound: openaiAdapter,
  outbound: anthropicAdapter,
  config: {
    apiKey: process.env.ANTHROPIC_API_KEY,
    baseURL: 'https://api.anthropic.com'
  }
})

// Send an OpenAI-format request, get an OpenAI-format response,
// but actually call the Claude API under the hood
const response = await bridge.chat({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello!' }]
})

console.log(response.choices[0].message.content)
```

## Packages

| Package | Description | Version | Status |
|---|---|---|---|
| @amux.ai/llm-bridge | Core IR and adapter interfaces | - | ✅ Stable |
| @amux.ai/adapter-openai | OpenAI adapter | - | ✅ Stable |
| @amux.ai/adapter-anthropic | Anthropic (Claude) adapter | - | ✅ Stable |
| @amux.ai/adapter-deepseek | DeepSeek adapter | - | ✅ Stable |
| @amux.ai/adapter-moonshot | Moonshot (Kimi) adapter | - | ✅ Stable |
| @amux.ai/adapter-zhipu | Zhipu AI (GLM) adapter | - | ✅ Stable |
| @amux.ai/adapter-qwen | Qwen adapter | - | ✅ Stable |
| @amux.ai/adapter-google | Google Gemini adapter | - | ✅ Stable |
| @amux.ai/adapter-minimax | MiniMax adapter | - | ✅ Stable |
| @amux.ai/utils | Shared utilities | - | ✅ Stable |
## Architecture

```text
┌─────────────────────────────────────────────────────────┐
│                    Your Application                     │
└────────────────────┬────────────────────────────────────┘
                     │ OpenAI Format Request
                     ▼
┌─────────────────────────────────────────────────────────┐
│                     Inbound Adapter                     │
│                   (Parse OpenAI → IR)                   │
└────────────────────┬────────────────────────────────────┘
                     │ Intermediate Representation (IR)
                     ▼
┌─────────────────────────────────────────────────────────┐
│                         Bridge                          │
│           (Validation & Compatibility Check)            │
└────────────────────┬────────────────────────────────────┘
                     │ IR
                     ▼
┌─────────────────────────────────────────────────────────┐
│                    Outbound Adapter                     │
│                  (IR → Build Anthropic)                 │
└────────────────────┬────────────────────────────────────┘
                     │ Anthropic Format Request
                     ▼
┌─────────────────────────────────────────────────────────┐
│                      Anthropic API                      │
└─────────────────────────────────────────────────────────┘
```
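The parse → IR → build flow above can be sketched in plain TypeScript. Everything here is a hypothetical illustration: the `IRRequest` shape, `parseOpenAI`, and `buildAnthropic` are stand-ins, not the library's actual types or exports.

```typescript
// Hypothetical IR: a minimal provider-neutral request shape (illustration only)
interface IRMessage {
  role: 'user' | 'assistant' | 'system'
  text: string
}

interface IRRequest {
  model: string
  messages: IRMessage[]
}

// Inbound adapter step: parse an OpenAI-style request into the IR
function parseOpenAI(req: {
  model: string
  messages: { role: string; content: string }[]
}): IRRequest {
  return {
    model: req.model,
    messages: req.messages.map((m) => ({
      role: m.role as IRMessage['role'],
      text: m.content
    }))
  }
}

// Outbound adapter step: build an Anthropic-style request from the IR
// (Anthropic keeps the system prompt outside the messages array)
function buildAnthropic(ir: IRRequest): {
  model: string
  system?: string
  messages: { role: string; content: string }[]
} {
  const system = ir.messages.find((m) => m.role === 'system')?.text
  return {
    model: ir.model,
    ...(system ? { system } : {}),
    messages: ir.messages
      .filter((m) => m.role !== 'system')
      .map((m) => ({ role: m.role, content: m.text }))
  }
}

// Pipeline: OpenAI format in → IR → Anthropic format out
const out = buildAnthropic(
  parseOpenAI({
    model: 'gpt-4',
    messages: [
      { role: 'system', content: 'Be brief.' },
      { role: 'user', content: 'Hello!' }
    ]
  })
)

console.log(out.system) // 'Be brief.'
```

Because every adapter only converts to and from the IR, adding a new provider never requires pairwise converters.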
## Use Cases

- Multi-Provider Support: Build applications that work with multiple LLM providers
- Provider Migration: Easily migrate from one provider to another
- Cost Optimization: Route requests to different providers based on cost/performance
- Fallback Strategy: Implement automatic fallback to alternative providers
- Testing: Test your application with different providers without code changes
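The fallback idea can be sketched without the library. `withFallback` below is a hypothetical helper that tries a list of chat functions in order and returns the first success; in practice each entry would wrap a `bridge.chat` call, but plain async functions stand in for them here.

```typescript
type ChatFn = (prompt: string) => Promise<string>

// Hypothetical helper: try each provider in order, return the first success
async function withFallback(providers: ChatFn[], prompt: string): Promise<string> {
  let lastError: unknown
  for (const chat of providers) {
    try {
      return await chat(prompt)
    } catch (err) {
      lastError = err // remember the failure and try the next provider
    }
  }
  throw lastError
}

// Stand-ins for bridge.chat calls: the primary fails, the fallback succeeds
const primary: ChatFn = async () => { throw new Error('rate limited') }
const fallback: ChatFn = async (p) => `echo: ${p}`

withFallback([primary, fallback], 'Hello!').then((reply) => {
  console.log(reply) // 'echo: Hello!'
})
```

Because every bridge exposes the same request format, the calling code stays identical no matter which provider ends up handling the request.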
## Examples

### Mix and Match Providers

```typescript
import { createBridge } from '@amux.ai/llm-bridge'
import { openaiAdapter } from '@amux.ai/adapter-openai'
import { anthropicAdapter } from '@amux.ai/adapter-anthropic'
import { deepseekAdapter } from '@amux.ai/adapter-deepseek'
import { moonshotAdapter } from '@amux.ai/adapter-moonshot'
import { qwenAdapter } from '@amux.ai/adapter-qwen'
import { geminiAdapter } from '@amux.ai/adapter-google'

// OpenAI → Anthropic
const bridge1 = createBridge({
  inbound: openaiAdapter,
  outbound: anthropicAdapter,
  config: { apiKey: process.env.ANTHROPIC_API_KEY }
})

// Anthropic → DeepSeek
const bridge2 = createBridge({
  inbound: anthropicAdapter,
  outbound: deepseekAdapter,
  config: { apiKey: process.env.DEEPSEEK_API_KEY }
})

// Any combination works!
```

### Streaming

```typescript
const bridge = createBridge({
  inbound: openaiAdapter,
  outbound: anthropicAdapter,
  config: { apiKey: process.env.ANTHROPIC_API_KEY }
})

for await (const chunk of bridge.chatStream({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Tell me a story' }],
  stream: true
})) {
  console.log(chunk)
}
```

### Tool Calling

```typescript
const response = await bridge.chat({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'What is the weather in SF?' }],
  tools: [{
    type: 'function',
    function: {
      name: 'get_weather',
      description: 'Get the current weather',
      parameters: {
        type: 'object',
        properties: {
          location: { type: 'string' }
        },
        required: ['location']
      }
    }
  }]
})
```

## Testing

```bash
# Run all tests
pnpm test

# Run tests for a specific package
cd packages/llm-bridge && pnpm test

# Run tests with coverage
pnpm test:coverage
```

## Development

```bash
# Install dependencies
pnpm install

# Build all packages
pnpm build

# Run the example
cd examples/basic && pnpm start

# Type check
pnpm typecheck

# Lint
pnpm lint
```

## Publishing

To publish the npm packages, use the manual publish workflow:

```bash
# 1. Add a changeset describing your changes
pnpm changeset

# 2. Update versions and generate CHANGELOGs
pnpm changeset:version

# 3. Commit and push the version updates
git add .
git commit -m "chore: bump package versions"
git push

# 4. Build packages
pnpm --filter "./packages/**" build

# 5. Publish to npm (requires npm login)
pnpm changeset:publish

# 6. Push the generated tags
git push --tags
```

To release the Desktop application:

```bash
# Use the release script (recommended)
pnpm release

# Or create the tag manually
git tag -a desktop-v0.2.1 -m "Release Desktop v0.2.1"
git push origin desktop-v0.2.1
```

Pushing a Desktop release tag automatically triggers GitHub Actions to build installers for macOS, Windows, and Linux.
## Roadmap

✅ MVP Complete!

- ✅ Core infrastructure
- ✅ 8 official adapters (OpenAI, Anthropic, DeepSeek, Moonshot, Zhipu, Qwen, Gemini, MiniMax)
- ✅ Bidirectional conversion
- ✅ Type-safe TypeScript
- ✅ Unit tests
- ✅ Working examples
### Next Steps

- Complete streaming support for all adapters
- Add more unit tests (target: 80%+ coverage)
- Create documentation site (fumadocs)
- Add integration tests
- Publish to npm
- Add more adapters (community contributions welcome!)
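A community adapter only has to convert between its provider's format and the IR, in both directions. The shape below is a hypothetical sketch (the real interface is defined in `@amux.ai/llm-bridge`); the `Adapter` type, `IR` shape, and toy provider are all illustrations.

```typescript
// Hypothetical adapter shape: parse provider format → IR, build IR → provider format.
// The real interface lives in @amux.ai/llm-bridge; this is only an illustration.
interface IR {
  model: string
  prompt: string
}

interface Adapter<ProviderRequest> {
  name: string
  parseRequest(req: ProviderRequest): IR
  buildRequest(ir: IR): ProviderRequest
}

// Toy provider whose API takes { engine, input }
type ToyRequest = { engine: string; input: string }

const toyAdapter: Adapter<ToyRequest> = {
  name: 'toy',
  parseRequest: (req) => ({ model: req.engine, prompt: req.input }),
  buildRequest: (ir) => ({ engine: ir.model, input: ir.prompt })
}

// A useful property to test: provider → IR → provider should be lossless
const original: ToyRequest = { engine: 'toy-1', input: 'hi' }
const roundTripped = toyAdapter.buildRequest(toyAdapter.parseRequest(original))

console.log(roundTripped.engine, roundTripped.input) // 'toy-1 hi'
```

Implementing both directions is what makes an adapter usable on either side of a bridge, as inbound or outbound.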
## Contributing

We welcome contributions! Please see our Contributing Guide for details.
## License

MIT © isboyjc
## Acknowledgements

This project is inspired by the excellent work of:
Made with ❤️ by the Amux team