Amux

Bidirectional LLM API Adapter - A unified infrastructure for converting between different LLM provider APIs

License: MIT · TypeScript · pnpm

🌟 Features

  • 🔄 Bidirectional Conversion: Convert between any LLM provider API formats
  • 🎯 Type-Safe: Full TypeScript support with comprehensive type definitions
  • 🔌 Extensible: Easy to add custom adapters for new providers
  • ⚡ Zero Dependencies: Core package has zero runtime dependencies
  • 🧪 Well-Tested: High test coverage with comprehensive test suites
  • 📦 Tree-Shakable: Optimized for modern bundlers
  • 🚀 8 Official Adapters: OpenAI, Anthropic, DeepSeek, Moonshot, Zhipu, Qwen, Gemini, MiniMax

🚀 Quick Start

Installation

# Install core package and adapters you need
pnpm add @amux.ai/llm-bridge @amux.ai/adapter-openai @amux.ai/adapter-anthropic

Basic Usage

import { createBridge } from '@amux.ai/llm-bridge'
import { openaiAdapter } from '@amux.ai/adapter-openai'
import { anthropicAdapter } from '@amux.ai/adapter-anthropic'

// Create a bridge: OpenAI format in → Anthropic API out
const bridge = createBridge({
  inbound: openaiAdapter,
  outbound: anthropicAdapter,
  config: {
    apiKey: process.env.ANTHROPIC_API_KEY,
    baseURL: 'https://api.anthropic.com'
  }
})

// Send OpenAI-format request, get OpenAI-format response
// But actually calls Claude API under the hood
const response = await bridge.chat({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello!' }]
})

console.log(response.choices[0].message.content)

📦 Packages

| Package | Description | Version | Status |
| --- | --- | --- | --- |
| @amux.ai/llm-bridge | Core IR and adapter interfaces | - | ✅ Stable |
| @amux.ai/adapter-openai | OpenAI adapter | - | ✅ Stable |
| @amux.ai/adapter-anthropic | Anthropic (Claude) adapter | - | ✅ Stable |
| @amux.ai/adapter-deepseek | DeepSeek adapter | - | ✅ Stable |
| @amux.ai/adapter-moonshot | Moonshot (Kimi) adapter | - | ✅ Stable |
| @amux.ai/adapter-zhipu | Zhipu AI (GLM) adapter | - | ✅ Stable |
| @amux.ai/adapter-qwen | Qwen adapter | - | ✅ Stable |
| @amux.ai/adapter-google | Google Gemini adapter | - | ✅ Stable |
| @amux.ai/adapter-minimax | MiniMax adapter | - | ✅ Stable |
| @amux.ai/utils | Shared utilities | - | ✅ Stable |

🏗️ Architecture

┌─────────────────────────────────────────────────────────┐
│                    Your Application                      │
└────────────────────┬────────────────────────────────────┘
                     │ OpenAI Format Request
                     ▼
┌─────────────────────────────────────────────────────────┐
│                   Inbound Adapter                        │
│              (Parse OpenAI → IR)                         │
└────────────────────┬────────────────────────────────────┘
                     │ Intermediate Representation (IR)
                     ▼
┌─────────────────────────────────────────────────────────┐
│                      Bridge                              │
│         (Validation & Compatibility Check)               │
└────────────────────┬────────────────────────────────────┘
                     │ IR
                     ▼
┌─────────────────────────────────────────────────────────┐
│                  Outbound Adapter                        │
│              (IR → Build Anthropic)                      │
└────────────────────┬────────────────────────────────────┘
                     │ Anthropic Format Request
                     ▼
┌─────────────────────────────────────────────────────────┐
│                  Anthropic API                           │
└─────────────────────────────────────────────────────────┘
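The pipeline above can be sketched with plain objects. The `IRMessage` shape and the converter names below are illustrative only, not Amux's actual internal types:

```typescript
// Hypothetical IR shape for illustration; Amux's real IR is richer.
interface IRMessage {
  role: 'system' | 'user' | 'assistant'
  text: string
}

// Inbound step: parse an OpenAI-style request into the IR.
function openaiToIR(req: { messages: { role: string; content: string }[] }): IRMessage[] {
  return req.messages.map((m) => ({
    role: m.role as IRMessage['role'],
    text: m.content,
  }))
}

// Outbound step: build an Anthropic-style request from the IR.
// Anthropic carries the system prompt outside the messages array.
function irToAnthropic(ir: IRMessage[]) {
  const system = ir.filter((m) => m.role === 'system').map((m) => m.text).join('\n')
  const messages = ir
    .filter((m) => m.role !== 'system')
    .map((m) => ({ role: m.role, content: m.text }))
  return { system, messages }
}

const openaiRequest = {
  messages: [
    { role: 'system', content: 'Be concise.' },
    { role: 'user', content: 'Hello!' },
  ],
}
const anthropicRequest = irToAnthropic(openaiToIR(openaiRequest))
```

Because each adapter only converts to and from the IR, supporting an Nth provider takes one new adapter rather than N pairwise converters.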

🎯 Use Cases

  • Multi-Provider Support: Build applications that work with multiple LLM providers
  • Provider Migration: Easily migrate from one provider to another
  • Cost Optimization: Route requests to different providers based on cost/performance
  • Fallback Strategy: Implement automatic fallback to alternative providers
  • Testing: Test your application with different providers without code changes
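The fallback strategy, for instance, reduces to trying bridges in order. The helper below is a generic sketch, not part of the Amux API; it assumes each provider call is a function that either resolves or throws:

```typescript
// Generic fallback helper (illustrative; not part of @amux.ai/llm-bridge).
// Tries each provider in order and returns the first successful result.
async function withFallback<T>(providers: Array<() => Promise<T>>): Promise<T> {
  let lastError: unknown
  for (const call of providers) {
    try {
      return await call()
    } catch (err) {
      lastError = err // remember the failure and try the next provider
    }
  }
  throw lastError
}
```

With Amux, each entry would be a closure such as `() => bridge1.chat(request)` over a bridge with a different outbound adapter.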

📚 Examples

All Adapters

import { createBridge } from '@amux.ai/llm-bridge'
import { openaiAdapter } from '@amux.ai/adapter-openai'
import { anthropicAdapter } from '@amux.ai/adapter-anthropic'
import { deepseekAdapter } from '@amux.ai/adapter-deepseek'
import { moonshotAdapter } from '@amux.ai/adapter-moonshot'
import { qwenAdapter } from '@amux.ai/adapter-qwen'
import { geminiAdapter } from '@amux.ai/adapter-google'

// OpenAI → Anthropic
const bridge1 = createBridge({
  inbound: openaiAdapter,
  outbound: anthropicAdapter,
  config: { apiKey: process.env.ANTHROPIC_API_KEY }
})

// Anthropic → DeepSeek
const bridge2 = createBridge({
  inbound: anthropicAdapter,
  outbound: deepseekAdapter,
  config: { apiKey: process.env.DEEPSEEK_API_KEY }
})

// Any combination works!

Streaming

const bridge = createBridge({
  inbound: openaiAdapter,
  outbound: anthropicAdapter,
  config: { apiKey: process.env.ANTHROPIC_API_KEY }
})

for await (const chunk of bridge.chatStream({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Tell me a story' }],
  stream: true
})) {
  console.log(chunk)
}
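To assemble the streamed chunks into a full reply, a small collector works. This assumes the chunks follow the OpenAI streaming shape (`choices[0].delta.content`), which matches the OpenAI inbound adapter here but should be verified for your setup:

```typescript
// Assumed OpenAI-style streaming chunk shape (verify against your adapter).
type StreamChunk = { choices: { delta: { content?: string } }[] }

// Concatenates the text deltas from an async stream of chunks.
async function collectStream(chunks: AsyncIterable<StreamChunk>): Promise<string> {
  let text = ''
  for await (const chunk of chunks) {
    text += chunk.choices[0]?.delta?.content ?? ''
  }
  return text
}
```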

Tool Calling

const response = await bridge.chat({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'What is the weather in SF?' }],
  tools: [{
    type: 'function',
    function: {
      name: 'get_weather',
      description: 'Get the current weather',
      parameters: {
        type: 'object',
        properties: {
          location: { type: 'string' }
        },
        required: ['location']
      }
    }
  }]
})
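When the model answers with a tool call, your application executes the named function locally and feeds the result back. Assuming the response arrives in OpenAI format (as the inbound adapter implies), dispatch can be sketched like this; `get_weather` is a local stand-in, not a real API:

```typescript
// Assumed OpenAI-format tool-call shape (verify against your adapter).
type ToolCall = { function: { name: string; arguments: string } }

// Local registry of tool implementations; get_weather is a stub.
const localTools: Record<string, (args: { location: string }) => string> = {
  get_weather: ({ location }) => `Sunny in ${location}`,
}

function runToolCall(call: ToolCall): string {
  const fn = localTools[call.function.name]
  if (!fn) throw new Error(`Unknown tool: ${call.function.name}`)
  // OpenAI-format tool arguments arrive as a JSON string.
  return fn(JSON.parse(call.function.arguments))
}
```

The tool result would then go back to the model as a follow-up message in the next `bridge.chat` call.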

🧪 Testing

# Run all tests
pnpm test

# Run tests for specific package
cd packages/llm-bridge && pnpm test

# Run tests with coverage
pnpm test:coverage

🛠️ Development

# Install dependencies
pnpm install

# Build all packages
pnpm build

# Run example
cd examples/basic && pnpm start

# Type check
pnpm typecheck

# Lint
pnpm lint

📦 Release Process

NPM Packages

For publishing npm packages, use the manual publish workflow:

# 1. Add changeset (describe your changes)
pnpm changeset

# 2. Update versions and generate CHANGELOG
pnpm changeset:version

# 3. Commit and push version updates
git add .
git commit -m "chore: bump package versions"
git push

# 4. Build packages
pnpm --filter "./packages/**" build

# 5. Publish to npm (requires npm login)
pnpm changeset:publish

# 6. Push generated tags
git push --tags

Desktop App

For releasing the Desktop application:

# Use the release script (recommended)
pnpm release

# Or manually create tag
git tag -a desktop-v0.2.1 -m "Release Desktop v0.2.1"
git push origin desktop-v0.2.1

The Desktop release will automatically trigger GitHub Actions to build installers for macOS, Windows, and Linux.

📊 Project Status

MVP Complete!

  • ✅ Core infrastructure
  • ✅ 8 official adapters (OpenAI, Anthropic, DeepSeek, Moonshot, Zhipu, Qwen, Gemini, MiniMax)
  • ✅ Bidirectional conversion
  • ✅ Type-safe TypeScript
  • ✅ Unit tests
  • ✅ Working examples

🗺️ Roadmap

  • Complete streaming support for all adapters
  • Add more unit tests (target: 80%+ coverage)
  • Create documentation site (fumadocs)
  • Add integration tests
  • Publish to npm
  • Add more adapters (community contributions welcome!)

🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

📄 License

MIT © isboyjc

🙏 Acknowledgments

This project is inspired by the excellent work of:


Made with ❤️ by the Amux team
