OpenChat Logo

OpenChat

Open-source AI chat workspace you can self-host or run on OpenChat Cloud

License Stars Forks Issues Pull Requests

Features · Quick Start · Deployment · Docs · Contributing


Overview

OpenChat is a modern, open-source AI chat platform that combines a streaming TanStack Start frontend with Convex for real-time persistence and live sync. It features Better Auth (GitHub OAuth), OpenRouter integration for 100+ AI models, and a Tailwind CSS v4 + shadcn/ui design system.

The monorepo is managed with Turborepo and Bun, keeping the web app, Convex functions, shared packages, and browser extension in lockstep.

Features

🤖 Multi-Model AI Chat

  • 100+ models via OpenRouter
  • Streaming responses with live updates
  • Per-user API key support
  • Dynamic model pricing & cost tracking
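As a rough illustration of what per-request cost tracking involves, here is a minimal sketch. The model name and per-million-token prices below are hypothetical placeholders; in OpenChat, real prices come from OpenRouter's model metadata at runtime.

```typescript
// Hypothetical per-million-token prices in USD (placeholders, not real quotes).
type ModelPricing = { promptPerM: number; completionPerM: number };

const pricing: Record<string, ModelPricing> = {
  "example/model": { promptPerM: 0.15, completionPerM: 0.6 },
};

// Cost of one request = prompt tokens and completion tokens,
// each billed at their own per-million-token rate.
function requestCostUsd(
  model: string,
  promptTokens: number,
  completionTokens: number,
): number {
  const p = pricing[model];
  if (!p) throw new Error(`Unknown model: ${model}`);
  return (
    (promptTokens / 1_000_000) * p.promptPerM +
    (completionTokens / 1_000_000) * p.completionPerM
  );
}

// 2,000 prompt tokens + 500 completion tokens ≈ $0.0006
console.log(requestCostUsd("example/model", 2000, 500));
```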

⚡ Real-Time Sync

  • Convex-powered live data sync
  • Optimistic UI updates
  • Cross-device persistence
  • Offline-ready architecture

🔐 Secure Authentication

  • GitHub OAuth via Better Auth
  • Automatic user sync to Convex
  • Session-aware analytics
  • Encrypted API key storage

🎨 Modern UI/UX

  • Tailwind CSS v4 + shadcn/ui
  • Dark mode support
  • Command palette navigation
  • Responsive design

🔍 Web Search

  • Built-in web search integration
  • Valyu API powered
  • Daily usage limits
  • Search result citations

📦 Self-Hostable

  • Docker Compose ready
  • Dokploy integration
  • Vercel deployment
  • Complete control of your data

Tech Stack

| Layer     | Technologies                                                             |
| --------- | ------------------------------------------------------------------------ |
| Frontend  | TanStack Start (Vite), React 19, TypeScript, Tailwind CSS v4, shadcn/ui  |
| Backend   | Convex (real-time database), Better Auth                                 |
| AI        | OpenRouter (AI SDK 6), 100+ models                                       |
| Tooling   | Bun 1.3+, Turborepo, Vitest, Oxlint                                      |
| Analytics | PostHog, Vercel Analytics                                                |
| DevOps    | Docker, GitHub Actions                                                   |

Repository Structure

openchat/
├── apps/
│   ├── web/              # TanStack Start frontend
│   │   ├── src/routes/   # File-based routing
│   │   ├── src/components/
│   │   └── src/stores/   # Zustand state management
│   ├── server/           # Convex backend
│   │   └── convex/       # Database schema & functions
│   └── extension/        # Browser extension (WXT + React)
├── docs/                 # Documentation
│   └── deployment/       # Docker & Dokploy guides
├── docker/               # Dockerfile images
└── scripts/              # Operational scripts

Quick Start

Prerequisites

  • Bun >= 1.3.0
  • Node.js >= 20 (for tooling)
  • Convex CLI (auto-installed during dev)

Installation

# Clone the repository
git clone https://github.com/opentech1/openchat.git
cd openchat

# Install dependencies
bun install

Configuration

  1. Copy environment templates:

    cp env.web.example apps/web/.env.local
    cp env.server.example apps/server/.env.local
  2. Configure required variables:

    • VITE_CONVEX_URL - Convex deployment URL
    • VITE_CONVEX_SITE_URL - Convex HTTP actions URL
    • GitHub OAuth credentials (in Convex dashboard)
    • BETTER_AUTH_SECRET - Session secret
  3. Optional variables:

    • OPENROUTER_API_KEY - Server API key for free tier
    • VALYU_API_KEY - Web search integration
    • VITE_POSTHOG_KEY - Analytics
    • REDIS_URL - Distributed rate limiting
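A simple way to catch misconfiguration early is a startup check for the required variables. This is a sketch, not part of OpenChat itself; the variable names mirror the list above, and the values passed in the example are dummies.

```typescript
// Required variables from the configuration section above.
const REQUIRED = ["VITE_CONVEX_URL", "VITE_CONVEX_SITE_URL", "BETTER_AUTH_SECRET"];

// Throws with a readable message listing every missing variable,
// instead of failing later with an opaque runtime error.
function assertEnv(env: Record<string, string | undefined>): void {
  const missing = REQUIRED.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(", ")}`);
  }
}

// Dummy values for illustration; in an app you would pass process.env.
assertEnv({
  VITE_CONVEX_URL: "https://example.convex.cloud",
  VITE_CONVEX_SITE_URL: "https://example.convex.site",
  BETTER_AUTH_SECRET: "dev-secret",
});
```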

Development

# Start full development environment
bun dev

# Frontend on http://localhost:3001
# Convex backend runs automatically

Common Commands

| Command           | Description                 |
| ----------------- | --------------------------- |
| `bun dev`         | Start full dev environment  |
| `bun dev:web`     | Frontend only               |
| `bun dev:server`  | Convex backend only         |
| `bun check`       | Lint with Oxlint            |
| `bun check-types` | Type checking               |
| `bun test`        | Run test suite              |
| `bun build`       | Production build            |

Deployment

Vercel (Recommended)

Deploy the frontend to Vercel with Convex Cloud for the backend:

Deploy with Vercel

Docker Compose

# Production deployment
docker compose up -d

See docs/deployment/ for detailed Docker and Dokploy guides.

Rate Limiting

Single Instance (default): In-memory rate limiting, no setup required.

Multi-Instance: Enable Redis for distributed rate limiting:

# Set Redis URL
export REDIS_URL=redis://localhost:6379

# Or with Docker
docker run -d -p 6379:6379 redis:alpine
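To make the single-instance default concrete, here is a minimal sketch of in-memory fixed-window rate limiting. It is illustrative only, not OpenChat's actual limiter; a Redis-backed multi-instance variant would replace the in-process Map with shared Redis counters.

```typescript
// Fixed-window limiter: at most `limit` requests per key per window.
// State lives in process memory, so it only works for a single instance.
class FixedWindowLimiter {
  private hits = new Map<string, { windowStart: number; count: number }>();

  constructor(
    private limit: number, // max requests per window
    private windowMs: number, // window length in milliseconds
  ) {}

  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.hits.get(key);
    // No entry yet, or the previous window expired: start a fresh window.
    if (!entry || now - entry.windowStart >= this.windowMs) {
      this.hits.set(key, { windowStart: now, count: 1 });
      return true;
    }
    if (entry.count < this.limit) {
      entry.count += 1;
      return true;
    }
    return false; // over the limit for this window
  }
}

const limiter = new FixedWindowLimiter(3, 60_000);
const results = [1, 2, 3, 4].map(() => limiter.allow("user-42", 0));
console.log(results); // [ true, true, true, false ]
```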

Documentation

| Document             | Description                  |
| -------------------- | ---------------------------- |
| `ENVIRONMENT.md`     | Environment variables guide  |
| `deployment/`        | Docker & Dokploy setup       |
| `SYNC.md`            | Real-time sync architecture  |
| `CONTRIBUTING.md`    | Contribution guidelines      |
| `CODE_OF_CONDUCT.md` | Community standards          |

Star History

Star History Chart

Sponsors

We're grateful to our sponsors who help make OpenChat possible:

Convex Greptile GitBook Sentry Graphite

Become a Sponsor

Contributors

Thanks to all the amazing people who have contributed to OpenChat!


Contributing

We welcome contributions! Please see our Contributing Guide for details.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Development Guidelines

  • Use TypeScript with strict mode
  • Follow the existing code style (Oxlint enforced)
  • Write tests for new features
  • Update documentation as needed
  • Use conventional commits

Community

Security

Found a security vulnerability? Please report it responsibly by emailing security@openchat.dev or through our security policy.

License

OpenChat is open-source software licensed under the GNU Affero General Public License v3.


Built with ❤️ by the OpenChat community

Star on GitHub
