Welcome to Chainlit 👋

Build production-ready conversational AI applications in Python in minutes, not weeks ⚡️


⚠️ Notice: Chainlit is now community-maintained.

As of May 1st 2025, the original Chainlit team has stepped back from active development. The project is maintained by @Chainlit/chainlit-maintainers under a formal Maintainer Agreement.

Maintainers are responsible for code review, releases, and security.
Chainlit SAS provides no warranties on future updates.

Want to help maintain? Apply here →

Website · Documentation · Chainlit Help · Cookbook


(Demo video: overview-chainlit.mp4)

Installation

Open a terminal and run:

pip install chainlit
chainlit hello

If this opens the hello app in your browser, you're all set!

Bootstrap the local monorepo

Clone this repository and populate the shared .env file before running any of the smoke tests:

python3 scripts/start_local.py --smoke-test

The helper script mirrors docs/local-setup.md by:

  1. Reading the credential inventory from .env.example so API keys remain DRY across developers and CI.
  2. Reusing any values already set in .env or exported in your shell, prompting only for the remainder (press Enter to keep documented defaults or leave them blank). Pass --non-interactive to skip prompts entirely in automation while still surfacing any empty secrets at the end.
  3. Running scripts/smoke_test.py when invoked with --smoke-test, so the repository matches the documented workflow across local shells, GitHub Actions, and Cloud Build. The helper waits for chainlit hello --ci --headless to report readiness, verifies that it serves HTTP, and tears it down automatically so automation never hangs. If lockfile drift is detected, it falls back to pnpm install / uv sync --extra mypy without --frozen and reminds you to reconcile the lockfiles so mypy stubs remain available for Husky hooks.
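The credential-merging behavior in step 2 can be sketched as follows. This is an illustrative sketch, not the actual implementation in scripts/start_local.py: it assumes simple KEY=VALUE lines and shows the precedence order (shell environment, then .env, then the documented default from .env.example).

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines, skipping blanks and comments."""
    result = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        result[key.strip()] = value.strip()
    return result


def merge_credentials(
    example_text: str, env_text: str, shell_env: dict[str, str]
) -> dict[str, str]:
    """For each key in .env.example, prefer the shell value,
    then the existing .env value, then the documented default."""
    defaults = parse_env(example_text)
    existing = parse_env(env_text)
    return {
        key: shell_env.get(key) or existing.get(key) or default
        for key, default in defaults.items()
    }
```

Keys left empty by every source would then be the ones the helper prompts for (or reports at the end under --non-interactive).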

On GCP we recommend storing the same variables in Secret Manager and mounting them into Cloud Run/Cloud Functions to avoid duplicating secrets per environment. After generating .env, mirror it to Secret Manager so automation stays DRY:

python3 scripts/sync_env_to_gcp.py --create

The sync helper writes the exact .env content documented above, enabling Cloud Build (see cloudbuild/smoke-test.yaml) and GitHub Actions to run the shared smoke test without redefining secrets.
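Under the hood, mirroring a .env file into Secret Manager can be done with the gcloud CLI; the secret name chainlit-env below is illustrative, and scripts/sync_env_to_gcp.py may use different naming.

```shell
# First run: create the secret from the generated .env file
gcloud secrets create chainlit-env \
  --replication-policy=automatic \
  --data-file=.env

# Subsequent runs: push a new version instead of recreating the secret
gcloud secrets versions add chainlit-env --data-file=.env
```

Cloud Run and Cloud Functions can then mount the secret as a file or environment variables, so each environment reads the same canonical values.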

🗂️ Documentation inventory

| File | Purpose |
| --- | --- |
| README.md | High-level project overview and quickstart commands. |
| docs/local-setup.md | Minimal smoke test to validate local installs using shared .env values and reproducible package managers. |
| docs/llms.txt | Checklist of LLM provider environment variables to keep in centralized secrets (locally via .env, in production via GCP Secret Manager). |
| AGENTS.md | Guidance for AI contributors to keep workflows DRY and aligned with GCP deployment practices. |

Development version

The latest in-development version can be installed straight from GitHub with:

pip install git+https://github.com/Chainlit/chainlit.git#subdirectory=backend/

(Requires Node and pnpm installed on the system.)

🚀 Quickstart

🐍 Pure Python

Create a new file demo.py with the following code:

import chainlit as cl


@cl.step(type="tool")
async def tool():
    # Fake tool
    await cl.sleep(2)
    return "Response from the tool!"


@cl.on_message
async def main(message: cl.Message):
    """
    Called every time a user sends a message in the UI.
    It sends back an intermediate response from the tool, followed by the final answer.

    Args:
        message: The user's message.
    """
    # Call the tool and relay its response to the user
    tool_res = await tool()

    await cl.Message(content=tool_res).send()

Now run it!

chainlit run demo.py -w


📚 More Examples - Cookbook

You can find various examples of Chainlit apps here that leverage tools and services such as OpenAI, Anthropic, LangChain, LlamaIndex, ChromaDB, Pinecone and more.

🌟 Google Gemini, Vertex AI & ADK support

Chainlit now ships with first-class support for the official Google GenAI SDK so you can build Gemini or Vertex AI agents without additional glue code. Install google-genai (or the legacy google-generativeai) alongside your Chainlit app and call:

import os

import chainlit as cl

cl.instrument_google_genai()

from google import genai

client = genai.Client(
    api_key=os.environ.get("GEMINI_API_KEY"),
    # or, for Vertex AI:
    # vertexai=True, project=os.environ["VERTEX_PROJECT_ID"], location="us-central1"
)


@cl.on_message
async def on_message(message: cl.Message):
    response = await client.aio.models.generate_content(
        model="gemini-1.5-flash",
        contents=message.content,
    )
    await cl.Message(content=response.text).send()

Every SDK call (including Agent Development Kit client.agents.* helpers) automatically appears as an LLM step inside the Chainlit UI, capturing prompts, outputs, timing information and metadata. Require GEMINI_API_KEY, VERTEX_PROJECT_ID, or other credentials from your end users via config.toml's user_env list to collect them securely at runtime.
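Collecting those credentials from end users looks like this in the project's Chainlit config file (list only the variables your app actually needs):

```toml
# .chainlit/config.toml
[project]
# Each listed variable is requested from the user when they open the app
user_env = ["GEMINI_API_KEY"]
```

Values entered this way are scoped to the user's session rather than stored in your deployment's environment.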

Tell us what you would like to see added in Chainlit via GitHub issues or on Discord.

💁 Contributing

As an open-source initiative in a rapidly evolving domain, we welcome contributions, be it through the addition of new features or the improvement of documentation.

For detailed information on how to contribute, see here.

📃 License

Chainlit is open-source and licensed under the Apache 2.0 license.
