A production-ready Fastify 5 boilerplate built on Clean Architecture, CQRS, DDD, and functional programming. Designed as a starting point for real-world applications, the architecture is framework-agnostic at its core – the patterns and boundaries translate to any language or framework.
- Features
- Prerequisites
- Getting Started
- Available Scripts
- Running with Docker
- API Endpoints
- Architecture
- Folder Structure
- OpenTelemetry
- Testing
- Client Types Package
- CI/CD Pipeline
- AI-Assisted Development
- Why Biome over ESLint + Prettier?
- Useful Resources
- Contributing
- License
| Category | Details |
|---|---|
| Runtime | Native TypeScript via Node.js >= 24 type stripping – no build step, no transpiler |
| Framework | Fastify 5 with Awilix DI and Pino logging |
| API | REST (TypeBox schemas, Swagger UI) + GraphQL (Mercurius, GraphiQL in dev) |
| Database | Postgres.js client + DBMate migrations & seeds |
| Security | @fastify/helmet, @fastify/under-pressure for back-pressure |
| Telemetry | Vendor-agnostic OpenTelemetry with auto-instrumentation (disabled by default) |
| Linting | Biome – single tool for linting, formatting, and import sorting |
| Architecture | dependency-cruiser validates layer boundaries at CI time |
| Release | Husky + Commitlint + Semantic Release |
| Client types | REST (OpenAPI) and GraphQL types auto-generated and published to npm on every release |
| Testing | E2E with Cucumber (Gherkin), unit/integration with node:test, load tests with k6 |
| Docker | Production-ready multi-stage Dockerfile (Alpine, non-root, health check) + Docker Compose |
| AI-Ready | AGENTS.md – architecture rules and coding conventions for AI assistants |
| Tool | Notes |
|---|---|
| Node.js | >= 24 – required for native TypeScript execution. A `.nvmrc` is included – run `fnm use` or `nvm use` |
| pnpm | >= 10 – package manager (`corepack enable` to activate) |
| Docker | Used to run PostgreSQL via Docker Compose. Alternatively, use a local Postgres install |
# 1. Scaffold from the template
npx degit marcoturi/fastify-boilerplate my-app
cd my-app
# 2. Install dependencies
pnpm install
# 3. Create your .env file
pnpm create:env # copies .env.example to .env
# 4. Start PostgreSQL (pick one)
docker compose up postgres -d # via Docker Compose
# or use a local Postgres and adjust .env values
# 5. Run database migrations
pnpm db:migrate
# 6. Start the dev server (with watch mode and pretty logs)
pnpm start

The server starts at http://localhost:3000 by default. See API Endpoints for what's available.
| Script | Description |
|---|---|
| `pnpm start` | Start dev server with watch mode and pretty-printed logs |
| `pnpm start:prod` | Start production server (no watch, no pretty-print) |
| `pnpm create:env` | Copy `.env.example` to `.env` (fails if `.env` already exists) |
| Script | Description |
|---|---|
| `pnpm check` | Run lint + format check + type check (use this before committing) |
| `pnpm check:fix` | Same as `check` but auto-fixes lint and format issues |
| `pnpm format` | Auto-format all files with Biome |
| `pnpm lint` | Run Biome linter with auto-fix |
| `pnpm type:check` | TypeScript type checking (`tsc --noEmit`) |
| `pnpm deps:validate` | Validate architecture layer boundaries with dependency-cruiser |
| `pnpm deps:graph` | Generate a dependency graph SVG in `doc/` |
| Script | Description |
|---|---|
| `pnpm test` | Run unit tests (alias for `test:unit`) |
| `pnpm test:unit` | Run unit and integration tests with `node:test` |
| `pnpm test:coverage` | Run unit tests with c8 coverage |
| `pnpm test:e2e` | Run E2E tests with Cucumber (requires running Postgres) |
| Script | Description |
|---|---|
| `pnpm db:migrate` | Apply pending migrations |
| `pnpm db:create-migration` | Create a new migration file |
| `pnpm db:seed` | Run database seeds |
| `pnpm db:create-seed` | Create a new seed file |
| Script | Description |
|---|---|
| `pnpm generate:types` | Generate REST and GraphQL client types (requires running server + DB) |
pnpm create:env # create .env from .env.example (if not done already)
docker compose up # builds the app image and starts all services

docker build -t fastify-boilerplate .
docker run -p 3000:3000 --env-file .env -e HOST=0.0.0.0 fastify-boilerplate

- Multi-stage build – dependencies installed in an isolated stage for optimal layer caching
- Node Alpine – small footprint, native TypeScript execution (no build step)
- Non-root user – runs as an unprivileged `fastify` user (UID 1001)
- dumb-init – proper PID 1 signal forwarding for graceful shutdown
- HEALTHCHECK – built-in Docker health check against `/health` every 30 seconds
| Endpoint | Description |
|---|---|
| `GET /health` | Health check (`@fastify/under-pressure`) |
| `/api/...` | All REST routes are prefixed with `/api` (e.g. `/api/v1/users`) |
| `GET /api-docs` | Swagger UI (interactive API documentation) |
| `GET /api-docs/json` | OpenAPI 3.1.0 JSON spec |
| `POST /graphql` | GraphQL endpoint (Mercurius) |
| `GET /graphql` | GraphiQL IDE (development only) |
Diagram adapted from Domain-Driven Hexagon
Project-level:
- Adaptable complexity – the structure scales up or down by adding or removing layers to match the application's actual needs.
- Future-proofing – framework code and business logic are separated. Dependencies are well-established and minimal.
- Functional programming first – composition and factory functions over classes and inheritance.
- Microservices-ready – vertical slices, path aliases, and CQRS make it straightforward to extract a module into its own service later.
Code-level:
- Framework-agnostic core – business logic has no Fastify dependency. Fastify concerns stay in routes.
- Protocol-agnostic handlers – command/query handlers serve REST, GraphQL, gRPC, or CLI equally.
- Database-agnostic domain – SQL stays in repository files. Handlers interact with data through repository ports (interfaces).
- Inward dependency flow – outer layers depend on inner layers, never the reverse: Route → Handler → Domain → Repository.
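The inward flow can be sketched with a hypothetical find-user use case. All names below are invented for the example, not the boilerplate's actual code:

```typescript
// Domain: pure types and logic, no infrastructure imports.
type User = { id: string; email: string };
const normalizeEmail = (email: string): string => email.trim().toLowerCase();

// Port: the handler depends on this interface, never on SQL.
interface UserRepositoryPort {
  findByEmail(email: string): Promise<User | undefined>;
}

// Handler: orchestrates the use case through the port.
const makeFindUserHandler =
  (repo: UserRepositoryPort) =>
  async (query: { email: string }): Promise<User | undefined> =>
    repo.findByEmail(normalizeEmail(query.email));

// Adapter: an in-memory stand-in for the Postgres repository.
const inMemoryRepo: UserRepositoryPort = {
  findByEmail: async (email) =>
    [{ id: '1', email: 'ada@example.com' }].find((u) => u.email === email),
};

const findUser = makeFindUserHandler(inMemoryRepo);
```

Swapping `inMemoryRepo` for a Postgres-backed adapter changes nothing in the handler or domain code – only the adapter.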
Based on:
- Domain-Driven Design (DDD)
- Hexagonal (Ports and Adapters) Architecture
- Clean Architecture
- Onion Architecture
- SOLID Principles
- Vertical Slice Architecture
- Common Closure Principle (CCP)
Each module maps to a domain concept and lives in its own folder under src/modules/. Modules follow the vertical slice architecture – everything a feature needs is co-located.
Key rules:
- No direct imports between modules. Cross-module communication uses the CQRS buses (commands/queries for request-response, events for fire-and-forget).
- Extractable – any module can be pulled into a separate microservice. The CQRS handler boundary becomes the network boundary.
- If two modules are too "chatty", they probably belong together – merge them.
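The fire-and-forget side of cross-module communication can be sketched with a minimal in-memory event bus (illustrative only – the boilerplate's actual bus lives in src/shared/cqrs/):

```typescript
// Modules publish events instead of importing each other directly.
type Handler<T> = (payload: T) => void;

const makeEventBus = () => {
  const subscribers = new Map<string, Handler<unknown>[]>();
  return {
    subscribe<T>(event: string, handler: Handler<T>) {
      const list = subscribers.get(event) ?? [];
      list.push(handler as Handler<unknown>);
      subscribers.set(event, list);
    },
    publish<T>(event: string, payload: T) {
      for (const handler of subscribers.get(event) ?? []) handler(payload);
    },
  };
};

// A hypothetical "billing" module reacts to an event from the "users"
// module without importing anything from it.
const bus = makeEventBus();
const billingAccounts: string[] = [];
bus.subscribe<{ userId: string }>('users/created', ({ userId }) =>
  billingAccounts.push(userId),
);
bus.publish('users/created', { userId: 'u-42' });
```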
Each layer has a single responsibility:
Route – handles the HTTP/GraphQL/gRPC request. Validates input, formats the response. No business logic.

All REST routes are prefixed with `/api` (configured in `src/server/index.ts`).
Example: find-users.route.ts
Command/Query Handler β orchestrates the use case. Receives a command or query, calls domain services and repositories through ports, returns a result. One handler per use case (e.g. CreateUser, FindUsers).
Benefits of the CQRS bus pattern:
- Middlewares – cross-cutting concerns (auth, caching, tracing, rate limiting) plug in between route and handler. Middleware targeting is pattern-based (e.g. `users/*` for all user commands, `users/create` for a specific one). See middlewares.ts.
- Decoupling – modules communicate through the bus instead of direct imports, making future extraction to microservices trivial.
Example: find-users.handler.ts
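To make the pattern-based targeting concrete, here is a hypothetical sketch of how a pattern like `users/*` could be matched and applied – the real logic lives in middlewares.ts and may differ:

```typescript
// 'users/*' matches every action in the users module,
// 'users/create' matches exactly one action.
const matchesPattern = (pattern: string, action: string): boolean =>
  pattern.endsWith('/*')
    ? action.startsWith(pattern.slice(0, -1))
    : pattern === action;

type Middleware = (action: string, next: () => unknown) => unknown;

// Wrap a handler with every middleware whose pattern matches the action.
const applyMiddlewares = (
  action: string,
  middlewares: { pattern: string; mw: Middleware }[],
  handler: () => unknown,
) =>
  middlewares
    .filter(({ pattern }) => matchesPattern(pattern, action))
    .reduceRight<() => unknown>(
      (next, { mw }) => () => mw(action, next),
      handler,
    )();
```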
Domain Service – pure business logic. Computes properties, enforces invariants, composes entities. No infrastructure dependencies.
Example: user.domain.ts
Repository β data access. Converts between domain models and database rows. All SQL lives here. Implements a port (interface) defined alongside it.
Example: user.repository.ts
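The conversion half of that responsibility can be sketched as a pair of pure mapper functions (field names here are hypothetical):

```typescript
// Snake_case DB row vs. camelCase domain entity: the mapper keeps the
// conversion in one place, so SQL results never leak into the domain.
type UserRow = { id: string; first_name: string; created_at: string };
type UserEntity = { id: string; firstName: string; createdAt: Date };

const toEntity = (row: UserRow): UserEntity => ({
  id: row.id,
  firstName: row.first_name,
  createdAt: new Date(row.created_at),
});

const toRow = (user: UserEntity): UserRow => ({
  id: user.id,
  first_name: user.firstName,
  created_at: user.createdAt.toISOString(),
});
```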
Guideline: use as many or as few layers as needed. Not every feature requires a domain service – simpler CRUD operations can go straight from handler to repository.
.
├── db/
│   ├── migrations/ – SQL migration files (DBMate)
│   └── seeds/ – SQL seed files (DBMate)
├── tests/
│   ├── <feature>/
│   │   ├── <scenario>.feature – Gherkin E2E scenarios
│   │   └── <scenario>.k6.ts – k6 load test scripts
│   ├── shared/ – Shared step definitions
│   └── support/ – Test server, hooks, custom world
├── client/ – Generated REST + GraphQL client types (npm package)
├── scripts/ – Type generation scripts
└── src/
    ├── instrumentation.ts – OpenTelemetry setup (loaded via --import)
    ├── config/ – Environment validation (env-schema + TypeBox)
    ├── modules/
    │   └── <feature>/
    │       ├── commands/
    │       │   └── <command>/
    │       │       ├── command.handler.ts – Command handler
    │       │       ├── command.route.ts – REST route
    │       │       ├── command.resolver.ts – GraphQL resolver
    │       │       ├── command.graphql-schema.ts – GraphQL type definitions
    │       │       └── command.schema.ts – TypeBox request/response schemas
    │       ├── queries/
    │       │   └── <query>/
    │       │       ├── query.handler.ts – Query handler
    │       │       ├── query.route.ts – REST route
    │       │       ├── query.resolver.ts – GraphQL resolver
    │       │       ├── query.graphql-schema.ts – GraphQL type definitions
    │       │       └── query.schema.ts – TypeBox request/response schemas
    │       ├── database/
    │       │   ├── feature.repository.port.ts – Repository interface (port)
    │       │   └── feature.repository.ts – Repository implementation (adapter)
    │       ├── domain/
    │       │   ├── feature.domain.ts – Domain service
    │       │   ├── feature.errors.ts – Domain-specific errors
    │       │   └── feature.types.ts – Domain types
    │       ├── dtos/
    │       │   ├── feature.graphql-schema.ts – Shared GraphQL schema
    │       │   └── feature.response.dto.ts – Shared response DTO
    │       ├── index.ts – Action creators, DI declarations
    │       └── feature.mapper.ts – Entity ↔ DB model ↔ DTO mapper
    ├── server/
    │   ├── index.ts – Fastify instance setup
    │   └── plugins/ – Fastify plugins (swagger, CORS, error handler, CQRS, etc.)
    └── shared/
        ├── cqrs/ – Command/Query/Event bus, middlewares
        ├── db/ – Postgres connection, transaction helpers, repository base
        ├── exceptions/ – Base exception classes
        └── utils/ – Cross-cutting utilities
The project ships with a vendor-agnostic OpenTelemetry setup in src/instrumentation.ts. It uses the standard OTLP protocol, so it works with any backend (Grafana, Datadog, Honeycomb, Jaeger, etc.) without code changes.
How it works:
- HTTP + Fastify – the SDK registers the ESM loader hook and initialises with `instrumentation-http` and `@fastify/otel`. Every request gets a trace span with route, method, status code, and lifecycle hooks.
- CQRS – a tracing middleware in `src/shared/cqrs/otel-middleware.ts` wraps every command, query, and event in a span. Spans include the action type, bus kind, and correlation ID.
- Disabled by default (`OTEL_SDK_DISABLED=true`). When disabled, `@opentelemetry/api` returns noop implementations – zero overhead.
To enable, set the standard OTel environment variables in your .env:
OTEL_SDK_DISABLED=false
OTEL_SERVICE_NAME=fastify-boilerplate
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318 # your OTLP collector

All configuration uses standard OTel environment variables – no vendor lock-in.
Run with node:test. Test files live next to their source files as *.spec.ts.
pnpm test:unit # run tests
pnpm test:coverage # run with c8 coverage

Written in Gherkin and executed with Cucumber.js. Scenarios live in tests/<feature>/<scenario>.feature, step definitions in tests/<feature>/<feature>.steps.ts.
# Requires a running Postgres with migrations applied
pnpm test:e2e

The E2E test server is created via buildApp() (in tests/support/server.ts) – it boots a full Fastify instance without binding to a port, so tests run fast and don't conflict with a running dev server.
k6 scripts live alongside their feature's E2E tests.
Example: create-user.k6.ts
The release pipeline automatically generates REST (OpenAPI) and GraphQL client types and publishes them as the @marcoturi/fastify-boilerplate npm package. The version is kept in sync with the backend via semantic-release.
pnpm add -D @marcoturi/fastify-boilerplate

// REST types (generated by openapi-typescript)
import type { paths, components } from '@marcoturi/fastify-boilerplate/rest';
// GraphQL types (generated by graphql-codegen)
import type { User, Query, Mutation } from '@marcoturi/fastify-boilerplate/graphql';

To regenerate the types against a local server (requires running Postgres with migrations applied):
pnpm generate:types

This starts the server, fetches the OpenAPI and GraphQL schemas, writes the type files to client/, and stops the server.
The project uses GitHub Actions with two workflows:
release.yml – runs on every push to main:
- Install dependencies (`pnpm install --frozen-lockfile`)
- Code quality checks (`pnpm check`)
- Unit tests (`pnpm test`)
- E2E tests (`pnpm test:e2e`) against a Postgres service container
- Generate client types (`pnpm generate:types`)
- Publish release via semantic-release (changelog, GitHub release, npm client package)
codeql-analysis.yml – runs on pushes and PRs to main:
- GitHub CodeQL security analysis for JavaScript/TypeScript
This project ships with an AGENTS.md file β a comprehensive guide for AI coding assistants. It documents the architecture, CQRS patterns, coding conventions, and common pitfalls so that tools like Cursor, Claude Code, and GitHub Copilot can generate code that follows the project's established patterns.
AI assistants automatically pick up AGENTS.md and apply the conventions without manual prompting.
This project uses Biome as a single tool for linting, formatting, and import sorting:
- One tool, zero plugins – no `@typescript-eslint/parser`, `eslint-config-prettier`, `eslint-plugin-import`, or other ecosystem packages to keep in sync.
- Fast – written in Rust, orders of magnitude faster than ESLint + Prettier. Noticeable in CI and pre-commit hooks.
- Stable – no more breakage from mismatched plugin versions or peer dependency conflicts across the ESLint ecosystem.
- Mature – covers the vast majority of rules that ESLint + typescript-eslint provide, with a growing community and clear roadmap.
- Domain-Driven Hexagon – the primary inspiration for this project's architecture (adapted toward functional programming)
- react-redux boilerplate – companion frontend boilerplate by the same author
Contributions are welcome! This project uses Conventional Commits enforced by Commitlint and Husky.
- Fork and clone the repo
- Create a branch: `git checkout -b your-feature`
- Make your changes
- Run `pnpm check` to validate lint, format, and types
- Run `pnpm test` (and `pnpm test:e2e` if your change touches API behavior)
- Commit using Conventional Commits format (e.g. `feat: add user roles`)
- Open a Pull Request