A FastAPI application for managing users and sending asynchronous notifications using Celery and Redis. This system demonstrates a production-ready architecture with task queuing, database persistence, and comprehensive API documentation.
- User Management: Create, retrieve, and list users with unique email validation
- Notifications System: Send individual or bulk notifications to users
- Async Task Processing: Celery workers handle notification distribution asynchronously
- Database Persistence: SQLAlchemy ORM with SQLite/PostgreSQL support
- API Documentation: Interactive Swagger UI and ReDoc documentation
- Load Testing: Locust integration for performance testing
- Docker Support: Complete Docker Compose setup for local development
```
┌─────────────────────────────────────────┐
│          FastAPI Server (8000)          │
│  • User endpoints                       │
│  • Notification endpoints               │
│  • Async task dispatch                  │
└───────────────────┬─────────────────────┘
                    │
         ┌──────────┼──────────┐
         │          │          │
     ┌───▼───┐ ┌────▼─────┐ ┌──▼──────┐
     │  DB   │ │  Redis   │ │ Celery  │
     │SQLite │ │  Broker  │ │ Workers │
     │ /PG   │ │  (6379)  │ │         │
     └───────┘ └──────────┘ └─────────┘
```
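The dispatch pattern above can be sketched with only the standard library, using an in-process queue to stand in for the Redis broker and a drain loop to stand in for a Celery worker. All names here are illustrative, not the project's real task names:

```python
import json
import queue

broker = queue.Queue()  # plays the role of Redis (port 6379)

def dispatch_notify_all(title: str, message: str, user_ids) -> int:
    # API side: enqueue one task message per user and return immediately,
    # so the HTTP request never blocks on delivery.
    for uid in user_ids:
        broker.put(json.dumps({"user_id": uid, "title": title, "message": message}))
    return len(user_ids)

def worker_drain() -> list:
    # Worker side: pop and "deliver" tasks until the queue is empty.
    delivered = []
    while not broker.empty():
        task = json.loads(broker.get())
        delivered.append(task["user_id"])
    return delivered
```

In the real system the API serializes tasks to Redis and the separately started Celery worker consumes them, which is what lets the workers scale horizontally.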
- Docker & Docker Compose
- Python 3.11+ (for local development)
- Redis (handled by Docker)
- PostgreSQL (optional, handled by Docker)
- Clone and navigate to the project:
```shell
cd queuetasks
```

- Start all services:

```shell
docker-compose up --build
```

This will start:
- API: http://localhost:8000
- PostgreSQL Database: localhost:5432
- Redis: localhost:6379
- Celery Worker: Processing background tasks
- Create virtual environment (Windows PowerShell):
```powershell
python -m venv env
.\env\Scripts\Activate.ps1
```

- Install dependencies:

```powershell
pip install -r requirements.txt
```

- Set environment variables:

```powershell
$env:DATABASE_URL="sqlite:///./app.db"
$env:REDIS_URL="redis://localhost:6379/0"
```

- Run migrations (create tables):

```powershell
python -c "from app.database import Base, engine; from app import models; Base.metadata.create_all(bind=engine)"
```

- Start Redis (required):

```powershell
redis-server
```

- Start Celery worker (in another terminal):

```powershell
python -m celery -A app.worker:celery_app worker --loglevel=info
```

- Start API server (in another terminal):

```powershell
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
```

Once the server is running, access:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
- OpenAPI Schema: http://localhost:8000/openapi.json
Create a user:
```shell
curl -X POST "http://localhost:8000/users/" \
  -H "Content-Type: application/json" \
  -d '{"email":"juan@example.com","name":"Juan Arango"}'
```

Notify all users:

```shell
curl -X POST "http://localhost:8000/notifications/notify_all/?title=Hello&message=Welcome%20to%20our%20platform"
```

Get user notifications:

```shell
curl "http://localhost:8000/users/1/notifications"
```

Mark notification as read:

```shell
curl -X POST "http://localhost:8000/notifications/1/read"
```

Run performance tests against the API:
```shell
locust -f locustfile.py --host=http://localhost:8000
```

This will:
- Create users with unique emails (3:1 ratio)
- Trigger bulk notifications (1:1 ratio)
- Generate realistic load patterns
Access the Locust UI at http://localhost:8089
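The curl examples above can also be driven from Python. Below is a stdlib-only sketch for the user-creation call; the helper names are my own, and it assumes the API from this README is running on localhost:8000:

```python
import json
import urllib.request

BASE = "http://localhost:8000"

def build_create_user_request(email: str, name: str) -> urllib.request.Request:
    # Builds the same request the first curl example sends.
    payload = json.dumps({"email": email, "name": name}).encode()
    return urllib.request.Request(
        f"{BASE}/users/",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def create_user(email: str, name: str) -> dict:
    # Sends the request and decodes the JSON response.
    with urllib.request.urlopen(build_create_user_request(email, name)) as resp:
        return json.load(resp)
```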
```
queuetasks/
├── app/
│   ├── __init__.py
│   ├── main.py          # FastAPI endpoints
│   ├── models.py        # SQLAlchemy models
│   ├── schemas.py       # Pydantic schemas
│   ├── crud.py          # Database operations
│   ├── database.py      # Database configuration
│   ├── worker.py        # Celery worker tasks
│   └── tests.py         # Unit tests
├── docker-compose.yml   # Services configuration
├── Dockerfile           # API container image
├── requirements.txt     # Python dependencies
├── locustfile.py        # Locust load tests
└── README.md
```
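The two core records in models.py can be outlined with stdlib dataclasses. The project's real models are SQLAlchemy classes, and the exact field names below are assumptions, not the real schema:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class User:
    id: int
    email: str  # unique; duplicates are rejected with HTTP 400
    name: str

@dataclass
class Notification:
    id: int
    user_id: int                # references User.id, validated before creation
    title: str
    message: str
    read: bool = False          # flipped by POST /notifications/{id}/read
    created_at: datetime = field(default_factory=datetime.utcnow)
```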
| Variable | Default | Description |
|---|---|---|
| `DATABASE_URL` | `sqlite:///./app.db` | Database connection string |
| `REDIS_URL` | `redis://localhost:6379/0` | Redis broker URL |
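A sketch of how these variables are presumably consumed (the real code lives in app/database.py and app/worker.py, and may differ in detail):

```python
import os

# Defaults match the table above, so the app runs with no configuration.
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./app.db")
REDIS_URL = os.getenv("REDIS_URL", "redis://localhost:6379/0")

# SQLAlchemy's engine and Celery's broker/backend would both be built from
# these strings, e.g. create_engine(DATABASE_URL) and
# Celery(broker=REDIS_URL, backend=REDIS_URL).
```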
Run the test suite:
```shell
docker-compose run --rm api pytest app/tests.py -v
```

Or run it locally:

```shell
pytest app/tests.py -v
```

- Database: SQLite for development, PostgreSQL recommended for production
- Task Queue: Redis provides high-performance message brokering
- Async Processing: Celery workers handle notifications without blocking the API
- Scalability: Worker processes can be scaled horizontally
- FastAPI: Modern web framework for building APIs
- SQLAlchemy: Python ORM for database operations
- Celery: Distributed task queue
- Redis: Message broker and result backend
- Pydantic: Data validation using Python type annotations
- Uvicorn: ASGI server
- Pytest: Testing framework
- Locust: Load testing tool
- Email validation is enforced on user creation
- Duplicate emails are rejected with an HTTP 400 status code
- User IDs are validated before notification creation
- Database credentials should be managed via environment variables
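The duplicate-email rule above boils down to a uniqueness check before insertion. A hypothetical stand-in for the crud.py logic (the real check runs against the database, and the API maps the failure to HTTP 400):

```python
def create_user(db: dict, email: str, name: str) -> dict:
    # Enforce the unique-email rule before persisting.
    if email in db:
        # The endpoint surfaces this as a 400 "email already registered" error.
        raise ValueError("email already registered")
    db[email] = {"name": name, "email": email}
    return db[email]
```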