adolfoarmas/queuetasks
Notification Queue Tasks API

A FastAPI application for managing users and sending asynchronous notifications using Celery and Redis. This system demonstrates a production-ready architecture with task queuing, database persistence, and comprehensive API documentation.

Features

  • User Management: Create, retrieve, and list users with unique email validation
  • Notifications System: Send individual or bulk notifications to users
  • Async Task Processing: Celery workers handle notification distribution asynchronously
  • Database Persistence: SQLAlchemy ORM with SQLite/PostgreSQL support
  • API Documentation: Interactive Swagger UI and ReDoc documentation
  • Load Testing: Locust integration for performance testing
  • Docker Support: Complete Docker Compose setup for local development

Architecture

┌─────────────────────────────────────────────────────────┐
│                  FastAPI Server (8000)                  │
│  • User endpoints                                       │
│  • Notification endpoints                               │
│  • Async task dispatch                                  │
└───────────────────────────┬─────────────────────────────┘
                            │
          ┌─────────────────┼─────────────────┐
          │                 │                 │
      ┌───▼────┐       ┌────▼─────┐       ┌───▼─────┐
      │   DB   │       │  Redis   │       │ Celery  │
      │ SQLite │       │  Broker  │       │ Workers │
      │  /PG   │       │  (6379)  │       │         │
      └────────┘       └──────────┘       └─────────┘
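The flow above (the API enqueues, a worker consumes) can be simulated with a standard-library queue. This is purely illustrative; in the real app Celery tasks travel through the Redis broker rather than an in-process queue:

```python
import queue
import threading

# Illustrative stand-ins: Celery + Redis replace these in the real app.
task_queue: "queue.Queue[dict]" = queue.Queue()
delivered = []

def worker():
    """Consume notification tasks until a None sentinel arrives."""
    while True:
        task = task_queue.get()
        if task is None:
            break
        delivered.append(f"to user {task['user_id']}: {task['title']}")
        task_queue.task_done()

def notify_all(user_ids, title, message):
    # The API handler only enqueues and returns immediately; it never
    # blocks on delivery.
    for uid in user_ids:
        task_queue.put({"user_id": uid, "title": title, "message": message})

t = threading.Thread(target=worker)
t.start()
notify_all([1, 2, 3], "Hello", "Welcome to our platform")
task_queue.put(None)  # sentinel to stop the worker
t.join()
```

The key property being modeled is that `notify_all` returns as soon as the tasks are enqueued, while delivery happens on the worker's own schedule.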

Prerequisites

  • Docker & Docker Compose
  • Python 3.11+ (for local development)
  • Redis (handled by Docker)
  • PostgreSQL (optional, handled by Docker)

Installation & Setup

Using Docker Compose (Recommended)

  1. Clone and navigate to the project:

     cd queuetasks

  2. Start all services:

     docker-compose up --build

This will start:

  • API: http://localhost:8000
  • PostgreSQL Database: localhost:5432
  • Redis: localhost:6379
  • Celery Worker: Processing background tasks

Local Development Setup

  1. Create a virtual environment (Windows PowerShell):

     python -m venv env
     .\env\Scripts\Activate.ps1

  2. Install dependencies:

     pip install -r requirements.txt

  3. Set environment variables:

     $env:DATABASE_URL="sqlite:///./app.db"
     $env:REDIS_URL="redis://localhost:6379/0"

  4. Create the database tables:

     python -c "from app.database import Base, engine; from app import models; Base.metadata.create_all(bind=engine)"

  5. Start Redis (required):

     redis-server

  6. Start the Celery worker (in another terminal):

     python -m celery -A app.worker:celery_app worker --loglevel=info

  7. Start the API server (in another terminal):

     uvicorn app.main:app --reload --host 0.0.0.0 --port 8000

API Documentation

Once the server is running, the following are available:

  • Swagger UI: http://localhost:8000/docs
  • ReDoc: http://localhost:8000/redoc

Example Usage

Create a user:

curl -X POST "http://localhost:8000/users/" \
  -H "Content-Type: application/json" \
  -d '{"email":"juan@example.com","name":"Juan Arango"}'

Notify all users:

curl -X POST "http://localhost:8000/notifications/notify_all/?title=Hello&message=Welcome%20to%20our%20platform"

Get user notifications:

curl "http://localhost:8000/users/1/notifications"

Mark notification as read:

curl -X POST "http://localhost:8000/notifications/1/read"

Load Testing with Locust

Run performance tests against the API:

locust -f locustfile.py --host=http://localhost:8000

This will:

  • Create users with unique emails (weighted 3:1 relative to bulk notifications)
  • Trigger bulk notifications
  • Generate realistic load patterns

Access the Locust UI at http://localhost:8089

Project Structure

queuetasks/
├── app/
│   ├── __init__.py
│   ├── main.py              # Endpoints
│   ├── models.py            # SQLAlchemy models
│   ├── schemas.py           # Pydantic schemas
│   ├── crud.py              # Database ops
│   ├── database.py          # Database conf
│   ├── worker.py            # Celery app and tasks
│   └── tests.py             # Unit tests
├── docker-compose.yml       # Services config
├── Dockerfile               # API container image
├── requirements.txt         # Python dependencies
├── locustfile.py            # Locust load tests
└── README.md

Environment Variables

Variable       Default                    Description
─────────────  ─────────────────────────  ──────────────────────────
DATABASE_URL   sqlite:///./app.db         Database connection string
REDIS_URL      redis://localhost:6379/0   Redis broker URL
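A minimal way these variables might be read at startup (a sketch; the repository's `app/database.py` may differ):

```python
import os

# Fall back to the defaults from the table above when the variables
# are not set in the environment.
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./app.db")
REDIS_URL = os.getenv("REDIS_URL", "redis://localhost:6379/0")
```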

Testing

Run the test suite:

In Docker:

docker-compose run --rm api pytest app/tests.py -v

Or locally (with dependencies installed):

pytest app/tests.py -v

Performance Considerations

  1. Database: SQLite for development, PostgreSQL recommended for production
  2. Task Queue: Redis provides high-performance message brokering
  3. Async Processing: Celery workers handle notifications without blocking the API
  4. Scalability: Worker processes can be scaled horizontally
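Horizontal worker scaling (point 4) can be done with Compose's `--scale` flag, assuming the worker service is named `worker` in `docker-compose.yml` (the service name is an assumption):

```shell
docker-compose up --scale worker=3
```

Each worker process pulls tasks from the same Redis broker, so notifications are distributed across all running workers without further configuration.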

Dependencies

  • FastAPI: Modern web framework for building APIs
  • SQLAlchemy: Python ORM for database operations
  • Celery: Distributed task queue
  • Redis: Message broker and result backend
  • Pydantic: Data validation using Python type annotations
  • Uvicorn: ASGI server
  • Pytest: Testing framework
  • Locust: Load testing tool

Security Notes

  • Email validation is enforced on user creation
  • Duplicate emails are rejected with an HTTP 400 status code
  • User IDs are validated before notification creation
  • Database credentials should be managed via environment variables
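The email rules above can be illustrated with a minimal in-memory check. This is a sketch only: the real app performs the duplicate lookup in `crud.py` against the database and maps the failures to HTTP error responses:

```python
import re

# Deliberately simple pattern, for illustration only; the real app
# relies on Pydantic's email validation.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
existing_emails = {"juan@example.com"}

def create_user(email: str) -> str:
    if not EMAIL_RE.match(email):
        raise ValueError("invalid email")             # validation error in the API
    if email in existing_emails:
        raise ValueError("email already registered")  # -> HTTP 400 in the API
    existing_emails.add(email)
    return email
```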
