FastAPI wrapper for Map Action Model deployment.
Model_deploy is a versatile open-source project designed for seamless deployment and scalable management of machine learning models. Leveraging FastAPI, Celery, and Transformers, it offers robust features such as asynchronous task management, context building, and image classification. With Dockerized environments, CI/CD pipelines, and PostgreSQL integration, Model_deploy ensures efficient ML deployment with enhanced maintainability and scalability. The project's value proposition lies in simplifying ML model deployment while enabling reliable, performance-driven AI applications.

Developer Documentation
The system uses FastAPI endpoints to make predictions using computer vision models and generate summaries using a Large Language Model (LLM).
- The user submits an image through the FastAPI endpoint.
- The image is processed by a Convolutional Neural Network (CNN) for classification.
- Based on the CNN output, relevant context is retrieved from the database.
- The LLM generates a summary using the image classification and retrieved context.
- The results are returned to the user and stored in the database.
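The flow above can be sketched in plain Python. The function names, the class label, and the stub bodies below are illustrative stand-ins for the project's CNN, database, and LLM services, not the actual code:

```python
# Illustrative sketch of the prediction pipeline described above.
# The CNN, database lookup, and LLM calls are stubbed out; in the real
# project these live in separate services (app/services/cnn, app/services/llm).

def classify_image(image_bytes: bytes) -> str:
    """Stand-in for the CNN classifier: returns a predicted issue label."""
    return "illegal_dumping"  # hypothetical class label

def fetch_context(label: str) -> str:
    """Stand-in for the database lookup keyed on the CNN output."""
    contexts = {"illegal_dumping": "Waste deposited outside designated sites."}
    return contexts.get(label, "No context available.")

def summarize(label: str, context: str) -> str:
    """Stand-in for the LLM summary generation."""
    return f"Detected issue: {label}. Context: {context}"

def run_pipeline(image_bytes: bytes) -> dict:
    label = classify_image(image_bytes)
    context = fetch_context(label)
    summary = summarize(label, context)
    # In the real system the result is both returned to the user
    # and stored in the database.
    return {"prediction": label, "summary": summary}

result = run_pipeline(b"fake-image-bytes")
print(result["summary"])
```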
|  | Feature | Description |
|---|---|---|
| ⚙️ | Architecture | Utilizes FastAPI for API creation, Celery for asynchronous tasks, and PostgreSQL for database operations. |
| 🖼️ | Image Processing | Implements CNN-based image classification for environmental issue detection. |
| 🧠 | LLM Integration | Incorporates Large Language Models for context-aware summarization and response generation. |
| 🔗 | Integrations | Connects with OpenAI models, uses Redis for Celery task queuing, and implements GitHub Actions for CI/CD. |
| 🧩 | Modularity | Features a modular codebase with separate services for CNN processing, LLM operations, and API handling. |
| 🛡️ | Security | Implements secure practices for handling sensitive information and maintains Docker secrets. |
| 📦 | Dependencies | Relies on key libraries including FastAPI, Celery, Transformers, PyTorch, and PostgreSQL for robust ML model deployment. |
```
Model_deploy/
├── .github
│   └── workflows
├── app
│   ├── apis
│   ├── models
│   └── services
│       ├── analysis
│       ├── celery
│       ├── cnn
│       ├── llm
│       └── websockets
├── config
│   └── redis
├── cv_model
├── docs
├── documents
├── test
│   ├── test_apis
│   ├── test_celery
│   ├── test_cnn
│   ├── test_database
│   ├── test_image_model
│   └── test_llm
├── vector_index
├── Dockerfile
├── Dockerfile.CI
├── Dockerfile.Celery
├── LICENSE
├── README.md
├── _cd_pipeline.yaml
├── _ci_pipeline.yml
├── mkdocs.yml
├── pytest.ini
└── requirements.txt
```

This structure highlights the main components of the Model_deploy project:

- app/: Contains the core application code
  - apis/: API endpoints
  - models/: Data models
  - services/: Various services including CNN, LLM, and Celery tasks
- config/: Configuration files
- cv_model/: Computer Vision model files
- docs/: Documentation files
- documents/: Document storage for LLM processing
- test/: Test suites for different components
- vector_index/: Vector storage for LLM
- Docker-related files for containerization
- CI/CD pipeline configurations
- Documentation and licensing files
| File | Summary |
|---|---|
| requirements.txt | Lists Python package dependencies for seamless project setup and reproducibility. Key libraries include fastapi, celery, transformers, and uvicorn to support ML deployment. Enhances project scalability and maintainability by managing package versions. |
| Dockerfile.Celery | Builds a Docker image for Celery worker, leveraging Python 3.10.13, to manage asynchronous tasks in the Model_deploy project. Inherits project dependencies from requirements.txt while ensuring a streamlined environment setup for seamless task execution. |
| Dockerfile | Enables deploying a Python application using Uvicorn server, handling data processing requests. Utilizes Docker for portability, installs dependencies, and configures the execution environment. Dynamically serves the app on port 8001 in the container. |
| Dockerfile.CI | Builds Python environment, installs project dependencies, and runs test coverage using pytest in the CI pipeline for Model_deploy. |
| _cd_pipeline.yaml | Sets up Docker services for a FastAPI app, Redis, and Celery workers with networking configurations in a micro-services environment. Enables communication between services for seamless deployment and scalability. |
| _ci_pipeline.yml | Automates creation and configuration of a CI service within the Model_deploy repository. Orchestrates building a Docker container for testing purposes based on the specified Dockerfile.CI. Integrates environment variables for seamless deployment. |
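A minimal sketch of what a Dockerfile like the one summarized above might look like. The slim base image and the `app.main:app` module path are assumptions for illustration, not taken from the repository:

```dockerfile
# Hypothetical sketch, not the project's actual Dockerfile.
FROM python:3.10.13-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Serve the FastAPI app with Uvicorn on port 8001, as described above.
# "app.main:app" is an assumed module path.
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8001"]
```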
app
| File | Summary |
|---|---|
| main.py | Initializes the FastAPI app with CORS middleware, connects to the database on app startup, gracefully disconnects on shutdown, and includes the main_router APIs under the /api1 prefix. |
| database.py | Establishes a connection to a PostgreSQL database within the Model_deploy repo's app module. Leveraging the databases library, it initializes a database instance with a predefined URL for subsequent data operations across the ML deployment system. |
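The `databases` library consumes a single connection URL. A sketch of how such a URL might be assembled; the environment variable names and defaults here are illustrative assumptions, not the project's actual configuration:

```python
import os

# Hypothetical sketch of building the PostgreSQL URL that the `databases`
# library consumes; variable names and defaults are illustrative only.
def build_database_url() -> str:
    user = os.getenv("POSTGRES_USER", "postgres")
    password = os.getenv("POSTGRES_PASSWORD", "postgres")
    host = os.getenv("POSTGRES_HOST", "localhost")
    port = os.getenv("POSTGRES_PORT", "5432")
    name = os.getenv("POSTGRES_DB", "mapaction")
    return f"postgresql://{user}:{password}@{host}:{port}/{name}"

print(build_database_url())
# The URL would then be passed along the lines of:
#   database = databases.Database(build_database_url())
```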
app.services.llm
| File | Summary |
|---|---|
| llm.py | Contains the core LLM logic: retrieving relevant context and generating context-aware summaries for classified incidents. |
app.services.cnn
| File | Summary |
|---|---|
| cnn_preprocess.py | Implements image preprocessing for the CNN model, including resizing and normalization. |
| cnn.py | Contains the main CNN prediction logic, using a pre-trained VGG16 model to classify environmental issues in images. |
| cnn_model.py | Defines the CNN model architecture, adapting VGG16 for the specific classification task. |
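The final step of prediction maps the model's raw output scores to a class label. A dependency-free sketch of that step; the class names and the softmax-over-logits formulation are illustrative (the real code runs a PyTorch VGG16 model):

```python
import math

# Hypothetical class labels; the real model's classes may differ.
CLASSES = ["illegal_dumping", "open_drainage", "deforestation"]

def softmax(logits: list) -> list:
    """Numerically stable softmax over raw scores."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_label(logits: list) -> tuple:
    """Return the most likely class and its probability."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return CLASSES[best], probs[best]

label, prob = predict_label([2.0, 0.5, 0.1])
print(label, round(prob, 3))
```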
app.apis
| File | Summary |
|---|---|
| main_router.py | Handles image prediction, contextualization, and data insertion. Utilizes FastAPI, requests, and Celery for async tasks. Fetches images, processes predictions, and stores results in the Mapapi_prediction table. Resilient to exceptions with proper error handling. |
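Conceptually, the router hands long-running prediction work to Celery rather than blocking the request. A stand-in sketch of that enqueue-and-collect pattern using a plain thread pool; Celery's actual API for this is `task.delay(...)` and `AsyncResult`, and the function below is a stub, not the project's task:

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in for a Celery worker pool; illustrates the dispatch-and-collect
# pattern rather than Celery itself.
executor = ThreadPoolExecutor(max_workers=2)

def process_prediction(image_url: str) -> dict:
    # Stand-in for fetching the image and running the CNN + LLM pipeline.
    return {"image_url": image_url, "status": "done"}

# The endpoint dispatches the task and can return immediately...
future = executor.submit(process_prediction, "https://example.com/img.jpg")

# ...and the result is collected later (Celery would store it via Redis).
result = future.result()
print(result["status"])
```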
app.models
| File | Summary |
|---|---|
| image_model.py | Defines ImageModel with image_name, sensitive_structures, and incident_id attributes. |
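A sketch of what such a model might look like, shown as a plain dataclass to stay self-contained (the project presumably uses a Pydantic model with FastAPI); the field types and defaults are assumptions:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the ImageModel described above; field types and
# defaults are assumptions, and the real project likely uses Pydantic.
@dataclass
class ImageModel:
    image_name: str
    sensitive_structures: list = field(default_factory=list)
    incident_id: str = ""

m = ImageModel(image_name="flood.jpg",
               sensitive_structures=["school"],
               incident_id="42")
print(m.image_name)
```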
test.apis
| File | Summary |
|---|---|
| test_main_router.py | Verifies FastAPI endpoint functionality by simulating HTTP requests to ensure the Index route returns a 200 status code and correct JSON response. |
.github.workflows
| File | Summary |
|---|---|
| ci_cd.yml | Handles continuous integration and deployment via GitHub Actions. Runs test suites to validate code quality, builds and pushes Docker images, and deploys the ML model API on master branch pushes. Includes secret handling for Docker Hub credentials. |
System Requirements:
- Docker: Ensure Docker is installed and running on your system.
- Clone the Model_deploy repository:
$ git clone https://github.com/223MapAction/Model_deploy.git
- Change to the project directory:
$ cd Model_deploy
- Build and start the Docker containers:
$ docker-compose -f _cd_pipeline.yaml up --build
Once the containers are up and running, access the application at:
http://localhost:8001
Run the test suite using the command below:
$ docker-compose -f _ci_pipeline.yml up
Contributions are welcome! Here are several ways you can contribute:
- Report Issues: Submit bugs found or log feature requests for the Model_deploy project.
- Submit Pull Requests: Review open PRs, and submit your own PRs.
- Join the Discussions: Share your insights, provide feedback, or ask questions.
See our Contribution Guidelines for details on how to contribute.
This project is licensed under the GNU Affero General Public License v3.0. For more details, refer to the LICENSE file.
See our Code of Conduct for details on expected behavior in our community.
This project aims to comply with the Digital Public Goods Standard. Below is a summary of our current compliance status:
- Relevance to Sustainable Development Goals: The project aligns with SDGs by promoting sustainable urban and environmental management through technology and data-driven insights.
- Use of Approved Open License: Compliant (AGPL-3.0 license)
- Clear Ownership: The project is owned and maintained by Map Action, ensuring accountability and continuous development.
- Platform Independence: The project is designed to be platform-independent, running on various operating systems and environments with Docker support.
- Documentation: Largely compliant (see README and developer documentation)
- Mechanism for Extracting Data: The project includes APIs and data processing pipelines to extract and analyze environmental data efficiently.
- Adherence to Privacy and Applicable Laws: The project follows best practices for data privacy and complies with relevant legal requirements to protect user data.
- Adherence to Standards & Best Practices: The project adheres to industry standards and best practices in software development, ensuring high-quality and reliable solutions.
- Do No Harm Assessment: The project undergoes regular assessments to ensure it does not negatively impact users or the environment.
For a detailed assessment and ongoing compliance efforts, please see our Digital Public Goods Standard Assessment.

