
Model_deploy

FastAPI wrapper for Map Action Model deployment.


Developed with Python, FastAPI, Celery, Docker, Streamlit, Pydantic, YAML, OpenAI, Pytest, and GitHub Actions.



Overview

Model_deploy is a versatile open-source project for deploying and managing machine learning models at scale. Built on FastAPI, Celery, and Transformers, it provides asynchronous task management, context building, and image classification. With Dockerized environments, CI/CD pipelines, and PostgreSQL integration, it keeps ML deployment efficient, maintainable, and scalable, simplifying the deployment process while enabling reliable, performance-driven AI applications. For more detail, see the Developer Documentation.


System Architecture

The system uses FastAPI endpoints to make predictions using computer vision models and generate summaries using a Large Language Model (LLM).

System Architecture Diagram


How it works

  1. The user submits an image through the FastAPI endpoint.
  2. The image is processed by a Convolutional Neural Network (CNN) for classification.
  3. Based on the CNN output, relevant context is retrieved from the database.
  4. The LLM generates a summary using the image classification and retrieved context.
  5. The results are returned to the user and stored in the database.

Features

| Feature | Description |
| --- | --- |
| ⚙️ Architecture | Utilizes FastAPI for API creation, Celery for asynchronous tasks, and PostgreSQL for database operations. |
| 🖼️ Image Processing | Implements CNN-based image classification for environmental issue detection. |
| 🧠 LLM Integration | Incorporates Large Language Models for context-aware summarization and response generation. |
| 🔌 Integrations | Connects with OpenAI models, uses Redis for Celery task queuing, and implements GitHub Actions for CI/CD. |
| 🧩 Modularity | Features a modular codebase with separate services for CNN processing, LLM operations, and API handling. |
| 🛡️ Security | Implements secure practices for handling sensitive information and maintains Docker secrets. |
| 📦 Dependencies | Relies on key libraries including FastAPI, Celery, Transformers, PyTorch, and PostgreSQL for robust ML model deployment. |

Repository Structure

└── Model_deploy/
    ├── .github
    │   └── workflows
    ├── app
    │   ├── apis
    │   ├── models
    │   └── services
    │       ├── analysis
    │       ├── celery
    │       ├── cnn
    │       ├── llm
    │       └── websockets
    ├── config
    │   └── redis
    ├── cv_model
    ├── docs
    ├── documents
    ├── test
    │   ├── test_apis
    │   ├── test_celery
    │   ├── test_cnn
    │   ├── test_database
    │   ├── test_image_model
    │   └── test_llm
    ├── vector_index
    ├── Dockerfile
    ├── Dockerfile.CI
    ├── Dockerfile.Celery
    ├── LICENSE
    ├── README.md
    ├── _cd_pipeline.yaml
    ├── _ci_pipeline.yml
    ├── mkdocs.yml
    ├── pytest.ini
    └── requirements.txt

This structure highlights the main components of the Model_deploy project:

  • app/: Contains the core application code
    • apis/: API endpoints
    • models/: Data models
    • services/: Various services including CNN, LLM, and Celery tasks
  • config/: Configuration files
  • cv_model/: Computer Vision model files
  • docs/: Documentation files
  • documents/: Document storage for LLM processing
  • test/: Test suites for different components
  • vector_index/: Vector storage for LLM
  • Docker-related files for containerization
  • CI/CD pipeline configurations
  • Documentation and licensing files

Modules

Root directory

| File | Summary |
| --- | --- |
| requirements.txt | Lists the Python package dependencies for reproducible project setup. Key libraries include fastapi, celery, transformers, and uvicorn. Managing pinned versions keeps the project scalable and maintainable. |
| Dockerfile.Celery | Builds the Docker image for the Celery worker on Python 3.10.13 to run asynchronous tasks. Installs project dependencies from requirements.txt for a streamlined execution environment. |
| Dockerfile | Builds the main application image: installs dependencies, configures the execution environment, and serves the app with Uvicorn on port 8001 inside the container. |
| Dockerfile.CI | Builds the Python test environment, installs project dependencies, and runs the test suite with coverage via pytest in the CI pipeline. |
| _cd_pipeline.yaml | Docker Compose definition for the FastAPI app, Redis, and Celery workers, with the networking configuration the services need to communicate. |
| _ci_pipeline.yml | Docker Compose definition for the CI service: builds the test container from Dockerfile.CI and wires in the required environment variables. |
app

| File | Summary |
| --- | --- |
| main.py | Initializes the FastAPI app with CORS middleware, connects to the database on startup, gracefully disconnects on shutdown, and mounts the main_router APIs under the /api1 prefix. |
| database.py | Establishes the connection to the PostgreSQL database. Using the databases library, it initializes a Database instance with a predefined URL for data operations across the deployment. |
app.services.llm

| File | Summary |
| --- | --- |
| llm.py | Implements the core LLM functionality, including context building and summary generation. |
app.services.cnn

| File | Summary |
| --- | --- |
| cnn_preprocess.py | Implements image preprocessing for the CNN model, including resizing and normalization. |
| cnn.py | Contains the main CNN prediction logic, using a pre-trained VGG16 model to classify environmental issues in images. |
| cnn_model.py | Defines the CNN model architecture, adapting VGG16 for the specific classification task. |
app.apis

| File | Summary |
| --- | --- |
| main_router.py | Handles image prediction, contextualization, and data insertion. Uses FastAPI, requests, and Celery for async tasks. Fetches images, processes predictions, and stores results in the Mapapi_prediction table, with proper error handling throughout. |
app.models

| File | Summary |
| --- | --- |
| image_model.py | Defines ImageModel with image_name, sensitive_structures, and incident_id attributes. |
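Given that Pydantic is listed among the project's tools, ImageModel is presumably a Pydantic model along these lines; the field types here are assumptions, only the field names come from the summary above.

```python
# Sketch of ImageModel as a Pydantic model; field types are assumptions.
from pydantic import BaseModel

class ImageModel(BaseModel):
    image_name: str
    sensitive_structures: list[str]
    incident_id: str

# Pydantic validates and coerces the payload on construction.
m = ImageModel(
    image_name="flood.jpg",
    sensitive_structures=["school", "hospital"],
    incident_id="42",
)
```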
test.apis

| File | Summary |
| --- | --- |
| test_main_router.py | Verifies FastAPI endpoint functionality by simulating HTTP requests, ensuring the Index route returns a 200 status code and the expected JSON response. |
.github.workflows

| File | Summary |
| --- | --- |
| ci_cd.yml | Handles continuous integration and deployment via GitHub Actions: runs the test suites to validate code quality, builds and pushes Docker images, and deploys the ML model API on pushes to master. Uses secrets for Docker Hub credentials. |

Getting Started

System Requirements:

  • Docker: Ensure Docker is installed and running on your system.

Installation

Using Docker

  1. Clone the Model_deploy repository:
$ git clone https://github.com/223MapAction/Model_deploy.git
  2. Change to the project directory:
$ cd Model_deploy
  3. Build and start the Docker containers:
$ docker-compose -f _cd_pipeline.yaml up --build

Usage

Access the Application

Once the containers are up and running, access the application at:

http://localhost:8001

Tests

Run the test suite using the command below:

$ docker-compose -f _ci_pipeline.yml up

Contributing

Contributions are welcome! See our Contribution Guidelines for details on how to contribute.


License

This project is licensed under the GNU Affero General Public License v3.0. For more details, refer to the LICENSE file.


Acknowledgments

  • List any resources, contributors, inspiration, etc. here.



Code of Conduct

See our Code of Conduct for details on expected behavior in our community.


Digital Public Goods Standard Compliance

This project aims to comply with the Digital Public Goods Standard. Below is a summary of our current compliance status:

  1. Relevance to Sustainable Development Goals: The project aligns with SDGs by promoting sustainable urban and environmental management through technology and data-driven insights.
  2. Use of Approved Open License: Compliant (AGPL-3.0 license)
  3. Clear Ownership: The project is owned and maintained by Map Action, ensuring accountability and continuous development.
  4. Platform Independence: The project is designed to be platform-independent, running on various operating systems and environments with Docker support.
  5. Documentation: Largely compliant (see README and developer documentation)
  6. Mechanism for Extracting Data: The project includes APIs and data processing pipelines to extract and analyze environmental data efficiently.
  7. Adherence to Privacy and Applicable Laws: The project follows best practices for data privacy and complies with relevant legal requirements to protect user data.
  8. Adherence to Standards & Best Practices: The project adheres to industry standards and best practices in software development, ensuring high-quality and reliable solutions.
  9. Do No Harm Assessment: The project undergoes regular assessments to ensure it does not negatively impact users or the environment.

For a detailed assessment and ongoing compliance efforts, please see our Digital Public Goods Standard Assessment.
