- Prediction of the electrical activity of the brain, comparing people who drink alcohol with people who have depression.
- 🤖 MLOps for deployment and ongoing monitoring and maintenance.
- 📊 MLflow for managing the machine learning lifecycle.
- 🌐 Kubeflow for machine learning and MLOps on Kubernetes, introduced by Google.
- 📡 Kubernetes for software deployment, scaling, and management.
- 🐋 Docker Compose for development and production.
- ⚡ FastAPI for the Python backend API.
- 🧰 SQLModel for the Python SQL database interactions (ORM).
- 🔍 Pydantic, used by FastAPI, for data validation and settings management.
- 💾 PostgreSQL as the SQL database.
- 🚀 React for the frontend.
- 💃 Using TypeScript, hooks, Vite, and other parts of a modern frontend stack.
- 🎨 Chakra UI for the frontend components.
- 🧪 Playwright for end-to-end testing.
- 🔒 Secure password hashing by default.
- 🔑 JWT (JSON Web Token) authentication.
- 📫 Email-based password recovery.
- ✅ Tests with Pytest.
- 📞 Traefik as a reverse proxy / load balancer.
- 🚢 Deployment instructions using Docker Compose, including how to set up a frontend Traefik proxy to handle automatic HTTPS certificates.
- 🏭 CI (continuous integration) and CD (continuous deployment) based on GitHub Actions.
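To illustrate the JWT authentication feature listed above, here is a pure-stdlib sketch of how an HS256 token is signed and verified. The real template presumably uses a JWT library; the function names below are illustrative only, not the template's actual API.

```python
# Minimal HS256 JWT sketch using only the standard library.
# encode_jwt / verify_jwt are hypothetical names, for illustration only.
import base64
import hashlib
import hmac
import json


def _b64(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def encode_jwt(payload: dict, secret: str) -> str:
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = _b64(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"


def verify_jwt(token: str, secret: str) -> dict:
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = _b64(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    padded = body + "=" * (-len(body) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))
```

In practice a library also checks the `exp` claim and supports key rotation; this sketch only shows the signing mechanics.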
Create a .env file in the root folder and copy the content from .env.example. Feel free to change it according to your own configuration.
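As a rough illustration of what reading that .env file involves, here is a minimal hand-rolled parser. Real projects typically rely on Pydantic's settings management or python-dotenv instead; the keys used in the usage example are hypothetical, not the template's actual variables.

```python
# Minimal .env parser: KEY=VALUE lines, blank lines and '#' comments ignored.
# For illustration only; prefer pydantic-settings or python-dotenv in practice.
from pathlib import Path


def load_env(path: str = ".env") -> dict:
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # strip optional surrounding quotes from the value
        env[key.strip()] = value.strip().strip('"').strip("'")
    return env
```

Usage: `load_env(".env")["POSTGRES_USER"]` would return the configured value (assuming a hypothetical POSTGRES_USER key).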
docker-compose-dev.yml: containers for the PostgreSQL and Redis services.
- When you want to build the project, use this file so that dockerized PostgreSQL and Redis are attached to the project.
- To build and run PostgreSQL and Redis, use this command:

```shell
docker-compose -f docker-compose-dev.yml up -d --build
```

If you get an error like "port already in use" for PostgreSQL or Redis, you can change the external port.
docker-compose.yml: the main Docker Compose file for building the service.
- After building PostgreSQL and Redis, run the container with the following command:

```shell
docker-compose up -d --build
```

- If you want to use the cache in your project, it is better to read its documentation first: cache document
Before running other tests, it is recommended to execute the CRUD tests first. This ensures the creation of initial data necessary for subsequent tests.
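The seeding dependency described above can be sketched with a toy in-memory CRUD layer: the CRUD tests create records that later tests read back. The function names and the dict-backed store below are illustrative, not the template's actual CRUD module.

```python
# Toy illustration of why CRUD tests run first: they seed data that
# subsequent tests depend on. All names here are hypothetical.
_db = {}  # stands in for the test database


def create_user(email: str, full_name: str) -> dict:
    user = {"email": email, "full_name": full_name}
    _db[email] = user
    return user


def get_user(email: str):
    return _db.get(email)


def test_create_user():
    # a CRUD test (tests/crud/) that seeds the initial data
    user = create_user("admin@example.com", "Admin")
    assert user["email"] == "admin@example.com"


def test_read_seeded_user():
    # a later test that assumes the CRUD test already ran
    assert get_user("admin@example.com") is not None
```

Running `test_read_seeded_user` before `test_create_user` would fail, which is exactly the ordering constraint the instructions above describe.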
```shell
# Run CRUD tests
cd /backend
pytest tests/crud/

# Run other tests
cd /backend
pytest
```

- Add a custom exception handler
- Add a JSONB field on the table sample
- Add docstrings
- Add Custom Response model
- Create a sample one-to-many relationship
- Create a sample many-to-many relationship
- Add Black formatter and flake8 lint
- Add an export-report API for CSV/XLSX files using StreamingResponse
- Convert the repo into a template using cookiecutter
- Add tests for APIs


