A production-ready, containerized N8N platform designed for building and deploying automation services at scale. This platform supports multi-tenant architecture, extensive workflow templates, and enterprise-grade features for automation service providers.
- Multi-Tenant Architecture: Logical tenant isolation with per-client data separation
- Containerized Deployment: Docker Compose orchestration with health checks
- Production Ready: Resource limits, logging, monitoring, and backup systems
- Auto-Recovery: Built-in health monitoring with automatic service recovery
- Scalable Storage: PostgreSQL database with Redis for queue management
- Web Scraping: Crawl4AI integration for intelligent content extraction
- Content Generation: Multi-provider AI support (Gemini, Cohere, Mistral)
- Smart Automation: AI-enhanced workflow templates for various industries
- Social Media Automation: Multi-platform posting and analytics
- WhatsApp Business: AI-powered customer support automation
- E-commerce Integration: Price monitoring and product synchronization
- Email Marketing: Automated campaigns and analytics reports
- Web Scraping: AI-powered data extraction at scale
- ERP Integration: Odoo multi-tenant business process automation
- Docker Desktop 4.0+
- 8GB RAM minimum (16GB recommended for production)
- 20GB available disk space
```bash
git clone https://github.com/your-username/automation-platform.git
cd automation-platform
cp .env.example .env
```

Edit the `.env` file with your settings:

```bash
# Required: Generate secure keys
N8N_ENCRYPTION_KEY=$(openssl rand -base64 32)
POSTGRES_PASSWORD=$(openssl rand -base64 16)
REDIS_PASSWORD=$(openssl rand -base64 16)
SESSION_SECRET=$(openssl rand -base64 32)

# Basic configuration for local development
N8N_HOST=localhost
N8N_PROTOCOL=http
WEBHOOK_URL=http://localhost:5678
```

```bash
# Development environment
docker compose up -d

# Production environment
docker compose -f docker-compose.yml -f docker-compose.prod.yml up -d
```

- Web Interface: http://localhost:5678
- Health Check: http://localhost:5678/healthz
- Crawl4AI Service: http://localhost:11235
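The secure keys from the configuration step above can be generated and written to `.env` in one pass. A minimal sketch, assuming `openssl` is available; it skips keys that are already set so it is safe to re-run:

```shell
# Generate and append required secrets to .env (idempotent sketch)
ENV_FILE=".env"
touch "$ENV_FILE"

add_secret() {
  local key="$1" bytes="$2"
  # Only append the key if it is not already present in the file
  grep -q "^${key}=" "$ENV_FILE" || \
    echo "${key}=$(openssl rand -base64 "$bytes")" >> "$ENV_FILE"
}

add_secret N8N_ENCRYPTION_KEY 32
add_secret POSTGRES_PASSWORD 16
add_secret REDIS_PASSWORD 16
add_secret SESSION_SECRET 32
```

Running the same snippet twice leaves existing values untouched, so rotated secrets are never silently overwritten.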
The Crawl4AI service provides a powerful web scraping playground that you can access immediately after starting the platform. The service demonstrates AI-powered content extraction with streaming capabilities.
*Crawl4AI playground showing successful web scraping with streaming enabled and fast response times*
Key Features Demonstrated:
- Interactive Playground: Test web scraping at http://localhost:11235/playground/
- Streaming Response: Real-time content extraction with `stream=True`
- Cache Management: Flexible caching with `CacheMode.BYPASS` for fresh data
- Fast Performance: Typical response times under 1500ms
- Rich Content Extraction: Complete HTML and structured data extraction
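The cache-bypass behavior above can be exercised from the command line. The sketch below only builds the JSON request body; the field names (`urls`, `crawler_config`, `cache_mode`) are assumptions based on the Crawl4AI Docker API and should be checked against the version you deploy:

```shell
# Build a Crawl4AI /crawl request body (field names assumed; verify against
# your deployed Crawl4AI version's docs or playground)
crawl_payload() {
  local url="$1"
  printf '{"urls":["%s"],"crawler_config":{"cache_mode":"bypass"}}' "$url"
}

# Dry run: print the payload that would be POSTed to http://localhost:11235/crawl
crawl_payload "https://example.com"

# When the service is up, pipe it into curl:
#   crawl_payload "https://example.com" | curl -s -X POST \
#     http://localhost:11235/crawl -H "Content-Type: application/json" -d @-
```

Keeping the payload builder separate from the `curl` call makes it easy to inspect or log requests before sending them.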
*N8N automation platform running on Docker with all services healthy*
Service Health Status:
- ✅ N8N: Running on port 5678 with full workflow capabilities
- ✅ PostgreSQL: Database connection established and healthy
- ✅ Redis: Queue management and session storage active
- ✅ Crawl4AI: Web scraping service running on port 11235
1. Start the Platform:

   ```bash
   docker compose up -d
   ```

2. Verify All Services:

   ```bash
   # Check N8N health
   curl http://localhost:5678/healthz

   # Check Crawl4AI service
   curl http://localhost:11235/health

   # View container status
   docker compose ps
   ```

3. Test Integration:
   - Access the N8N interface: http://localhost:5678
   - Test the Crawl4AI playground: http://localhost:11235/playground/
   - Import workflow templates from `./workflows/templates/`
- N8N: Workflow automation engine (port 5678)
- PostgreSQL: Primary database for workflows and executions
- Redis: Session management and queue processing
- Crawl4AI: AI-powered web scraping service (port 11235)
```
Webhook Request → Tenant Validation → N8N Workflow → External APIs
                                           ↓
                                  AWS Secrets Manager
```
- Internal Network: Secure container communication
- Health Monitoring: Automated recovery and alerting
- Data Isolation: Tenant-specific data separation
Perfect for agencies managing multiple client accounts:
- Content Generation: AI-powered trending content creation
- Multi-Platform Posting: Instagram, Facebook, LinkedIn automation
- Analytics Reporting: Weekly performance reports via email
- Hashtag Optimization: Trending topic integration
Comprehensive e-commerce workflow collection:
- Price Monitoring: Intelligent competitor price tracking
- Product Synchronization: Multi-platform inventory management
- Customer Support: WhatsApp Business automation with AI
- Order Processing: Automated fulfillment workflows
Enterprise-grade business workflows:
- ERP Integration: Odoo multi-tenant business automation
- Email Campaigns: Automated marketing sequences
- Data Processing: AI-enhanced web scraping and analysis
- Reporting: Automated business intelligence reports
- Client Onboarding: Rapid deployment of client-specific workflows
- Service Scaling: Multi-tenant platform supporting hundreds of clients
- White-label Solutions: Customizable branding and domain configuration
- Revenue Optimization: Usage tracking and billing integration
- API Integration: Connect disparate business systems
- Workflow Marketplace: Pre-built industry-specific templates
- Customer Success: Automated onboarding and support workflows
- Data Synchronization: Real-time business data management
- Internal Automation: Streamline business processes
- Integration Hub: Connect existing business tools
- Compliance Automation: Audit trails and data governance
- Cost Reduction: Reduce manual operational overhead
Key configuration options in `.env`:

```bash
# Core N8N Settings
N8N_ENCRYPTION_KEY=your-32-character-key
N8N_HOST=your-domain.com
WEBHOOK_URL=https://your-domain.com

# Database Configuration
POSTGRES_PASSWORD=secure-password
REDIS_PASSWORD=secure-password

# AI Provider Keys (Optional)
GEMINI_API_KEY=your-gemini-key
COHERE_API_KEY=your-cohere-key
MISTRAL_API_KEY=your-mistral-key

# AWS Integration (Optional)
AWS_ACCESS_KEY_ID=your-aws-key
AWS_SECRET_ACCESS_KEY=your-aws-secret
```

Each tenant requires:
- Database Entry: Tenant configuration in the `tenant_config` table
- Webhook Endpoints: Pattern: `/webhook/{tenant-id}-{service}`
- Credential Management: Secure storage for API keys and tokens
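The endpoint pattern above can be derived mechanically per tenant. A minimal sketch; the tenant ID and service name passed in the example are hypothetical:

```shell
# Derive a tenant's webhook endpoint from the documented
# {tenant-id}-{service} pattern
webhook_url() {
  local base="$1" tenant="$2" service="$3"
  echo "${base}/webhook/${tenant}-${service}"
}

webhook_url http://localhost:5678 demo content
# → http://localhost:5678/webhook/demo-content
```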
For production environments:
- Use the `docker-compose.prod.yml` overlay
- Configure SSL/TLS termination
- Set up monitoring and alerting
- Implement backup strategies
- Configure log aggregation
Automated service monitoring with recovery:

```bash
# Start health monitor
./scripts/monitoring/n8n-health-monitor.sh start

# Check status
./scripts/monitoring/n8n-health-monitor.sh status

# Manual health check
./scripts/monitoring/n8n-health-monitor.sh check
```

Comprehensive backup solution:

```bash
# Full platform backup
./scripts/backup/backup-platform.sh

# Tenant-specific backup
./scripts/backup/backup-platform.sh --tenant client-id

# Workflow-only backup
./scripts/backup/backup-platform.sh backup-workflows
```

- Resource Limits: Container memory and CPU constraints
- Database Tuning: Optimized PostgreSQL configuration
- Cache Management: Redis-based session and data caching
- Log Rotation: Automated log management and cleanup
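Resource limits and log rotation of the kind listed above are typically expressed in the production compose overlay. An illustrative fragment; the values are placeholders, not tuned recommendations:

```yaml
# docker-compose.prod.yml (fragment) — illustrative values only
services:
  n8n:
    deploy:
      resources:
        limits:
          memory: 2g        # hard memory cap for the container
          cpus: "1.5"       # fractional CPU limit
    logging:
      driver: json-file
      options:
        max-size: "10m"     # rotate each log file at 10 MB
        max-file: "3"       # keep at most 3 rotated files
```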
- Tenant Isolation: Logical separation of client data
- Credential Security: External credential storage support
- Audit Logging: Comprehensive activity tracking
- Rate Limiting: API and webhook protection
- Authentication: Basic auth and JWT token support
- Authorization: Role-based access control
- Network Security: Container network isolation
- SSL/TLS: Production-ready encryption
- Access N8N web interface at http://localhost:5678
- Navigate to Workflows → Import from File
- Select templates from `./workflows/templates/`
- Configure tenant-specific settings and credentials
All workflow templates have been tested and cleaned up to ensure proper loading and functionality:
1. AI-Powered Web Scraper (`ai-web-scraper-automation.json`)
   - Integrates with the Crawl4AI service for intelligent content extraction
   - Multi-tenant webhook support with validation
   - AI-enhanced data processing capabilities

2. Email Summary Automation (`email-summary-automation.json`)
   - Automated email report generation
   - Template-based content formatting
   - Scheduled execution support

3. Enterprise Web Crawler (`enterprise-web-crawler.json`)
   - Large-scale web scraping for enterprise use
   - Advanced data extraction and processing
   - Rate limiting and error handling

4. Hybrid Price Monitoring (`hybrid-price-monitoring.json`)
   - Multi-source price tracking automation
   - Alert system integration
   - Historical data management

5. Social Media Posting (`social-media-posting.json`)
   - Multi-platform content distribution
   - Scheduled posting capabilities
   - Analytics integration support
Note: These templates have undergone cleanup and verification to ensure they load properly in N8N without configuration errors.
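A quick local sanity check before importing is to confirm each template file parses as JSON. A minimal sketch, assuming `python3` is available on the host; the temp-file path is only for demonstration:

```shell
# Verify a template file parses as JSON before importing it into N8N
check_json() {
  python3 -m json.tool "$1" > /dev/null && echo "OK $1" || echo "BAD $1"
}

# Example against a throwaway file (real templates live in ./workflows/templates/)
printf '{"name":"demo"}' > /tmp/demo-template.json
check_json /tmp/demo-template.json
```

Run it across the template directory with `for f in ./workflows/templates/*.json; do check_json "$f"; done`.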
Configure webhooks using the pattern:

```
http://localhost:5678/webhook/{tenant-id}-{service}
```

Example endpoints:
- Content posting: `/webhook/demo-content`
- Price monitoring: `/webhook/demo-price-update`
- AI scraping: `/webhook/ai-scraper`
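A gateway or edge script can reject requests that do not match the `{tenant-id}-{service}` shape before they ever reach N8N. An illustrative sketch only; real tenant validation happens inside the platform against `tenant_config`:

```shell
# Coarse pattern check for incoming webhook paths (illustrative; does not
# consult tenant_config, only the documented path shape)
validate_webhook() {
  local path="$1"
  case "$path" in
    /webhook/*-*) echo "valid" ;;
    *)            echo "invalid" ;;
  esac
}

validate_webhook /webhook/demo-content   # → valid
validate_webhook /webhook/bad            # → invalid (no {tenant-id}-{service} hyphen)
```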
- Health monitoring and auto-recovery setup
- Backup and disaster recovery procedures
- Performance tuning and optimization
- Security best practices and compliance
We welcome contributions! Please read our contributing guidelines:
- Fork the Repository: Create your feature branch
- Follow Conventions: Maintain code style and documentation standards
- Test Thoroughly: Ensure all workflows and scripts function correctly
- Submit Pull Requests: Include detailed descriptions and test results
```bash
# Clone repository
git clone https://github.com/your-username/automation-platform.git
cd automation-platform

# Set up development environment
cp .env.example .env
# Configure .env for development

# Start in development mode
docker compose up -d

# Access logs
docker compose logs -f n8n
```

This project is licensed under the MIT License - see the LICENSE file for details.
- GitHub Issues: Bug reports and feature requests
- Discussions: Community Q&A and best practices
- Wiki: Extended documentation and tutorials
For enterprise deployments and custom development:
- Custom workflow development
- Infrastructure consulting
- Training and onboarding services
- SLA-backed production support
- N8N - The core workflow automation platform
- Crawl4AI - AI-powered web scraping
- Traefik - Modern reverse proxy for production deployments
- Kubernetes Support: Helm charts for container orchestration
- Advanced Analytics: Business intelligence dashboards
- Marketplace Integration: Community workflow sharing
- Enhanced AI: More AI providers and capabilities
- Mobile App: Mobile dashboard for monitoring and management
- v1.0: Initial release with core multi-tenant features
- v1.1: AI integration and enhanced workflow templates
- v1.2: Production hardening and monitoring improvements
- v2.0: Kubernetes support and advanced analytics (planned)
Made with ❤️ for the automation community
Start building powerful automation solutions today with our comprehensive, production-ready platform!