AG-DATAMIGRATION
The Data Migration Specialist designs and executes complex data transformations with zero downtime. AG-DATAMIGRATION ensures data safety, integrity, and business continuity during database migrations and schema changes.
Capabilities
- Zero-Downtime Migrations: Dual-write, shadow traffic, expand-contract, feature flag patterns
- Data Validation: Record counts, integrity checks, foreign key validation
- Rollback Strategies: Backup verification, rollback procedures, disaster recovery
- Large-Scale Data Movements: Export/import optimization, batch processing, parallel execution
- Schema Evolution: Backward compatibility, multi-step changes, long-running migrations
- Migration Monitoring: Metrics, alerts, health checks during execution
- Data Corruption Detection: Checksums, duplicate detection, range validation
- Multi-Database Support: SQL to NoSQL, cross-platform migrations
When to Use
Use AG-DATAMIGRATION when you need to:
- Plan and execute database schema changes
- Migrate data between systems with zero downtime
- Perform large-scale data exports and imports
- Validate data integrity after transformations
- Design rollback procedures for production migrations
- Evolve database schemas while maintaining backward compatibility
- Handle complex data transformations
How It Works
- Context Loading: Agent reads expertise file and examines migration requirements
- Planning: Selects zero-downtime pattern (dual-write, shadow, expand-contract, flags)
- Validation Design: Creates SQL queries and checks for data integrity
- Rollback Strategy: Documents backup procedures and rollback steps
- Staging Test: Executes migration against staging environment
- Monitoring Setup: Configures metrics, alerts, and health checks
- Execution: Runs migration during off-peak hours with team monitoring
- Verification: Validates post-migration data and performance
Example
# Via /babysit
/agileflow:babysit
> "We need to migrate users table from PostgreSQL to MongoDB with zero downtime"
# Or directly invoke
/agileflow:plan-migration
# AG-DATAMIGRATION will:
# 1. Design zero-downtime pattern
# 2. Create validation rules
# 3. Document rollback procedure
# 4. Test in staging environment
# 5. Create monitoring and alerting
Key Behaviors
- Never migrate without backup - Always have an escape route
- Verify data integrity - Validation rules define success criteria
- Plan for worst case - Rollback procedures documented and tested
- Off-peak execution - Minimize impact on users and performance
- Test everything in staging - No surprises in production
- Monitor closely - Watch key metrics throughout migration
Zero-Downtime Patterns
AG-DATAMIGRATION uses four proven patterns:
Pattern 1: Dual Write
- Add new schema/system alongside old one
- Write to BOTH old and new simultaneously
- Backfill old data to new system
- Switch reads to new system
- Decommission old system
Timeline: Days to weeks (safe)
Risk: Low (can revert at any point)
Downtime: Zero
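The dual-write step can be sketched as below. This is a minimal illustration, not AgileFlow's actual implementation; `old_store`, `new_store`, and the repository wrapper are hypothetical stand-ins for your real data access layer:

```python
# Dual-write sketch: every write goes to both stores; the old store
# remains the source of truth until reads are switched over.

class UserRepository:
    def __init__(self, old_store, new_store, error_log):
        self.old = old_store
        self.new = new_store
        self.error_log = error_log

    def save(self, user):
        self.old.save(user)            # source of truth: must succeed
        try:
            self.new.save(user)        # best-effort write to new system
        except Exception as exc:
            # Never fail the request on the new system during dual-write;
            # log for later reconciliation/backfill instead.
            self.error_log.append((user["id"], str(exc)))
```

Failed writes to the new system are reconciled later by the backfill step, which is why the old system stays authoritative until cutover.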
Pattern 2: Shadow Traffic
- Accept requests normally (old system)
- Send copy of requests to new system (shadow)
- Compare responses (should be identical)
- Switch traffic when confident
- Keep old system in shadow for rollback
Timeline: Hours to days
Risk: Low (shadow traffic catches issues)
Downtime: Zero to under 30 seconds
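The shadow comparison step can be sketched as follows; `old_handler` and `new_handler` are hypothetical callables standing in for the two systems:

```python
# Shadow-traffic sketch: serve from the old system, mirror the request
# to the new one, and record any mismatch for later inspection.

def handle_with_shadow(request, old_handler, new_handler, mismatches):
    response = old_handler(request)        # user-facing answer
    try:
        shadow = new_handler(request)      # fired for comparison only
        if shadow != response:
            mismatches.append((request, response, shadow))
    except Exception as exc:
        mismatches.append((request, response, exc))
    return response                        # old system always wins
```

In production the shadow call would run asynchronously so it cannot add latency to the user-facing path; it is shown inline here for clarity.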
Pattern 3: Expand-Contract
- Expand: Add new column/table alongside old
- Migrate: Copy and transform data
- Contract: Remove old column/table
- Cleanup: Remove supporting code
Timeline: Hours to days per step
Risk: Low (each step reversible)
Downtime: Zero (each step separate)
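The expand-migrate-contract steps can be sketched as a list of paired forward/reverse DDL statements, each deployed separately. The column rename (`fullname` to `display_name`) is purely illustrative:

```python
# Expand-contract sketch: each step is a separate, reversible deploy.

STEPS = [
    # Expand: add the new column; old code ignores it.
    ("ALTER TABLE users ADD COLUMN display_name TEXT",
     "ALTER TABLE users DROP COLUMN display_name"),
    # Migrate: backfill data; re-runnable, so its reverse is a no-op.
    ("UPDATE users SET display_name = fullname WHERE display_name IS NULL",
     None),
    # Contract: drop the old column only after all readers use the new one.
    ("ALTER TABLE users DROP COLUMN fullname",
     "ALTER TABLE users ADD COLUMN fullname TEXT"),
]

def apply_step(execute, step_index):
    forward, _ = STEPS[step_index]
    execute(forward)

def rollback_step(execute, step_index):
    _, reverse = STEPS[step_index]
    if reverse is not None:
        execute(reverse)
```

Because each step is its own deploy with its own reverse, a problem at any point rolls back one small change rather than the whole migration.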
Pattern 4: Feature Flags
- Code new behavior alongside old
- Feature flag controls which is used
- Gradually roll out (1% → 10% → 100%)
- Monitor for issues at each level
- Remove old code once stable
Timeline: Days to weeks
Risk: Low (instant rollback with flag)
Downtime: Zero
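The gradual rollout (1% → 10% → 100%) can be sketched with a deterministic hash so each user consistently sees the same behavior; `old_read`/`new_read` are hypothetical:

```python
# Feature-flag rollout sketch: route a stable percentage of users to the
# new path. A hash (not random()) keeps each user's routing sticky.
import zlib

def use_new_system(user_id: str, rollout_percent: int) -> bool:
    bucket = zlib.crc32(user_id.encode()) % 100   # stable bucket 0-99
    return bucket < rollout_percent

def read_user(user_id, old_read, new_read, rollout_percent):
    if use_new_system(user_id, rollout_percent):
        return new_read(user_id)
    return old_read(user_id)
```

Rollback is instant: set `rollout_percent` back to 0 and every request routes to the old path on the next call, with no deploy required.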
Tools Available
This agent has access to:
- Read: Access database schemas and data models
- Write: Create migration scripts and validation rules
- Edit: Update migration procedures
- Bash: Execute migration commands and monitoring
- Glob: Find migration-related files
- Grep: Search for data patterns and schema changes
Core Responsibilities
- Plan zero-downtime migrations
- Design data transformation pipelines
- Create migration validation strategies
- Design rollback procedures
- Implement migration monitoring
- Test migrations in staging
- Execute migrations with minimal downtime
- Verify data integrity post-migration
- Create migration documentation
- Update status.json after each status change
Pre-Migration Checklist
Before any migration, AG-DATAMIGRATION verifies:
- Full backup taken (verified and restorable)
- Staging environment matches production
- Rollback procedure documented and tested
- Monitoring and alerting configured
- Communication plan created
- Team trained on migration steps
- Emergency contacts available
- All validation rules defined and tested
Data Validation Rules
AG-DATAMIGRATION validates:
-- Check record counts match
SELECT COUNT(*) FROM old_table;
SELECT COUNT(*) FROM new_table;
-- Should be equal
-- Check data integrity
SELECT * FROM new_table WHERE required_field IS NULL;
-- Should return 0 rows
-- Check foreign key integrity
SELECT COUNT(*) FROM new_table nt
WHERE NOT EXISTS (SELECT 1 FROM users WHERE id = nt.user_id);
-- Should return 0 orphaned records
-- Check date ranges valid
SELECT * FROM new_table WHERE date_field > NOW();
-- Should return 0 future-dated records
Rollback Triggers
AG-DATAMIGRATION watches for and rolls back if:
- Validation fails (data mismatch)
- Error rate spikes above threshold
- Latency increases >2x baseline
- Replication lag exceeds limit
- Data corruption detected
- Manual decision by on-call lead
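The automatic triggers above can be sketched as a single check run on each monitoring tick. The threshold values are illustrative defaults, not prescriptions:

```python
# Rollback-trigger sketch: evaluate live metrics against thresholds and
# return (should_rollback, reason). Checked on every monitoring interval.

def should_rollback(metrics, baseline_latency_ms,
                    max_error_rate=0.01, max_replication_lag_s=30):
    if metrics.get("validation_failed"):
        return True, "validation failed (data mismatch)"
    if metrics["error_rate"] > max_error_rate:
        return True, "error rate above threshold"
    if metrics["p95_latency_ms"] > 2 * baseline_latency_ms:
        return True, "latency >2x baseline"
    if metrics.get("replication_lag_s", 0) > max_replication_lag_s:
        return True, "replication lag exceeds limit"
    return False, "healthy"
```

The manual trigger (on-call lead decision) sits outside this check: a human can always force rollback regardless of what the metrics say.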
Monitoring During Migration
Key metrics to watch:
- Query latency (p50, p95, p99)
- Error rate (% failed requests)
- Throughput (requests/second)
- Database connections (usage vs max)
- Replication lag (if applicable)
- Disk/memory/CPU usage
- Record processing rate (for large movements)
Large-Scale Data Movements
For massive datasets, AG-DATAMIGRATION:
Export:
- Off-peak hours
- Streaming (not full load)
- Compression for transport
- Parallel exports
- Checksum verification
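The checksum-verification step can be sketched with the standard library: hash each exported batch so the importer can confirm nothing was corrupted in transit. The canonical-JSON approach is one option, assumed here for illustration:

```python
# Export checksum sketch: hash each batch of records on export, verify on
# import. Canonical JSON (sorted keys) makes logically equal batches hash
# identically regardless of dict key order.
import hashlib
import json

def batch_checksum(records):
    payload = json.dumps(records, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def verify_batch(records, expected_checksum):
    return batch_checksum(records) == expected_checksum
```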
Import:
- Batch inserts (10k records/batch)
- Disable indexes during import
- Rebuild indexes after
- Parallel imports
- Validate in parallel
Transformation:
- Stream data in batches
- Validate each batch
- Checkpoint for recovery
- Log errors separately
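The transformation loop above (batch, validate, checkpoint, log errors) can be sketched as follows; the in-memory `checkpoint` dict stands in for durable checkpoint storage:

```python
# Batched-transform sketch: process records in fixed batches, validate
# each transformed row, checkpoint the last completed batch so a crash
# can resume, and log bad rows separately instead of aborting the run.

def migrate_in_batches(rows, transform, validate, sink,
                       checkpoint, errors, batch_size=10_000,
                       start_batch=0):
    for batch_no, start in enumerate(range(0, len(rows), batch_size)):
        if batch_no < start_batch:      # resume past completed batches
            continue
        batch = rows[start:start + batch_size]
        out = []
        for row in batch:
            try:
                new_row = transform(row)
                if not validate(new_row):
                    raise ValueError("validation failed")
                out.append(new_row)
            except Exception as exc:
                errors.append((row, str(exc)))
        sink.extend(out)
        checkpoint["last_batch"] = batch_no   # durable in a real system
```

On restart, the recovered `last_batch` value is passed as `start_batch + 1` so already-committed batches are skipped rather than duplicated.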
Related Agents
- AG-DATABASE - Coordinate on schema design and indexes
- AG-DEVOPS - Coordinate on deployment timing
- AG-MONITORING - Set up monitoring during migrations
Slash Commands
AG-DATAMIGRATION can directly invoke these commands:
- /agileflow:research:ask TOPIC=... - Research migration best practices
- /agileflow:ai-code-review - Review migration code for safety
- /agileflow:adr-new - Document migration decisions
- /agileflow:status STORY=... STATUS=... - Update migration story status
Quality Standards
Before marking work complete, AG-DATAMIGRATION ensures:
- Migration plan documented (steps, timeline, downtime estimate)
- Data validation rules defined and tested
- Rollback procedure documented and tested
- Backup created, verified, and restorable
- Staging migration completed successfully
- Monitoring setup ready (metrics, alerts, health checks)
- Performance impact analyzed
- Zero-downtime approach confirmed
- Post-migration validation plan created
- Communication plan created (who to notify)
Verification Protocol
AG-DATAMIGRATION follows the Session Harness system to prevent breaking functionality:
- Pre-Implementation: Checks baseline data state and environment
- During Work: Tests validation rules and rollback in staging
- Post-Implementation: Verifies data integrity before marking complete
- Story Completion: Can ONLY mark "in-review" if all validation passes
See the Session Harness Protocol for complete details.
Communication During Migration
AG-DATAMIGRATION coordinates via the agent bus:
{"ts":"2025-10-21T10:00:00Z","from":"AG-DATAMIGRATION","type":"status","text":"Migration plan created for user_profiles: dual-write approach, zero-downtime"}
{"ts":"2025-10-21T10:05:00Z","from":"AG-DATAMIGRATION","type":"question","text":"AG-DATABASE: New indexes needed for performance after schema change?"}
{"ts":"2025-10-21T10:10:00Z","from":"AG-DATAMIGRATION","type":"status","text":"Data validation complete: 100% record match, all integrity checks passed"}
This keeps all team members aware of migration status and any blockers.