feat: add Celery/Redis task queue with feature flag support

Migrate background tasks from FastAPI BackgroundTasks to Celery with Redis
for persistent task queuing, retries, and scheduled jobs.

Key changes:
- Add Celery configuration with Redis broker/backend
- Create task dispatcher with USE_CELERY feature flag for gradual rollout
- Add Celery task wrappers for all background operations:
  - Marketplace imports
  - Letzshop historical imports
  - Product exports
  - Code quality scans
  - Test runs
  - Subscription scheduled tasks (via Celery Beat)
- Add celery_task_id column to job tables for Flower integration
- Add Flower dashboard link to admin background tasks page
- Update docker-compose.yml with worker, beat, and flower services
- Add Makefile targets: celery-worker, celery-beat, celery-dev, flower
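
The Redis broker/backend wiring and the Beat schedule described above might look roughly like the following sketch (the module paths, env var name, and schedule entry are assumptions, not the actual project code):

```python
import os
from celery import Celery
from celery.schedules import crontab

# Assumed env var; the project may configure the URL differently.
REDIS_URL = os.getenv("REDIS_URL", "redis://localhost:6379/0")

celery_app = Celery(
    "app",
    broker=REDIS_URL,       # Redis as the message broker
    backend=REDIS_URL,      # Redis as the result backend
    include=["app.tasks"],  # hypothetical module containing the task wrappers
)

celery_app.conf.update(
    task_acks_late=True,          # re-deliver a task if a worker dies mid-run
    task_default_retry_delay=60,  # seconds between retries
    beat_schedule={
        # Illustrative Celery Beat entry for the subscription scheduled tasks
        "subscription-maintenance": {
            "task": "app.tasks.subscriptions.run_scheduled",
            "schedule": crontab(minute=0),  # hourly
        },
    },
)
```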

When USE_CELERY=false (the default), the system falls back to FastAPI
BackgroundTasks, so development does not require a Redis dependency.
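
The feature-flag fallback can be sketched as a small dispatcher; this is a minimal stand-in, not the project's actual `task_dispatcher` (the `celery_task` and `inline_fn` parameters are illustrative):

```python
import os
from typing import Any, Callable, Optional

class TaskDispatcher:
    """Route work to Celery when USE_CELERY=true, else to BackgroundTasks.

    A Celery dispatch returns the task id (stored in the job row for Flower);
    the BackgroundTasks fallback returns None since there is no task id.
    """

    def __init__(self, celery_task: Optional[Any] = None) -> None:
        self.use_celery = os.getenv("USE_CELERY", "false").lower() == "true"
        self.celery_task = celery_task  # e.g. a @celery_app.task-decorated function

    def dispatch_code_quality_scan(
        self,
        background_tasks: Any,          # FastAPI BackgroundTasks (duck-typed here)
        scan_id: int,
        inline_fn: Callable[[int], None],
    ) -> Optional[str]:
        if self.use_celery and self.celery_task is not None:
            # .delay() enqueues on Redis and returns an AsyncResult with an id.
            return self.celery_task.delay(scan_id).id
        # Fallback path: run in-process after the response is sent.
        background_tasks.add_task(inline_fn, scan_id)
        return None
```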

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-11 17:35:16 +01:00
parent 879ac0caea
commit 2792414395
30 changed files with 2218 additions and 79 deletions


@@ -213,14 +213,24 @@ async def trigger_scan(
     scan_jobs = []
     triggered_by = f"manual:{current_user.username}"
+    # Import dispatcher for Celery support
+    from app.tasks.dispatcher import task_dispatcher
     for vtype in request.validator_types:
         # Create scan record with pending status via service
         scan = code_quality_service.create_pending_scan(
             db, validator_type=vtype.value, triggered_by=triggered_by
         )
-        # Queue background task
-        background_tasks.add_task(execute_code_quality_scan, scan.id)
+        # Dispatch via task dispatcher (supports Celery or BackgroundTasks)
+        celery_task_id = task_dispatcher.dispatch_code_quality_scan(
+            background_tasks=background_tasks,
+            scan_id=scan.id,
+        )
+        # Store Celery task ID if using Celery
+        if celery_task_id:
+            scan.celery_task_id = celery_task_id
         scan_jobs.append(
             ScanJobResponse(