orion/app/tasks/celery_tasks/base.py
Samir Boulahtit 2792414395 feat: add Celery/Redis task queue with feature flag support
Migrate background tasks from FastAPI BackgroundTasks to Celery with Redis
for persistent task queuing, retries, and scheduled jobs.

Key changes:
- Add Celery configuration with Redis broker/backend
- Create task dispatcher with USE_CELERY feature flag for gradual rollout
- Add Celery task wrappers for all background operations:
  - Marketplace imports
  - Letzshop historical imports
  - Product exports
  - Code quality scans
  - Test runs
  - Subscription scheduled tasks (via Celery Beat)
- Add celery_task_id column to job tables for Flower integration
- Add Flower dashboard link to admin background tasks page
- Update docker-compose.yml with worker, beat, and flower services
- Add Makefile targets: celery-worker, celery-beat, celery-dev, flower
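The changes above can be illustrated with a minimal configuration sketch. All names here (`celery_app`, the broker/backend URLs, and the task path in the beat schedule) are assumptions for illustration, not taken from the repository:

```python
from celery import Celery

# Hypothetical Celery app wired to a Redis broker and result backend,
# matching the broker/backend split described in the commit message.
celery_app = Celery(
    "orion",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)

# Example Celery Beat entry for a scheduled subscription task; the task
# path and interval are placeholders.
celery_app.conf.beat_schedule = {
    "process-subscriptions": {
        "task": "app.tasks.celery_tasks.subscriptions.process_due",
        "schedule": 300.0,  # seconds
    },
}
```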

When USE_CELERY=false (the default), the system falls back to FastAPI BackgroundTasks
for development without a Redis dependency.
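A minimal sketch of that feature-flag dispatch might look like the following. The names `dispatch` and `fallback_fn` are illustrative, not taken from the repository; a real Celery task object would supply `.delay()`:

```python
import os

# Read the feature flag once at import time; defaults to the
# BackgroundTasks fallback path when unset.
USE_CELERY = os.getenv("USE_CELERY", "false").lower() == "true"


def dispatch(celery_task, fallback_fn, *args, **kwargs):
    """Queue via Celery when the flag is on; otherwise run the fallback
    inline (a stand-in for FastAPI's BackgroundTasks.add_task)."""
    if USE_CELERY:
        return celery_task.delay(*args, **kwargs)
    return fallback_fn(*args, **kwargs)
```

With the flag off, callers get the same behavior they had before the migration, which is what makes a gradual rollout safe.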

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-11 17:35:16 +01:00


# app/tasks/celery_tasks/base.py
"""
Base Celery task class with database session management.
Provides a DatabaseTask base class that handles:
- Database session lifecycle (create/close)
- Context manager pattern for session usage
- Proper cleanup on task completion or failure
"""
import logging
from contextlib import contextmanager
from celery import Task
from app.core.database import SessionLocal
logger = logging.getLogger(__name__)
class DatabaseTask(Task):
"""
Base task with database session management.
Usage:
@celery_app.task(bind=True, base=DatabaseTask)
def my_task(self, arg1, arg2):
with self.get_db() as db:
# Use db session
result = db.query(Model).all()
return result
"""
abstract = True
@contextmanager
def get_db(self):
"""
Context manager for database session.
Yields a database session and ensures proper cleanup
on both success and failure.
Yields:
Session: SQLAlchemy database session
Example:
with self.get_db() as db:
vendor = db.query(Vendor).filter(Vendor.id == vendor_id).first()
"""
db = SessionLocal()
try:
yield db
except Exception as e:
logger.error(f"Database error in task {self.name}: {e}")
db.rollback()
raise
finally:
db.close()
def on_failure(self, exc, task_id, args, kwargs, einfo):
"""
Called when task fails.
Logs the failure with task details for debugging.
"""
logger.error(
f"Task {self.name}[{task_id}] failed: {exc}\n"
f"Args: {args}\n"
f"Kwargs: {kwargs}\n"
f"Traceback: {einfo}"
)
def on_success(self, retval, task_id, args, kwargs):
"""
Called when task succeeds.
Logs successful completion with task ID.
"""
logger.info(f"Task {self.name}[{task_id}] completed successfully")
def on_retry(self, exc, task_id, args, kwargs, einfo):
"""
Called when task is being retried.
Logs retry attempt with reason.
"""
logger.warning(
f"Task {self.name}[{task_id}] retrying due to: {exc}\n"
f"Retry count: {self.request.retries}"
)