syndarix/backend/app/celery_app.py
Felipe Cardoso 742ce4c9c8 fix: Comprehensive validation and bug fixes
Infrastructure:
- Add Redis and Celery workers to all docker-compose files
- Fix celery migration race condition in entrypoint.sh
- Add healthchecks and resource limits to dev compose
- Update .env.template with Redis/Celery variables

Backend Models & Schemas:
- Rename Sprint.completed_points to velocity (per requirements)
- Add AgentInstance.name as required field
- Rename Issue external tracker fields for consistency
- Add IssueSource and TrackerType enums
- Add Project.default_tracker_type field

Backend Fixes:
- Add Celery retry configuration with exponential backoff
- Remove unused sequence counter from EventBus
- Add mypy overrides for test dependencies
- Fix test file using wrong schema (UserUpdate -> dict)

Frontend Fixes:
- Fix memory leak in useProjectEvents (proper cleanup)
- Fix race condition with stale closure in reconnection
- Sync TokenWithUser type with regenerated API client
- Fix expires_in null handling in useAuth
- Clean up unused imports in prototype pages
- Add ESLint relaxed rules for prototype files

CI/CD:
- Add E2E testing stage with Testcontainers
- Add security scanning with Trivy and pip-audit
- Add dependency caching for faster builds

Tests:
- Update all tests to use renamed fields (velocity, name, etc.)
- Fix 14 schema test failures
- All 1500 tests pass with 91% coverage

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-30 10:35:30 +01:00

117 lines
3.9 KiB
Python

# app/celery_app.py
"""
Celery application configuration for Syndarix.

This module configures the Celery app for background task processing:
- Agent execution tasks (LLM calls, tool execution)
- Git operations (clone, commit, push, PR creation)
- Issue synchronization with external trackers
- Workflow state management
- Cost tracking and budget monitoring

Architecture:
- Redis as message broker and result backend
- Queue routing for task isolation
- JSON serialization for cross-language compatibility
- Beat scheduler for periodic tasks
"""
from celery import Celery

from app.core.config import settings

# Create Celery application instance
celery_app = Celery(
    "syndarix",
    broker=settings.celery_broker_url,
    backend=settings.celery_result_backend,
)

# Define task queues with their own exchanges and routing keys
TASK_QUEUES = {
    "agent": {"exchange": "agent", "routing_key": "agent"},
    "git": {"exchange": "git", "routing_key": "git"},
    "sync": {"exchange": "sync", "routing_key": "sync"},
    "default": {"exchange": "default", "routing_key": "default"},
}

# Configure Celery
celery_app.conf.update(
    # Serialization
    task_serializer="json",
    accept_content=["json"],
    result_serializer="json",
    # Timezone
    timezone="UTC",
    enable_utc=True,
    # Task imports for auto-discovery
    imports=("app.tasks",),
    # Default queue
    task_default_queue="default",
    # Task queues configuration
    task_queues=TASK_QUEUES,
    # Task routing - route tasks to appropriate queues
    task_routes={
        "app.tasks.agent.*": {"queue": "agent"},
        "app.tasks.git.*": {"queue": "git"},
        "app.tasks.sync.*": {"queue": "sync"},
        "app.tasks.*": {"queue": "default"},
    },
    # Time limits per ADR-003
    task_soft_time_limit=300,  # 5 minutes soft limit
    task_time_limit=600,  # 10 minutes hard limit
    # Result expiration - 24 hours
    result_expires=86400,
    # Broker connection retry
    broker_connection_retry_on_startup=True,
    # Retry configuration per ADR-003 (built-in retry with backoff)
    task_autoretry_for=(Exception,),  # Retry on all exceptions
    task_retry_kwargs={"max_retries": 3, "countdown": 5},  # Initial 5s delay
    task_retry_backoff=True,  # Enable exponential backoff
    task_retry_backoff_max=600,  # Max 10 minutes between retries
    task_retry_jitter=True,  # Add jitter to prevent thundering herd
    # Beat schedule for periodic tasks
    beat_schedule={
        # Cost aggregation every hour per ADR-012
        "aggregate-daily-costs": {
            "task": "app.tasks.cost.aggregate_daily_costs",
            "schedule": 3600.0,  # 1 hour in seconds
        },
        # Reset daily budget counters every 24 hours (interval schedule, not midnight-anchored)
        "reset-daily-budget-counters": {
            "task": "app.tasks.cost.reset_daily_budget_counters",
            "schedule": 86400.0,  # 24 hours in seconds
        },
        # Check for stale workflows every 5 minutes
        "recover-stale-workflows": {
            "task": "app.tasks.workflow.recover_stale_workflows",
            "schedule": 300.0,  # 5 minutes in seconds
        },
        # Incremental issue sync every minute per ADR-011
        "sync-issues-incremental": {
            "task": "app.tasks.sync.sync_issues_incremental",
            "schedule": 60.0,  # 1 minute in seconds
        },
        # Full issue reconciliation every 15 minutes per ADR-011
        "sync-issues-full": {
            "task": "app.tasks.sync.sync_issues_full",
            "schedule": 900.0,  # 15 minutes in seconds
        },
    },
    # Task execution settings
    task_acks_late=True,  # Acknowledge tasks after execution
    task_reject_on_worker_lost=True,  # Reject tasks if worker dies
    worker_prefetch_multiplier=1,  # Fair task distribution
)

# Auto-discover tasks from task modules
celery_app.autodiscover_tasks(
    [
        "app.tasks.agent",
        "app.tasks.git",
        "app.tasks.sync",
        "app.tasks.workflow",
        "app.tasks.cost",
    ]
)
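
For context on how this configuration is consumed, the sketch below shows what a task module under app.tasks might look like. It is an illustration only: the module path matches the autodiscovery list above, but the task name, arguments, and body are hypothetical. Celery documents autoretry_for, retry_backoff, retry_backoff_max, and retry_jitter as per-task options (set on the @task decorator or a shared base Task class), so repeating them on the task as shown is the reliable way to get the exponential-backoff behaviour the configuration aims for. Because the module lives under app.tasks.agent, the task_routes patterns above send its tasks to the "agent" queue.

# app/tasks/agent.py -- illustrative sketch, not the real task implementation.
# The task name, signature, and body are hypothetical.
from celery.utils.log import get_task_logger

from app.celery_app import celery_app

logger = get_task_logger(__name__)


@celery_app.task(
    bind=True,
    # Per-task retry options mirroring the intent of the app-level settings:
    # retry any exception up to 3 times, with exponential backoff and jitter.
    autoretry_for=(Exception,),
    retry_kwargs={"max_retries": 3, "countdown": 5},
    retry_backoff=True,
    retry_backoff_max=600,
    retry_jitter=True,
)
def execute_agent_step(self, agent_instance_id: str, step_payload: dict) -> dict:
    """Run a single agent step; routed to the 'agent' queue by task_routes."""
    logger.info(
        "Executing step for agent %s (attempt %s)",
        agent_instance_id,
        self.request.retries + 1,
    )
    # ... LLM call / tool execution would happen here ...
    return {"agent_instance_id": agent_instance_id, "status": "completed"}

A worker and the beat scheduler would then be started against this app, for example with `celery -A app.celery_app:celery_app worker -Q agent,git,sync,default --loglevel=info` and `celery -A app.celery_app:celery_app beat` (queue names per TASK_QUEUES above).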