feat(context): Phase 7 - Main Engine & Integration #85

Closed
opened 2026-01-04 00:52:21 +00:00 by cardosofelipe · 0 comments

Overview

Implement the main ContextEngine class and integrate with MCPClientManager.

Parent Issue

  • #61: Context Management Engine

Implementation Tasks

1. Create engine.py

  • Create ContextEngine class
  • Initialize all components (calculator, scorer, ranker, compressor, formatter, cache)
  • Implement assemble_context() main method
  • Implement _fetch_knowledge() for RAG integration
  • Implement _convert_history() for conversation context
class ContextEngine:
    """Main context management engine."""

    def __init__(
        self,
        mcp_manager: MCPClientManager,
        redis: Redis | None,
        settings: ContextSettings | None = None
    ):
        self.mcp = mcp_manager
        self.settings = settings or ContextSettings()

        # Initialize components
        self.calculator = TokenCalculator(mcp_manager)
        self.scorer = CompositeScorer()
        self.ranker = ContextRanker()
        self.compressor = Compressor()
        self.formatter = ContextFormatter()
        self.cache = ContextCache(redis, self.settings) if redis else None

        self.pipeline = ContextPipeline(...)

    async def assemble_context(
        self,
        project_id: str,
        agent_id: str,
        query: str,
        model: str,
        max_tokens: int,
        system_prompt: str | None = None,
        conversation_history: list[dict] | None = None,
        knowledge_query: str | None = None,
        task_description: str | None = None
    ) -> AssembledContext:
        """Assemble optimized context for an LLM request."""
        ...
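The `_convert_history()` helper above is left unimplemented; one plausible shape is to map each chat message onto a scoreable context item, tracking recency so the ranker can prefer recent turns. This is a sketch under assumptions: `ContextItem` and its fields are illustrative stand-ins, not the actual Phase 1 models.

```python
from dataclasses import dataclass

@dataclass
class ContextItem:
    """Illustrative stand-in for the real context-item model."""
    source: str   # e.g. "history:user", "history:assistant"
    content: str
    recency: int  # 0 = most recent message

def convert_history(history: list[dict]) -> list[ContextItem]:
    """Convert chat-style {"role", "content"} dicts into context items."""
    n = len(history)
    return [
        ContextItem(
            source=f"history:{msg.get('role', 'user')}",
            content=msg.get("content", ""),
            recency=n - 1 - i,  # later messages get lower (fresher) recency
        )
        for i, msg in enumerate(history)
    ]
```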

2. Create FastAPI Dependency

  • Create get_context_engine() dependency
  • Handle Redis optionality
  • Implement singleton pattern
# backend/app/api/dependencies/context.py
import asyncio

from fastapi import Depends

_engine: ContextEngine | None = None
_lock = asyncio.Lock()

async def get_context_engine(
    mcp: MCPClientManager = Depends(get_mcp_manager),
    redis: Redis | None = Depends(get_redis)
) -> ContextEngine:
    global _engine
    if _engine is None:
        async with _lock:
            if _engine is None:
                _engine = ContextEngine(mcp, redis)
    return _engine

3. Update __init__.py

  • Export ContextEngine as main public API
  • Export AssembledContext
  • Export ContextSettings
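The export surface in step 3 might look like the following sketch; the submodule names other than `engine` are assumed, not confirmed by this issue:

```python
# backend/app/services/context/__init__.py (sketch; import paths assumed)
from .engine import ContextEngine
from .models import AssembledContext, ContextSettings  # hypothetical module name

__all__ = ["ContextEngine", "AssembledContext", "ContextSettings"]
```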

4. Integration with MCPClientManager

  • Add context_engine property to MCPClientManager
  • Ensure proper initialization order
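One way to satisfy both bullets in step 4 is a lazily constructed property: the engine is created on first access, after the manager itself is fully initialized, which sidesteps ordering problems in the constructor. The classes below are simplified stand-ins for `ContextEngine` and `MCPClientManager`:

```python
class Engine:
    """Stand-in for ContextEngine (hypothetical)."""
    def __init__(self, manager: "Manager"):
        self.manager = manager

class Manager:
    """Stand-in for MCPClientManager with a lazy context_engine property."""
    def __init__(self):
        self._context_engine: Engine | None = None

    @property
    def context_engine(self) -> Engine:
        # Constructed on first access, so the manager is already initialized
        # and repeated accesses return the same instance.
        if self._context_engine is None:
            self._context_engine = Engine(self)
        return self._context_engine
```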

Files to Create/Modify

backend/app/services/context/
├── engine.py                  # NEW
└── __init__.py                # UPDATE

backend/app/api/dependencies/
└── context.py                 # NEW

Acceptance Criteria

  • ContextEngine assembles context correctly
  • Knowledge Base integration works (RAG)
  • Conversation history is converted properly
  • FastAPI dependency provides singleton
  • Caching is enabled when Redis is available
  • Integration tests pass
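The first criterion ("assembles context correctly") can be exercised with a toy harness before the real components land. This sketch uses a whitespace word count as a crude stand-in for `TokenCalculator` and a greedy keep-until-budget loop as a stand-in for the assembly pipeline; every name here is illustrative:

```python
def estimate_tokens(text: str) -> int:
    """Crude stand-in for TokenCalculator: ~1 token per word."""
    return len(text.split())

def assemble(snippets: list[str], max_tokens: int) -> list[str]:
    """Greedy assembly: keep snippets in priority order until budget is spent."""
    out: list[str] = []
    used = 0
    for s in snippets:
        cost = estimate_tokens(s)
        if used + cost > max_tokens:
            break  # budget exhausted; drop remaining lower-priority snippets
        out.append(s)
        used += cost
    return out
```

An integration test would assert that the assembled output never exceeds `max_tokens` and that higher-priority snippets survive truncation first.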

Dependencies

  • #79 (Phase 1 - Foundation)
  • #80 (Phase 2 - Token Budget)
  • #81 (Phase 3 - Scoring & Ranking)
  • #82 (Phase 4 - Assembly Pipeline)
  • #83 (Phase 5 - Model Adapters)
  • #84 (Phase 6 - Caching Layer)

Labels

phase-2, context, backend
