feat: Add MCP server stubs, development docs, and Docker updates

- Add MCP server skeleton implementations for all 7 planned servers (llm-gateway, knowledge-base, git, issues, filesystem, code-analysis, cicd)
- Add comprehensive DEVELOPMENT.md with setup and usage instructions
- Add BACKLOG.md with detailed phase planning
- Update docker-compose.dev.yml with Redis and Celery workers
- Update CLAUDE.md with Syndarix-specific context

Addresses issues #16, #20, #21

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
CLAUDE.md: 90 lines changed
@@ -53,7 +53,95 @@ docs/
```

### Current Phase

**Architecture Spikes** - Validating key decisions before implementation.

**Backlog Population** - Creating detailed issues for Phase 0-1 implementation.

---

## Development Workflow & Standards

**CRITICAL: These rules are mandatory for all development work.**

### 1. Issue-Driven Development

**Every piece of work MUST have an issue in the Gitea tracker first.**

- Issue tracker: https://gitea.pragmazest.com/cardosofelipe/syndarix/issues
- Create detailed, well-scoped issues before starting work
- Structure issues to enable parallel work by multiple agents
- Reference issues in commits and PRs

### 2. Git Hygiene

**Branch naming convention:** `feature/123-description`

- Every issue gets its own feature branch
- No direct commits to `main` or `dev`
- Keep branches focused and small
- Delete branches after merge

**Workflow:**

```
main (production-ready)
  └── dev (integration branch)
        └── feature/123-description (issue branch)
```

### 3. Testing Requirements

**All code must be tested. No exceptions.**

- **TDD preferred**: Write tests first when possible
- **Test after**: If not TDD, write tests immediately after the testable code
- **Coverage types**: Unit, integration, functional, E2E as appropriate
- **Minimum coverage**: Aim for >90% on new code

### 4. Code Review Process

**Before merging any feature branch, code must pass multi-agent review:**

| Check | Description |
|-------|-------------|
| Bug hunting | Logic errors, edge cases, race conditions |
| Linting | `ruff check` passes with no errors |
| Typing | `mypy` passes with no errors |
| Formatting | Code follows style guidelines |
| Performance | No obvious bottlenecks or N+1 queries |
| Security | No vulnerabilities (OWASP Top 10) |
| Architecture | Follows established patterns and ADRs |

**An issue is NOT done until review passes with flying colors.**

### 5. QA Before Main

**Before merging `dev` into `main`:**

- Full test suite passes
- Manual QA verification
- Performance baseline check
- Security scan
- Code must be clean, functional, bug-free, well-architected, and secure

### 6. Implementation Plan Updates

- Keep `docs/architecture/IMPLEMENTATION_ROADMAP.md` updated
- Mark completed items as work progresses
- Add new items discovered during implementation

### 7. UI/UX Design Approval

**Frontend tasks involving UI/UX require design approval:**

1. **Design Issue**: Create an issue with the `design` label
2. **Prototype**: Build an interactive React prototype (navigable demo)
3. **Review**: User inspects and provides feedback
4. **Approval**: User approves before implementation begins
5. **Implementation**: Follow the approved design, respecting the design system

**Design constraints:**

- Prototypes: Best effort to match the design system (not required)
- Production code: MUST follow `frontend/docs/design-system/` strictly

---

### Key Extensions to Add (from PragmaStack base)

- Celery + Redis for agent job queue
docker-compose.dev.yml

@@ -1,6 +1,6 @@
services:
  db:
-    image: postgres:17-alpine
+    image: pgvector/pgvector:pg17
    volumes:
      - postgres_data_dev:/var/lib/postgresql/data/
    environment:
@@ -17,6 +17,21 @@ services:
    networks:
      - app-network

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis_data_dev:/data
    command: redis-server --appendonly yes
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 5s
      retries: 5
    networks:
      - app-network

  backend:
    build:
      context: ./backend
@@ -37,13 +52,108 @@ services:
      - ENVIRONMENT=development
      - DEBUG=true
      - BACKEND_CORS_ORIGINS=${BACKEND_CORS_ORIGINS}
      - REDIS_URL=redis://redis:6379/0
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_healthy
    networks:
      - app-network
    command: ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]

  # Celery workers for background task processing (per ADR-003)
  celery-agent:
    build:
      context: ./backend
      dockerfile: Dockerfile
      target: development
    volumes:
      - ./backend:/app
      - /app/.venv
    env_file:
      - .env
    environment:
      - DATABASE_URL=${DATABASE_URL}
      - REDIS_URL=redis://redis:6379/0
      - CELERY_QUEUE=agent
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_healthy
    networks:
      - app-network
    command: ["celery", "-A", "app.celery_app", "worker", "-Q", "agent", "-l", "info", "-c", "4"]

  celery-git:
    build:
      context: ./backend
      dockerfile: Dockerfile
      target: development
    volumes:
      - ./backend:/app
      - /app/.venv
    env_file:
      - .env
    environment:
      - DATABASE_URL=${DATABASE_URL}
      - REDIS_URL=redis://redis:6379/0
      - CELERY_QUEUE=git
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_healthy
    networks:
      - app-network
    command: ["celery", "-A", "app.celery_app", "worker", "-Q", "git", "-l", "info", "-c", "2"]

  celery-sync:
    build:
      context: ./backend
      dockerfile: Dockerfile
      target: development
    volumes:
      - ./backend:/app
      - /app/.venv
    env_file:
      - .env
    environment:
      - DATABASE_URL=${DATABASE_URL}
      - REDIS_URL=redis://redis:6379/0
      - CELERY_QUEUE=sync
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_healthy
    networks:
      - app-network
    command: ["celery", "-A", "app.celery_app", "worker", "-Q", "sync", "-l", "info", "-c", "2"]

  celery-beat:
    build:
      context: ./backend
      dockerfile: Dockerfile
      target: development
    volumes:
      - ./backend:/app
      - /app/.venv
    env_file:
      - .env
    environment:
      - DATABASE_URL=${DATABASE_URL}
      - REDIS_URL=redis://redis:6379/0
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_healthy
    networks:
      - app-network
    command: ["celery", "-A", "app.celery_app", "beat", "-l", "info"]

  frontend:
    build:
      context: ./frontend
@@ -68,6 +178,7 @@ services:

volumes:
  postgres_data_dev:
  redis_data_dev:
  frontend_dev_modules:
  frontend_dev_next:
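All four worker services load the same Celery app (`app.celery_app`) and differ only in the `-Q` queue they consume. As a rough illustration of how that module could wire the queues together (the task module names and routing rules here are assumptions, not the committed code):

```python
# backend/app/celery_app.py (hypothetical sketch; task names and routes are assumptions)
import os

from celery import Celery

celery_app = Celery(
    "syndarix",
    broker=os.getenv("REDIS_URL", "redis://localhost:6379/0"),
    backend=os.getenv("REDIS_URL", "redis://localhost:6379/0"),
)

# Route each task family to the dedicated queue its compose worker listens on.
celery_app.conf.task_routes = {
    "app.tasks.agent.*": {"queue": "agent"},  # consumed by celery-agent (-Q agent)
    "app.tasks.git.*": {"queue": "git"},      # consumed by celery-git (-Q git)
    "app.tasks.sync.*": {"queue": "sync"},    # consumed by celery-sync (-Q sync)
}
```

With routing declared once on the app, `celery-beat` only needs to enqueue tasks on a schedule; each worker picks up work from its own queue.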
docs/BACKLOG.md: 1656 lines (new file; diff suppressed because it is too large)
docs/DEVELOPMENT.md: 377 lines (new file)
@@ -0,0 +1,377 @@
# Syndarix Development Environment

This guide covers setting up and running the Syndarix development environment.

## Prerequisites

- Docker & Docker Compose v2+
- Python 3.12+
- Node.js 20+
- uv (Python package manager)
- Git

## Quick Start

```bash
# Clone the repository
git clone https://gitea.pragmazest.com/cardosofelipe/syndarix.git
cd syndarix

# Copy environment template
cp .env.example .env

# Start all services
docker-compose -f docker-compose.dev.yml up -d

# View logs
docker-compose logs -f
```

## Architecture Overview

```
┌─────────────────────────────────────────────────────────────────────┐
│                    Docker Compose Development                        │
├─────────────────────────────────────────────────────────────────────┤
│                                                                      │
│  ┌──────────┐   ┌──────────┐   ┌──────────┐   ┌──────────┐          │
│  │PostgreSQL│   │  Redis   │   │ Backend  │   │ Frontend │          │
│  │(pgvector)│   │          │   │ (FastAPI)│   │ (Next.js)│          │
│  │  :5432   │   │  :6379   │   │  :8000   │   │  :3000   │          │
│  └──────────┘   └──────────┘   └──────────┘   └──────────┘          │
│                                                                      │
│  ┌────────────────────────────────────────────────────────────────┐ │
│  │                        Celery Workers                          │ │
│  │  ┌─────────────┐   ┌─────────────┐   ┌─────────────┐           │ │
│  │  │ celery-agent│   │ celery-git  │   │ celery-sync │           │ │
│  │  │ (4 workers) │   │ (2 workers) │   │ (2 workers) │           │ │
│  │  └─────────────┘   └─────────────┘   └─────────────┘           │ │
│  │  ┌─────────────┐                                               │ │
│  │  │ celery-beat │   (Scheduler)                                 │ │
│  │  └─────────────┘                                               │ │
│  └────────────────────────────────────────────────────────────────┘ │
│                                                                      │
└─────────────────────────────────────────────────────────────────────┘
```

## Services

### Core Services

| Service | Port | Description |
|---------|------|-------------|
| db | 5432 | PostgreSQL 17 with pgvector extension |
| redis | 6379 | Redis 7 for cache, pub/sub, Celery broker |
| backend | 8000 | FastAPI backend |
| frontend | 3000 | Next.js frontend |

### Celery Workers

| Worker | Queue | Concurrency | Purpose |
|--------|-------|-------------|---------|
| celery-agent | agent | 4 | Agent execution tasks |
| celery-git | git | 2 | Git operations |
| celery-sync | sync | 2 | Issue synchronization |
| celery-beat | - | 1 | Periodic task scheduler |

## Environment Variables

Copy `.env.example` to `.env` and configure:

```bash
# Database
POSTGRES_USER=syndarix
POSTGRES_PASSWORD=your_secure_password
POSTGRES_DB=syndarix
DATABASE_URL=postgresql://syndarix:your_secure_password@db:5432/syndarix

# Redis
REDIS_URL=redis://redis:6379/0

# Security
SECRET_KEY=your_32_character_secret_key_here

# Frontend
NEXT_PUBLIC_API_URL=http://localhost:8000

# CORS
BACKEND_CORS_ORIGINS=["http://localhost:3000"]
```
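The backend consumes these values through `app.core.config.settings` (used in the Debugging section below). A rough pydantic-settings shape for orientation; the actual template's `Settings` class and any defaults beyond the variables listed above are assumptions:

```python
# app/core/config.py (illustrative shape only; the real Settings class may differ)
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    PROJECT_NAME: str = "Syndarix"  # assumed default
    DATABASE_URL: str               # required, read from .env
    REDIS_URL: str = "redis://localhost:6379/0"
    SECRET_KEY: str


settings = Settings()
```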
## Development Commands

### Backend

```bash
# Enter backend directory
cd backend

# Install dependencies
uv sync

# Run tests
IS_TEST=True uv run pytest

# Run with coverage
IS_TEST=True uv run pytest --cov=app --cov-report=html

# Run linting
uv run ruff check .

# Run type checking
uv run mypy app

# Generate migration
python migrate.py auto "description"

# Apply migrations
python migrate.py upgrade
```

### Frontend

```bash
# Enter frontend directory
cd frontend

# Install dependencies
npm install

# Run development server
npm run dev

# Run tests
npm test

# Run E2E tests
npm run test:e2e

# Generate API client from backend OpenAPI spec
npm run generate:api

# Type check
npm run type-check
```

### Docker Compose

```bash
# Start all services
docker-compose -f docker-compose.dev.yml up -d

# Start specific services
docker-compose -f docker-compose.dev.yml up -d db redis backend

# View logs
docker-compose -f docker-compose.dev.yml logs -f backend

# Rebuild after Dockerfile changes
docker-compose -f docker-compose.dev.yml build --no-cache backend

# Stop all services
docker-compose -f docker-compose.dev.yml down

# Stop and remove volumes (clean slate)
docker-compose -f docker-compose.dev.yml down -v
```

### Celery

```bash
# View Celery worker status
docker-compose exec celery-agent celery -A app.celery_app inspect active

# View scheduled tasks
docker-compose exec celery-beat celery -A app.celery_app inspect scheduled

# Purge all queues (caution!)
docker-compose exec celery-agent celery -A app.celery_app purge
```

## Database Setup

### Enable pgvector Extension

The pgvector extension is automatically available with the `pgvector/pgvector:pg17` image.

To enable it in your database:

```sql
CREATE EXTENSION IF NOT EXISTS vector;
```

This is typically done in an Alembic migration.
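A minimal migration for that step might look like the following (the revision identifiers are placeholders, not real revisions from this repo):

```python
"""Enable the pgvector extension (illustrative migration; revision IDs are placeholders)."""
from alembic import op

revision = "0002_enable_pgvector"
down_revision = "0001_initial"
branch_labels = None
depends_on = None


def upgrade() -> None:
    # Must run before any table defines a VECTOR column.
    op.execute("CREATE EXTENSION IF NOT EXISTS vector")


def downgrade() -> None:
    op.execute("DROP EXTENSION IF EXISTS vector")
```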
### Migrations

```bash
# Check current migration status
python migrate.py current

# Generate new migration
python migrate.py auto "Add agent tables"

# Apply migrations
python migrate.py upgrade

# Rollback one migration
python migrate.py downgrade -1
```

## MCP Servers (Phase 2+)

MCP servers are located in `mcp-servers/`. Each server is a FastMCP application.

| Server | Priority | Status |
|--------|----------|--------|
| llm-gateway | 1 | Skeleton |
| knowledge-base | 2 | Skeleton |
| git | 3 | Skeleton |
| issues | 4 | Skeleton |
| filesystem | 5 | Skeleton |
| code-analysis | 6 | Skeleton |
| cicd | 7 | Skeleton |

To run an MCP server locally:

```bash
cd mcp-servers/llm-gateway
uv sync
uv run python server.py
```

## Testing

### Backend Tests

```bash
# Unit tests (uses SQLite in-memory)
IS_TEST=True uv run pytest

# E2E tests (requires Docker)
make test-e2e

# Run specific test file
IS_TEST=True uv run pytest tests/api/test_auth.py

# Run with verbose output
IS_TEST=True uv run pytest -v
```

### Frontend Tests

```bash
# Unit tests
npm test

# E2E tests (Playwright)
npm run test:e2e

# E2E with UI
npm run test:e2e:ui

# E2E debug mode
npm run test:e2e:debug
```

## Debugging

### Backend

```bash
# View backend logs
docker-compose logs -f backend

# Access container shell
docker-compose exec backend bash

# Run Python REPL with app context
docker-compose exec backend python
>>> from app.core.config import settings
>>> print(settings.PROJECT_NAME)
```

### Database

```bash
# Connect to PostgreSQL
docker-compose exec db psql -U syndarix -d syndarix

# List tables
\dt

# Describe table
\d users
```

### Redis

```bash
# Connect to Redis CLI
docker-compose exec redis redis-cli

# List all keys
KEYS *

# Check pub/sub channels
PUBSUB CHANNELS *
```

## Common Issues

### "Port already in use"

Stop conflicting services or change ports in `docker-compose.dev.yml`.

### "Database connection refused"

Wait for the PostgreSQL healthcheck to pass:

```bash
docker-compose logs db | grep "ready to accept connections"
```

### "Import error" for new dependencies

Rebuild the container:

```bash
docker-compose build --no-cache backend
```

### Migrations out of sync

```bash
# Check current state
python migrate.py current

# If needed, stamp current revision
python migrate.py stamp head
```

## IDE Setup

### VS Code

Recommended extensions:

- Python
- Pylance
- Ruff
- ESLint
- Prettier
- Docker
- GitLens

### PyCharm

1. Set the Python interpreter to uv's `.venv`
2. Enable Ruff integration
3. Configure pytest as the test runner

## Next Steps

1. Set up your `.env` file with appropriate values
2. Start the development environment: `docker-compose -f docker-compose.dev.yml up -d`
3. Run migrations: `cd backend && python migrate.py upgrade`
4. Access the frontend at http://localhost:3000
5. Access the API docs at http://localhost:8000/docs

For architecture details, see [ARCHITECTURE.md](./architecture/ARCHITECTURE.md).
mcp-servers/README.md: 70 lines (new file)
@@ -0,0 +1,70 @@
# Syndarix MCP Servers

Model Context Protocol (MCP) servers providing tool access for Syndarix agents.

Per [ADR-005](../docs/adrs/ADR-005-mcp-integration.md), all tools require explicit project scoping.

## Server Overview

| Server | Priority | Purpose | Phase |
|--------|----------|---------|-------|
| llm-gateway | 1 | LLM routing with failover and cost tracking | Phase 2 |
| knowledge-base | 2 | RAG with pgvector for semantic search | Phase 2 |
| git | 3 | Git operations (clone, commit, push, PR) | Phase 2 |
| issues | 4 | Issue tracker sync (Gitea, GitHub, GitLab) | Phase 2 |
| filesystem | 5 | Sandboxed file operations | Phase 5 |
| code-analysis | 6 | AST parsing, linting, type checking | Phase 5 |
| cicd | 7 | CI/CD pipeline management | Phase 5 |

## Architecture

Each MCP server is a FastMCP application that:

1. Exposes tools via the Model Context Protocol
2. Requires `project_id` for all operations (explicit scoping)
3. Uses Redis for pub/sub communication with agents
4. Logs all operations to PostgreSQL for audit

```
┌─────────────────────────────────────────────────────────────┐
│                        Agent Runner                         │
│                             │                               │
│             ┌───────────────┼───────────────┐               │
│             ▼               ▼               ▼               │
│      ┌──────────┐    ┌──────────┐    ┌──────────┐           │
│      │  LLM GW  │    │   Git    │    │  Issues  │  ...      │
│      │   MCP    │    │   MCP    │    │   MCP    │ (7 total) │
│      └──────────┘    └──────────┘    └──────────┘           │
└─────────────────────────────────────────────────────────────┘
```

## Running Locally

Each MCP server runs as a separate Docker container. See `docker-compose.dev.yml` for configuration.

```bash
# Start all MCP servers (Phase 2+)
docker-compose -f docker-compose.dev.yml up -d llm-gateway knowledge-base git issues

# View logs
docker-compose logs -f llm-gateway
```

## Development

Each server follows the FastMCP pattern:

```python
from fastmcp import FastMCP

mcp = FastMCP("server-name")


@mcp.tool()
def my_tool(project_id: str, arg: str) -> dict:
    """Tool with required project scoping."""
    # 1. Validate project access
    # 2. Execute the operation
    # 3. Log for audit
    return {"project_id": project_id}
```

See individual server READMEs for specific tool documentation.
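The Redis pub/sub side of that pattern (point 3 in the architecture list) might look roughly like this inside a tool; the channel naming scheme and event shape are assumptions, not part of the committed skeletons:

```python
# Hypothetical pub/sub helper; channel name and event shape are assumptions.
import json
import os

import redis.asyncio as redis

redis_client = redis.from_url(os.getenv("REDIS_URL", "redis://localhost:6379/0"))


async def publish_tool_event(project_id: str, tool: str, payload: dict) -> None:
    """Notify listening agents that a tool ran for this project."""
    event = {"tool": tool, "project_id": project_id, "payload": payload}
    await redis_client.publish(f"project:{project_id}:tools", json.dumps(event))
```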
mcp-servers/cicd/pyproject.toml: 23 lines (new file)
@@ -0,0 +1,23 @@
[project]
name = "syndarix-mcp-cicd"
version = "0.1.0"
description = "Syndarix CI/CD MCP Server - Pipeline management"
requires-python = ">=3.12"
dependencies = [
    "fastmcp>=0.1.0",
    "httpx>=0.27.0",
    "redis>=5.0.0",
    "pydantic>=2.0.0",
    "pydantic-settings>=2.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.23.0",
    "ruff>=0.8.0",
]

[tool.ruff]
target-version = "py312"
line-length = 88
mcp-servers/cicd/server.py: 47 lines (new file)
@@ -0,0 +1,47 @@
"""
Syndarix CI/CD MCP Server.

Provides CI/CD pipeline management with:
- Gitea Actions integration
- GitHub Actions integration
- Pipeline status monitoring

Phase 5 implementation.
"""

from fastmcp import FastMCP

mcp = FastMCP(
    "syndarix-cicd",
    description="CI/CD pipeline management",
)


@mcp.tool()
async def get_pipeline_status(project_id: str, run_id: str | None = None) -> dict:
    """Get CI/CD pipeline status."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def trigger_pipeline(project_id: str, workflow: str, ref: str = "main") -> dict:
    """Trigger a CI/CD pipeline."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def list_workflows(project_id: str) -> dict:
    """List available CI/CD workflows."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def get_logs(project_id: str, run_id: str, job: str | None = None) -> dict:
    """Get logs from a pipeline run."""
    return {"status": "not_implemented", "project_id": project_id}


if __name__ == "__main__":
    mcp.run()
mcp-servers/code-analysis/pyproject.toml: 23 lines (new file)
@@ -0,0 +1,23 @@
[project]
name = "syndarix-mcp-code-analysis"
version = "0.1.0"
description = "Syndarix Code Analysis MCP Server - AST parsing, linting, type checking"
requires-python = ">=3.12"
dependencies = [
    "fastmcp>=0.1.0",
    "tree-sitter>=0.21.0",
    "redis>=5.0.0",
    "pydantic>=2.0.0",
    "pydantic-settings>=2.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.23.0",
    "ruff>=0.8.0",
]

[tool.ruff]
target-version = "py312"
line-length = 88
mcp-servers/code-analysis/server.py: 47 lines (new file)
@@ -0,0 +1,47 @@
"""
Syndarix Code Analysis MCP Server.

Provides code analysis with:
- AST parsing via tree-sitter
- Linting integration
- Type checking

Phase 5 implementation.
"""

from fastmcp import FastMCP

mcp = FastMCP(
    "syndarix-code-analysis",
    description="AST parsing, linting, type checking",
)


@mcp.tool()
async def parse_file(project_id: str, path: str) -> dict:
    """Parse a file and return its AST."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def lint_file(project_id: str, path: str) -> dict:
    """Run linting on a file."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def type_check(project_id: str, path: str | None = None) -> dict:
    """Run type checking on file(s)."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def find_references(project_id: str, symbol: str, path: str) -> dict:
    """Find all references to a symbol."""
    return {"status": "not_implemented", "project_id": project_id}


if __name__ == "__main__":
    mcp.run()
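The planned backend for `parse_file` is tree-sitter (per the pyproject dependency above). As a rough stand-in for what the tool could return for Python sources, here is a sketch using the stdlib `ast` module; the helper name and output shape are assumptions, not the planned implementation:

```python
# Illustrative stand-in only; the real server is planned on tree-sitter.
import ast


def parse_python_source(source: str) -> dict:
    """Parse Python source and summarize its top-level definitions."""
    tree = ast.parse(source)
    return {
        "functions": [n.name for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)],
        "classes": [n.name for n in ast.walk(tree) if isinstance(n, ast.ClassDef)],
        "dump": ast.dump(tree, indent=2),  # full AST rendered as text
    }


print(parse_python_source("def hello():\n    return 42\n")["functions"])  # ['hello']
```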
mcp-servers/filesystem/pyproject.toml: 23 lines (new file)
@@ -0,0 +1,23 @@
[project]
name = "syndarix-mcp-filesystem"
version = "0.1.0"
description = "Syndarix Filesystem MCP Server - Sandboxed file operations"
requires-python = ">=3.12"
dependencies = [
    "fastmcp>=0.1.0",
    "aiofiles>=24.0.0",
    "redis>=5.0.0",
    "pydantic>=2.0.0",
    "pydantic-settings>=2.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.23.0",
    "ruff>=0.8.0",
]

[tool.ruff]
target-version = "py312"
line-length = 88
mcp-servers/filesystem/server.py: 47 lines (new file)
@@ -0,0 +1,47 @@
"""
Syndarix Filesystem MCP Server.

Provides sandboxed file operations with:
- Project-scoped file access
- Read/write/delete operations
- Directory listing

Phase 5 implementation.
"""

from fastmcp import FastMCP

mcp = FastMCP(
    "syndarix-filesystem",
    description="Sandboxed file operations",
)


@mcp.tool()
async def read_file(project_id: str, path: str) -> dict:
    """Read a file from the project workspace."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def write_file(project_id: str, path: str, content: str) -> dict:
    """Write content to a file in the project workspace."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def list_directory(project_id: str, path: str = ".") -> dict:
    """List contents of a directory."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def delete_file(project_id: str, path: str) -> dict:
    """Delete a file from the project workspace."""
    return {"status": "not_implemented", "project_id": project_id}


if __name__ == "__main__":
    mcp.run()
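"Sandboxed" here implies every requested path must resolve inside the project workspace before any I/O happens. A minimal guard could look like this; the `/workspaces/<project_id>` layout is an assumption:

```python
# Hypothetical path guard; the /workspaces/<project_id> layout is an assumption.
from pathlib import Path

WORKSPACE_ROOT = Path("/workspaces")


def resolve_sandboxed(project_id: str, path: str) -> Path:
    """Resolve `path` inside the project workspace, rejecting traversal attempts."""
    root = (WORKSPACE_ROOT / project_id).resolve()
    candidate = (root / path).resolve()
    if not candidate.is_relative_to(root):  # catches "../" escapes and absolute paths
        raise PermissionError(f"{path!r} escapes the project workspace")
    return candidate
```

Resolving both paths before comparing them is the important part; a plain string-prefix check would miss symlinks and `..` segments.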
mcp-servers/git/pyproject.toml: 24 lines (new file)
@@ -0,0 +1,24 @@
[project]
name = "syndarix-mcp-git"
version = "0.1.0"
description = "Syndarix Git MCP Server - Git operations and PR management"
requires-python = ">=3.12"
dependencies = [
    "fastmcp>=0.1.0",
    "gitpython>=3.1.0",
    "httpx>=0.27.0",
    "redis>=5.0.0",
    "pydantic>=2.0.0",
    "pydantic-settings>=2.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.23.0",
    "ruff>=0.8.0",
]

[tool.ruff]
target-version = "py312"
line-length = 88
mcp-servers/git/server.py: 66 lines (new file)
@@ -0,0 +1,66 @@
"""
Syndarix Git MCP Server.

Provides Git operations with:
- Repository management (clone, pull, push)
- Branch management
- Commit operations
- PR creation via Gitea/GitHub/GitLab APIs

Per ADR-009: Git Integration.
"""

from fastmcp import FastMCP

mcp = FastMCP(
    "syndarix-git",
    description="Git operations and PR management",
)


@mcp.tool()
async def clone_repo(project_id: str, repo_url: str, branch: str = "main") -> dict:
    """Clone a repository for a project."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def create_branch(project_id: str, branch_name: str, from_ref: str = "main") -> dict:
    """Create a new branch."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def commit(project_id: str, message: str, files: list[str] | None = None) -> dict:
    """Commit changes to the repository."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def push(project_id: str, branch: str, force: bool = False) -> dict:
    """Push changes to remote."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def create_pr(
    project_id: str,
    title: str,
    body: str,
    head: str,
    base: str = "main",
) -> dict:
    """Create a pull request."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def get_diff(project_id: str, ref1: str = "HEAD", ref2: str | None = None) -> dict:
    """Get diff between refs."""
    return {"status": "not_implemented", "project_id": project_id}


if __name__ == "__main__":
    mcp.run()
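Since gitpython is already declared in this server's dependencies, `clone_repo` could eventually wrap something like the following; the workspace destination is an assumption:

```python
# Sketch of a GitPython-backed clone; the destination layout is an assumption.
from pathlib import Path

from git import Repo


def clone_repo_sync(project_id: str, repo_url: str, branch: str = "main") -> dict:
    """Clone `repo_url` into the project's workspace and report the HEAD commit."""
    dest = Path("/workspaces") / project_id / "repo"
    repo = Repo.clone_from(repo_url, dest, branch=branch)
    return {"status": "cloned", "head": repo.head.commit.hexsha, "path": str(dest)}
```

GitPython is blocking, so the async tool would likely hand this off to a thread or the `git` Celery queue rather than call it inline.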
mcp-servers/issues/pyproject.toml: 23 lines (new file)
@@ -0,0 +1,23 @@
[project]
name = "syndarix-mcp-issues"
version = "0.1.0"
description = "Syndarix Issues MCP Server - Issue tracker synchronization"
requires-python = ">=3.12"
dependencies = [
    "fastmcp>=0.1.0",
    "httpx>=0.27.0",
    "redis>=5.0.0",
    "pydantic>=2.0.0",
    "pydantic-settings>=2.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.23.0",
    "ruff>=0.8.0",
]

[tool.ruff]
target-version = "py312"
line-length = 88
mcp-servers/issues/server.py: 76 lines (new file)
@@ -0,0 +1,76 @@
"""
Syndarix Issues MCP Server.

Provides issue tracker operations with:
- Multi-provider support (Gitea, GitHub, GitLab)
- Bi-directional sync
- LWW conflict resolution

Per ADR-011: Issue Synchronization.
"""

from fastmcp import FastMCP

mcp = FastMCP(
    "syndarix-issues",
    description="Issue tracker synchronization",
)


@mcp.tool()
async def list_issues(
    project_id: str,
    state: str = "open",
    labels: list[str] | None = None,
) -> dict:
    """List issues for a project."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def get_issue(project_id: str, issue_id: str) -> dict:
    """Get a specific issue."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def create_issue(
    project_id: str,
    title: str,
    body: str,
    labels: list[str] | None = None,
    assignees: list[str] | None = None,
) -> dict:
    """Create a new issue."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def update_issue(
    project_id: str,
    issue_id: str,
    title: str | None = None,
    body: str | None = None,
    state: str | None = None,
    labels: list[str] | None = None,
) -> dict:
    """Update an existing issue."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def add_comment(project_id: str, issue_id: str, body: str) -> dict:
    """Add a comment to an issue."""
    return {"status": "not_implemented", "project_id": project_id}


@mcp.tool()
async def sync_issues(project_id: str, full: bool = False) -> dict:
    """Trigger issue sync for a project."""
    return {"status": "not_implemented", "project_id": project_id}


if __name__ == "__main__":
    mcp.run()
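The LWW (last-write-wins) conflict resolution named in the docstring reduces to comparing update timestamps between the local copy of an issue and the tracker's copy, keeping the newer one. A minimal sketch of the core rule; the `updated_at` field name is an assumption:

```python
# Minimal LWW merge sketch; the `updated_at` field name is an assumption.
from datetime import datetime


def resolve_lww(local: dict, remote: dict) -> dict:
    """Keep whichever side of a synced issue was written most recently."""
    local_ts = datetime.fromisoformat(local["updated_at"])
    remote_ts = datetime.fromisoformat(remote["updated_at"])
    return remote if remote_ts > local_ts else local
```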
mcp-servers/knowledge-base/pyproject.toml: 24 lines (new file)
@@ -0,0 +1,24 @@
[project]
name = "syndarix-mcp-knowledge-base"
version = "0.1.0"
description = "Syndarix Knowledge Base MCP Server - RAG with pgvector for semantic search"
requires-python = ">=3.12"
dependencies = [
    "fastmcp>=0.1.0",
    "asyncpg>=0.29.0",
    "pgvector>=0.3.0",
    "redis>=5.0.0",
    "pydantic>=2.0.0",
    "pydantic-settings>=2.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.23.0",
    "ruff>=0.8.0",
]

[tool.ruff]
target-version = "py312"
line-length = 88
mcp-servers/knowledge-base/server.py: 162 lines (new file)
@@ -0,0 +1,162 @@
"""
Syndarix Knowledge Base MCP Server.

Provides RAG capabilities with:
- pgvector for semantic search
- Per-project collection isolation
- Hybrid search (vector + keyword)
- Chunking strategies for code, markdown, and text

Per ADR-008: Knowledge Base RAG Architecture.
"""

import os

from fastmcp import FastMCP

# Create MCP server
mcp = FastMCP(
    "syndarix-knowledge-base",
    description="RAG with pgvector for semantic search",
)

# Configuration
DATABASE_URL = os.getenv("DATABASE_URL")
REDIS_URL = os.getenv("REDIS_URL", "redis://localhost:6379/0")


@mcp.tool()
async def search_knowledge(
    project_id: str,
    query: str,
    top_k: int = 10,
    search_type: str = "hybrid",
    filters: dict | None = None,
) -> dict:
    """
    Search the project knowledge base.

    Args:
        project_id: UUID of the project (scopes to project collection)
        query: Search query text
        top_k: Number of results to return
        search_type: Search type (semantic, keyword, hybrid)
        filters: Optional filters (file_type, path_prefix, etc.)

    Returns:
        List of matching documents with scores
    """
    # TODO: Implement pgvector search
    # 1. Generate query embedding via LLM Gateway
    # 2. Search project-scoped collection
    # 3. Apply filters
    # 4. Return results with scores
    return {
        "status": "not_implemented",
        "project_id": project_id,
        "query": query,
    }


@mcp.tool()
async def ingest_document(
    project_id: str,
    content: str,
    source_path: str,
    doc_type: str = "text",
    metadata: dict | None = None,
) -> dict:
    """
    Ingest a document into the knowledge base.

    Args:
        project_id: UUID of the project
        content: Document content
        source_path: Original file path for reference
        doc_type: Document type (code, markdown, text)
        metadata: Additional metadata

    Returns:
        Ingestion result with chunk count
    """
    # TODO: Implement document ingestion
    # 1. Apply chunking strategy based on doc_type
    # 2. Generate embeddings for chunks
    # 3. Store in project collection
    return {
        "status": "not_implemented",
        "project_id": project_id,
        "source_path": source_path,
    }


@mcp.tool()
async def ingest_repository(
    project_id: str,
    repo_path: str,
    include_patterns: list[str] | None = None,
    exclude_patterns: list[str] | None = None,
) -> dict:
    """
    Ingest an entire repository into the knowledge base.

    Args:
        project_id: UUID of the project
        repo_path: Path to the repository
        include_patterns: Glob patterns to include (e.g., ["*.py", "*.md"])
        exclude_patterns: Glob patterns to exclude (e.g., ["node_modules/*"])

    Returns:
        Ingestion summary with file and chunk counts
    """
    # TODO: Implement repository ingestion
    return {
        "status": "not_implemented",
        "project_id": project_id,
        "repo_path": repo_path,
    }


@mcp.tool()
async def delete_document(
    project_id: str,
    source_path: str,
) -> dict:
    """
    Delete a document from the knowledge base.

    Args:
        project_id: UUID of the project
        source_path: Original file path

    Returns:
        Deletion result
    """
    # TODO: Implement document deletion
    return {
        "status": "not_implemented",
        "project_id": project_id,
        "source_path": source_path,
    }


@mcp.tool()
async def get_collection_stats(project_id: str) -> dict:
    """
    Get statistics for a project's knowledge base collection.

    Args:
        project_id: UUID of the project

    Returns:
        Collection statistics (document count, chunk count, etc.)
    """
    # TODO: Implement collection stats
    return {
        "status": "not_implemented",
        "project_id": project_id,
    }


if __name__ == "__main__":
    mcp.run()
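The TODO steps in `search_knowledge` map naturally onto a pgvector distance query once the query embedding exists. A sketch using asyncpg with the pgvector adapter; the `chunks` table and its columns are assumptions, not a schema from this commit:

```python
# Hypothetical pgvector query; the `chunks` table schema is an assumption.
import asyncpg
from pgvector.asyncpg import register_vector


async def vector_search(pool: asyncpg.Pool, project_id: str, query_embedding, top_k: int = 10):
    """Return the top_k chunks nearest to the query embedding, scoped to one project."""
    async with pool.acquire() as conn:
        await register_vector(conn)  # teach asyncpg the VECTOR type
        rows = await conn.fetch(
            """
            SELECT source_path, content, 1 - (embedding <=> $2) AS score
            FROM chunks
            WHERE project_id = $1
            ORDER BY embedding <=> $2
            LIMIT $3
            """,
            project_id,
            query_embedding,
            top_k,
        )
    return [dict(r) for r in rows]
```

`<=>` is pgvector's cosine-distance operator; the `WHERE project_id = $1` clause is what enforces the per-project collection isolation the docstring promises.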
mcp-servers/llm-gateway/pyproject.toml: 23 lines (new file)
@@ -0,0 +1,23 @@
[project]
name = "syndarix-mcp-llm-gateway"
version = "0.1.0"
description = "Syndarix LLM Gateway MCP Server - Unified LLM access with failover and cost tracking"
requires-python = ">=3.12"
dependencies = [
    "fastmcp>=0.1.0",
    "litellm>=1.50.0",
    "redis>=5.0.0",
    "pydantic>=2.0.0",
    "pydantic-settings>=2.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.23.0",
    "ruff>=0.8.0",
]

[tool.ruff]
target-version = "py312"
line-length = 88
mcp-servers/llm-gateway/server.py: 148 lines (new file)
@@ -0,0 +1,148 @@
"""
Syndarix LLM Gateway MCP Server.

Provides unified LLM access with:
- Multi-provider support (Claude, GPT, Gemini, Qwen, DeepSeek)
- Automatic failover chains
- Cost tracking via LiteLLM callbacks
- Model group routing (high-reasoning, code-generation, fast-response, cost-optimized)

Per ADR-004: LLM Provider Abstraction.
"""

import os

from fastmcp import FastMCP

# Create MCP server
mcp = FastMCP(
    "syndarix-llm-gateway",
    description="Unified LLM access with failover and cost tracking",
)

# Configuration
REDIS_URL = os.getenv("REDIS_URL", "redis://localhost:6379/0")
DATABASE_URL = os.getenv("DATABASE_URL")


@mcp.tool()
async def chat_completion(
    project_id: str,
    agent_id: str,
    messages: list[dict],
    model_group: str = "high-reasoning",
    max_tokens: int = 4096,
    temperature: float = 0.7,
) -> dict:
    """
    Generate a chat completion using the specified model group.

    Args:
        project_id: UUID of the project (required for cost attribution)
        agent_id: UUID of the agent instance making the request
        messages: List of message dicts with 'role' and 'content'
        model_group: Model routing group (high-reasoning, code-generation, fast-response, cost-optimized, self-hosted)
        max_tokens: Maximum tokens to generate
        temperature: Sampling temperature (0.0-2.0)

    Returns:
        Completion response with content and usage statistics
    """
    # TODO: Implement with LiteLLM
    # 1. Map model_group to primary model + fallbacks
    # 2. Check project budget before making request
    # 3. Make completion request with failover
    # 4. Log usage via callback
    # 5. Return response
    return {
        "status": "not_implemented",
        "project_id": project_id,
        "agent_id": agent_id,
        "model_group": model_group,
    }


@mcp.tool()
async def get_embeddings(
    project_id: str,
    texts: list[str],
    model: str = "text-embedding-3-small",
) -> dict:
    """
    Generate embeddings for the given texts.

    Args:
        project_id: UUID of the project (required for cost attribution)
        texts: List of texts to embed
        model: Embedding model to use

    Returns:
        List of embedding vectors
    """
    # TODO: Implement with LiteLLM embeddings
    return {
        "status": "not_implemented",
        "project_id": project_id,
        "text_count": len(texts),
    }


@mcp.tool()
async def get_budget_status(project_id: str) -> dict:
    """
    Get current budget status for a project.

    Args:
        project_id: UUID of the project

    Returns:
        Budget status with usage, limits, and percentage
    """
    # TODO: Implement budget check from Redis
    return {
        "status": "not_implemented",
        "project_id": project_id,
    }


@mcp.tool()
async def list_available_models() -> dict:
    """
    List all available models and their capabilities.

    Returns:
        Dictionary of model groups and available models
    """
    return {
        "model_groups": {
            "high-reasoning": {
                "primary": "claude-opus-4-5",
                "fallbacks": ["gpt-5.1-codex-max", "gemini-3-pro"],
                "description": "Complex analysis, architecture decisions",
            },
            "code-generation": {
                "primary": "gpt-5.1-codex-max",
                "fallbacks": ["claude-opus-4-5", "deepseek-v3.2"],
                "description": "Code writing and refactoring",
            },
            "fast-response": {
                "primary": "gemini-3-flash",
                "fallbacks": ["qwen3-235b", "deepseek-v3.2"],
                "description": "Quick tasks, simple queries",
            },
            "cost-optimized": {
                "primary": "qwen3-235b",
                "fallbacks": ["deepseek-v3.2"],
                "description": "High-volume, non-critical tasks",
            },
            "self-hosted": {
                "primary": "deepseek-v3.2",
                "fallbacks": ["qwen3-235b"],
                "description": "Privacy-sensitive, air-gapped",
            },
        }
    }


if __name__ == "__main__":
    mcp.run()
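The failover chain described in `chat_completion`'s TODO can be sketched as a loop over a model group's candidates. This illustrative version calls `litellm.completion` directly; real code might use LiteLLM's Router instead, and the error handling here is deliberately simplified:

```python
# Failover sketch; error handling simplified, and real code may use litellm's Router.
import litellm

MODEL_GROUPS = {
    "high-reasoning": ["claude-opus-4-5", "gpt-5.1-codex-max", "gemini-3-pro"],
}


def complete_with_failover(model_group: str, messages: list[dict], **kwargs) -> dict:
    """Try the group's primary model, falling down the chain on provider errors."""
    last_error: Exception | None = None
    for model in MODEL_GROUPS[model_group]:
        try:
            response = litellm.completion(model=model, messages=messages, **kwargs)
            return {"model": model, "content": response.choices[0].message.content}
        except Exception as exc:  # provider outage, rate limit, etc.
            last_error = exc
    raise RuntimeError(f"All models in {model_group!r} failed") from last_error
```

Budget checks and usage logging (TODO steps 2 and 4) would wrap this loop: consult Redis before the first attempt, then record the usage reported on the successful response.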