8 Commits

Author SHA1 Message Date
Felipe Cardoso
4c8f81368c fix(docker): add NEXT_PUBLIC_API_BASE_URL to frontend containers
When running in Docker, the frontend needs to use 'http://backend:8000'
as the backend URL for Next.js rewrites. This env var is set to use
the Docker service name for proper container-to-container communication.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 09:23:50 +01:00
Felipe Cardoso
efbe91ce14 fix(frontend): use configurable backend URL in Next.js rewrite
The rewrite was using 'http://backend:8000' which only resolves inside
Docker network. When running Next.js locally (npm run dev), the hostname
'backend' doesn't exist, causing ENOTFOUND errors.

Now uses NEXT_PUBLIC_API_BASE_URL env var with fallback to localhost:8000
for local development. In Docker, set NEXT_PUBLIC_API_BASE_URL=http://backend:8000.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 09:22:44 +01:00
Felipe Cardoso
5d646779c9 fix(frontend): preserve /api prefix in Next.js rewrite
The rewrite was incorrectly configured:
- Before: /api/:path* -> http://backend:8000/:path* (strips /api)
- After: /api/:path* -> http://backend:8000/api/:path* (preserves /api)

This was causing requests to /api/v1/agent-types to be sent to
http://backend:8000/v1/agent-types instead of the correct path.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 03:12:08 +01:00
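The before/after mapping in this commit message can be modeled as plain string manipulation. This is an illustrative sketch only — the real rewrite is declarative config in next.config.ts, and `rewrite_destination` is a hypothetical helper invented for this example:

```python
import os

def rewrite_destination(path: str, backend_url: str, preserve_prefix: bool) -> str:
    """Illustrative model of the Next.js rewrite discussed above."""
    suffix = path[len("/api/"):]
    prefix = "/api/" if preserve_prefix else "/"
    return f"{backend_url}{prefix}{suffix}"

# The backend URL is taken from the env var, falling back to localhost.
backend = os.environ.get("NEXT_PUBLIC_API_BASE_URL", "http://localhost:8000")

# Before the fix: the /api prefix was stripped, hitting the wrong backend path.
assert rewrite_destination("/api/v1/agent-types", "http://backend:8000", False) == \
    "http://backend:8000/v1/agent-types"
# After the fix: the /api prefix is preserved.
assert rewrite_destination("/api/v1/agent-types", "http://backend:8000", True) == \
    "http://backend:8000/api/v1/agent-types"
```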
Felipe Cardoso
5a4d93df26 feat(dashboard): use real API data and add 3 more demo projects
Dashboard changes:
- Update useDashboard hook to fetch real projects from API
- Calculate stats (active projects, agents, issues) from real data
- Keep pending approvals as mock (no backend endpoint yet)

Demo data additions:
- API Gateway Modernization project (active, complex)
- Customer Analytics Dashboard project (completed)
- DevOps Pipeline Automation project (active, complex)
- Added sprints, agent instances, and issues for each new project

Total demo data: 6 projects, 14 agents, 22 issues

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 03:10:10 +01:00
Felipe Cardoso
7ef217be39 feat(demo): tie all demo projects to admin user
- Update demo_data.json to use "__admin__" as owner_email for all projects
- Add admin user lookup in load_demo_data() with special "__admin__" key
- Remove notification_email from project settings (not a valid field)

This ensures demo projects are visible to the admin user when logged in.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 03:00:07 +01:00
Felipe Cardoso
20159c5865 fix(knowledge-base): ensure pgvector extension before pool creation
register_vector() requires the vector type to exist in PostgreSQL before
it can register the type codec. Move CREATE EXTENSION to a separate
_ensure_pgvector_extension() method that runs before pool creation.

This fixes the "unknown type: public.vector" error on fresh databases.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 02:55:02 +01:00
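The ordering constraint this commit fixes can be shown with a dependency-free toy sketch — the functions here are stand-ins, not asyncpg/pgvector APIs: the codec registration step fails unless the extension-creation step has already run.

```python
# Hypothetical sketch of the ordering fix: register_vector() (modeled by
# register_vector_codec below) needs the vector type to already exist.
events = []

def create_extension():
    # Models _ensure_pgvector_extension(): CREATE EXTENSION IF NOT EXISTS vector
    events.append("extension")

def register_vector_codec():
    # Models the pool's per-connection init; fails on a fresh database
    # if the extension has not been created yet.
    if "extension" not in events:
        raise RuntimeError("unknown type: public.vector")
    events.append("codec")

create_extension()        # runs first, before pool creation
register_vector_codec()   # per-connection init can now succeed
assert events == ["extension", "codec"]
```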
Felipe Cardoso
f9a72fcb34 fix(models): use enum values instead of names for PostgreSQL
Add values_callable to all enum columns so SQLAlchemy serializes using
the enum's .value (lowercase) instead of .name (uppercase). PostgreSQL
enum types defined in migrations use lowercase values.

Fixes: invalid input value for enum autonomy_level: "MILESTONE"

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 02:53:45 +01:00
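The `.name` vs `.value` distinction this commit relies on can be shown with a stdlib-only sketch (the `AutonomyLevel` members mirror the values used in the demo data):

```python
from enum import Enum

class AutonomyLevel(Enum):
    MILESTONE = "milestone"
    FULL_CONTROL = "full_control"

# Serializing by .name yields the uppercase member name, which PostgreSQL
# rejects because the migration defined the enum with lowercase values.
assert AutonomyLevel.MILESTONE.name == "MILESTONE"
# Serializing by .value (what values_callable selects) matches the migration.
assert AutonomyLevel.MILESTONE.value == "milestone"

# values_callable receives the enum class and returns the strings that
# should be sent to the database:
values = [e.value for e in AutonomyLevel]
assert values == ["milestone", "full_control"]
```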
Felipe Cardoso
fcb0a5f86a fix(models): add explicit enum names to match migration types
SQLAlchemy's Enum() auto-generates type names from Python class names
(e.g., AutonomyLevel -> autonomylevel), but migrations defined them
with underscores (e.g., autonomy_level). This mismatch caused:

  "type 'autonomylevel' does not exist"

Added explicit name parameters to all enum columns to match the
migration-defined type names:
- autonomy_level, project_status, project_complexity, client_mode
- agent_status, sprint_status
- issue_type, issue_status, issue_priority, sync_status

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 02:48:10 +01:00
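The name mismatch described above reduces to a lowercased class name vs. an underscored migration name. A minimal sketch (the derivation here is simplified; SQLAlchemy's actual logic lives inside `sqlalchemy.Enum`):

```python
# What Enum(AutonomyLevel) auto-generated as the PostgreSQL type name:
auto_generated = "AutonomyLevel".lower()
# What the migration actually created:
migration_defined = "autonomy_level"

assert auto_generated == "autonomylevel"
# The mismatch produced: type "autonomylevel" does not exist
assert auto_generated != migration_defined

# The fix: pass the migration's type name explicitly, e.g.
# Enum(AutonomyLevel, name="autonomy_level", ...)
```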
11 changed files with 466 additions and 142 deletions

View File

@@ -267,6 +267,15 @@ async def load_demo_data(session: AsyncSession) -> None:
        await session.flush()
# Add admin user to map with special "__admin__" key
# This allows demo data to reference the admin user as owner
superuser_email = settings.FIRST_SUPERUSER_EMAIL or "admin@example.com"
admin_user = await user_crud.get_by_email(session, email=superuser_email)
if admin_user:
user_map["__admin__"] = admin_user
user_map[str(admin_user.email)] = admin_user
logger.debug(f"Added admin user to map: {admin_user.email}")
    # ========================
    # 3. Load Agent Types Map (for FK resolution)
    # ========================
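The `"__admin__"` sentinel resolution shown in the diff above can be sketched with plain dicts — `admin` and `resolve_owner` here are illustrative stand-ins, not the project's actual CRUD objects:

```python
# Stand-in for the admin User row looked up via FIRST_SUPERUSER_EMAIL.
admin = {"email": "admin@example.com"}

# load_demo_data() registers the admin under both the sentinel key and
# the real email before creating projects.
user_map = {}
user_map["__admin__"] = admin
user_map[admin["email"]] = admin

def resolve_owner(owner_email: str):
    # Demo projects use "__admin__" instead of a literal email, so every
    # project resolves to the logged-in admin user.
    return user_map.get(owner_email)

assert resolve_owner("__admin__") is admin
assert resolve_owner("admin@example.com") is admin
assert resolve_owner("carol@globex.com") is None
```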

View File

@@ -62,7 +62,11 @@ class AgentInstance(Base, UUIDMixin, TimestampMixin):
     # Status tracking
     status: Column[AgentStatus] = Column(
-        Enum(AgentStatus),
+        Enum(
+            AgentStatus,
+            name="agent_status",
+            values_callable=lambda x: [e.value for e in x],
+        ),
         default=AgentStatus.IDLE,
         nullable=False,
         index=True,

View File

@@ -59,7 +59,9 @@ class Issue(Base, UUIDMixin, TimestampMixin):
     # Issue type (Epic, Story, Task, Bug)
     type: Column[IssueType] = Column(
-        Enum(IssueType),
+        Enum(
+            IssueType, name="issue_type", values_callable=lambda x: [e.value for e in x]
+        ),
         default=IssueType.TASK,
         nullable=False,
         index=True,
@@ -78,14 +80,22 @@
     # Status and priority
     status: Column[IssueStatus] = Column(
-        Enum(IssueStatus),
+        Enum(
+            IssueStatus,
+            name="issue_status",
+            values_callable=lambda x: [e.value for e in x],
+        ),
         default=IssueStatus.OPEN,
         nullable=False,
         index=True,
     )
     priority: Column[IssuePriority] = Column(
-        Enum(IssuePriority),
+        Enum(
+            IssuePriority,
+            name="issue_priority",
+            values_callable=lambda x: [e.value for e in x],
+        ),
         default=IssuePriority.MEDIUM,
         nullable=False,
         index=True,
@@ -132,7 +142,11 @@
     # Sync status with external tracker
     sync_status: Column[SyncStatus] = Column(
-        Enum(SyncStatus),
+        Enum(
+            SyncStatus,
+            name="sync_status",
+            values_callable=lambda x: [e.value for e in x],
+        ),
         default=SyncStatus.SYNCED,
         nullable=False,
         # Note: Index defined in __table_args__ as ix_issues_sync_status

View File

@@ -35,28 +35,44 @@ class Project(Base, UUIDMixin, TimestampMixin):
     description = Column(Text, nullable=True)
     autonomy_level: Column[AutonomyLevel] = Column(
-        Enum(AutonomyLevel),
+        Enum(
+            AutonomyLevel,
+            name="autonomy_level",
+            values_callable=lambda x: [e.value for e in x],
+        ),
         default=AutonomyLevel.MILESTONE,
         nullable=False,
         index=True,
     )
     status: Column[ProjectStatus] = Column(
-        Enum(ProjectStatus),
+        Enum(
+            ProjectStatus,
+            name="project_status",
+            values_callable=lambda x: [e.value for e in x],
+        ),
         default=ProjectStatus.ACTIVE,
         nullable=False,
         index=True,
     )
     complexity: Column[ProjectComplexity] = Column(
-        Enum(ProjectComplexity),
+        Enum(
+            ProjectComplexity,
+            name="project_complexity",
+            values_callable=lambda x: [e.value for e in x],
+        ),
         default=ProjectComplexity.MEDIUM,
         nullable=False,
         index=True,
     )
     client_mode: Column[ClientMode] = Column(
-        Enum(ClientMode),
+        Enum(
+            ClientMode,
+            name="client_mode",
+            values_callable=lambda x: [e.value for e in x],
+        ),
         default=ClientMode.AUTO,
         nullable=False,
         index=True,

View File

@@ -57,7 +57,11 @@ class Sprint(Base, UUIDMixin, TimestampMixin):
     # Status
     status: Column[SprintStatus] = Column(
-        Enum(SprintStatus),
+        Enum(
+            SprintStatus,
+            name="sprint_status",
+            values_callable=lambda x: [e.value for e in x],
+        ),
         default=SprintStatus.PLANNED,
         nullable=False,
         index=True,

View File

@@ -368,21 +368,20 @@
"name": "E-Commerce Platform Redesign", "name": "E-Commerce Platform Redesign",
"slug": "ecommerce-redesign", "slug": "ecommerce-redesign",
"description": "Complete redesign of the e-commerce platform with modern UX, improved checkout flow, and mobile-first approach.", "description": "Complete redesign of the e-commerce platform with modern UX, improved checkout flow, and mobile-first approach.",
"owner_email": "demo@example.com", "owner_email": "__admin__",
"autonomy_level": "milestone", "autonomy_level": "milestone",
"status": "active", "status": "active",
"complexity": "complex", "complexity": "complex",
"client_mode": "technical", "client_mode": "technical",
"settings": { "settings": {
"mcp_servers": ["gitea", "knowledge-base"], "mcp_servers": ["gitea", "knowledge-base"]
"notification_email": "demo@example.com"
} }
}, },
{ {
"name": "Mobile Banking App", "name": "Mobile Banking App",
"slug": "mobile-banking", "slug": "mobile-banking",
"description": "Secure mobile banking application with biometric authentication, transaction history, and real-time notifications.", "description": "Secure mobile banking application with biometric authentication, transaction history, and real-time notifications.",
"owner_email": "alice@acme.com", "owner_email": "__admin__",
"autonomy_level": "full_control", "autonomy_level": "full_control",
"status": "active", "status": "active",
"complexity": "complex", "complexity": "complex",
@@ -396,7 +395,7 @@
"name": "Internal HR Portal", "name": "Internal HR Portal",
"slug": "hr-portal", "slug": "hr-portal",
"description": "Employee self-service portal for leave requests, performance reviews, and document management.", "description": "Employee self-service portal for leave requests, performance reviews, and document management.",
"owner_email": "carol@globex.com", "owner_email": "__admin__",
"autonomy_level": "autonomous", "autonomy_level": "autonomous",
"status": "active", "status": "active",
"complexity": "medium", "complexity": "medium",
@@ -404,6 +403,45 @@
"settings": { "settings": {
"mcp_servers": ["gitea", "knowledge-base"] "mcp_servers": ["gitea", "knowledge-base"]
} }
},
{
"name": "API Gateway Modernization",
"slug": "api-gateway",
"description": "Migrate legacy REST API gateway to modern GraphQL-based architecture with improved caching and rate limiting.",
"owner_email": "__admin__",
"autonomy_level": "milestone",
"status": "active",
"complexity": "complex",
"client_mode": "technical",
"settings": {
"mcp_servers": ["gitea", "knowledge-base"]
}
},
{
"name": "Customer Analytics Dashboard",
"slug": "analytics-dashboard",
"description": "Real-time analytics dashboard for customer behavior insights, cohort analysis, and predictive modeling.",
"owner_email": "__admin__",
"autonomy_level": "autonomous",
"status": "completed",
"complexity": "medium",
"client_mode": "auto",
"settings": {
"mcp_servers": ["gitea", "knowledge-base"]
}
},
{
"name": "DevOps Pipeline Automation",
"slug": "devops-automation",
"description": "Automate CI/CD pipelines with AI-assisted deployments, rollback detection, and infrastructure as code.",
"owner_email": "__admin__",
"autonomy_level": "full_control",
"status": "active",
"complexity": "complex",
"client_mode": "technical",
"settings": {
"mcp_servers": ["gitea", "knowledge-base"]
}
}
],
"sprints": [
@@ -446,6 +484,56 @@
"end_date": "2026-01-20", "end_date": "2026-01-20",
"status": "active", "status": "active",
"planned_points": 18 "planned_points": 18
},
{
"project_slug": "api-gateway",
"name": "Sprint 1: GraphQL Schema",
"number": 1,
"goal": "Define GraphQL schema and implement core resolvers for existing REST endpoints.",
"start_date": "2025-12-23",
"end_date": "2026-01-06",
"status": "completed",
"planned_points": 21
},
{
"project_slug": "api-gateway",
"name": "Sprint 2: Caching Layer",
"number": 2,
"goal": "Implement Redis-based caching layer and query batching.",
"start_date": "2026-01-06",
"end_date": "2026-01-20",
"status": "active",
"planned_points": 26
},
{
"project_slug": "analytics-dashboard",
"name": "Sprint 1: Data Pipeline",
"number": 1,
"goal": "Set up data ingestion pipeline and real-time event processing.",
"start_date": "2025-11-15",
"end_date": "2025-11-29",
"status": "completed",
"planned_points": 18
},
{
"project_slug": "analytics-dashboard",
"name": "Sprint 2: Dashboard UI",
"number": 2,
"goal": "Build interactive dashboard with charts and filtering capabilities.",
"start_date": "2025-11-29",
"end_date": "2025-12-13",
"status": "completed",
"planned_points": 21
},
{
"project_slug": "devops-automation",
"name": "Sprint 1: Pipeline Templates",
"number": 1,
"goal": "Create reusable CI/CD pipeline templates for common deployment patterns.",
"start_date": "2026-01-06",
"end_date": "2026-01-20",
"status": "active",
"planned_points": 24
}
],
"agent_instances": [
@@ -501,6 +589,40 @@
"name": "Atlas", "name": "Atlas",
"status": "working", "status": "working",
"current_task": "Building employee dashboard API" "current_task": "Building employee dashboard API"
},
{
"project_slug": "api-gateway",
"agent_type_slug": "solutions-architect",
"name": "Orion",
"status": "working",
"current_task": "Designing caching strategy for GraphQL queries"
},
{
"project_slug": "api-gateway",
"agent_type_slug": "senior-engineer",
"name": "Cleo",
"status": "working",
"current_task": "Implementing Redis cache invalidation"
},
{
"project_slug": "devops-automation",
"agent_type_slug": "devops-engineer",
"name": "Volt",
"status": "working",
"current_task": "Creating Terraform modules for AWS ECS"
},
{
"project_slug": "devops-automation",
"agent_type_slug": "senior-engineer",
"name": "Sage",
"status": "idle"
},
{
"project_slug": "devops-automation",
"agent_type_slug": "qa-engineer",
"name": "Echo",
"status": "waiting",
"current_task": "Waiting for pipeline templates to test"
}
],
"issues": [
@@ -639,6 +761,119 @@
"priority": "medium", "priority": "medium",
"labels": ["backend", "infrastructure", "storage"], "labels": ["backend", "infrastructure", "storage"],
"story_points": 5 "story_points": 5
},
{
"project_slug": "api-gateway",
"sprint_number": 2,
"type": "story",
"title": "Implement Redis caching layer",
"body": "As an API consumer, I want responses to be cached for improved performance.\n\n## Requirements\n- Cache GraphQL query results\n- Configurable TTL per query type\n- Cache invalidation on mutations\n- Cache hit/miss metrics",
"status": "in_progress",
"priority": "critical",
"labels": ["backend", "performance", "redis"],
"story_points": 8,
"assigned_agent_name": "Cleo"
},
{
"project_slug": "api-gateway",
"sprint_number": 2,
"type": "task",
"title": "Set up query batching and deduplication",
"body": "Implement DataLoader pattern for:\n- Batching multiple queries into single database calls\n- Deduplicating identical queries within request scope\n- N+1 query prevention",
"status": "open",
"priority": "high",
"labels": ["backend", "performance", "graphql"],
"story_points": 5
},
{
"project_slug": "api-gateway",
"sprint_number": 2,
"type": "task",
"title": "Implement rate limiting middleware",
"body": "Add rate limiting to prevent API abuse:\n- Per-user rate limits\n- Per-IP fallback for anonymous requests\n- Sliding window algorithm\n- Custom limits per operation type",
"status": "open",
"priority": "high",
"labels": ["backend", "security", "middleware"],
"story_points": 5,
"assigned_agent_name": "Orion"
},
{
"project_slug": "api-gateway",
"sprint_number": 2,
"type": "bug",
"title": "Fix N+1 query in user resolver",
"body": "The user resolver is making separate database calls for each user's organization.\n\n## Steps to Reproduce\n1. Query users with organization field\n2. Check database logs\n3. Observe N+1 queries",
"status": "open",
"priority": "high",
"labels": ["bug", "performance", "graphql"],
"story_points": 3
},
{
"project_slug": "analytics-dashboard",
"sprint_number": 2,
"type": "story",
"title": "Build cohort analysis charts",
"body": "As a product manager, I want to analyze user cohorts over time.\n\n## Features\n- Weekly/monthly cohort grouping\n- Retention curve visualization\n- Cohort comparison view",
"status": "closed",
"priority": "high",
"labels": ["frontend", "charts", "analytics"],
"story_points": 8
},
{
"project_slug": "analytics-dashboard",
"sprint_number": 2,
"type": "task",
"title": "Implement real-time event streaming",
"body": "Set up WebSocket connection for live event updates:\n- Event type filtering\n- Buffering for high-volume periods\n- Reconnection handling",
"status": "closed",
"priority": "high",
"labels": ["backend", "websocket", "realtime"],
"story_points": 5
},
{
"project_slug": "devops-automation",
"sprint_number": 1,
"type": "epic",
"title": "CI/CD Pipeline Templates",
"body": "Create reusable pipeline templates for common deployment patterns.\n\n## Templates Needed\n- Node.js applications\n- Python applications\n- Docker-based deployments\n- Kubernetes deployments",
"status": "in_progress",
"priority": "critical",
"labels": ["infrastructure", "cicd", "templates"],
"story_points": null
},
{
"project_slug": "devops-automation",
"sprint_number": 1,
"type": "story",
"title": "Create Terraform modules for AWS ECS",
"body": "As a DevOps engineer, I want Terraform modules for ECS deployments.\n\n## Modules\n- ECS cluster configuration\n- Service and task definitions\n- Load balancer integration\n- Auto-scaling policies",
"status": "in_progress",
"priority": "high",
"labels": ["terraform", "aws", "ecs"],
"story_points": 8,
"assigned_agent_name": "Volt"
},
{
"project_slug": "devops-automation",
"sprint_number": 1,
"type": "task",
"title": "Set up Gitea Actions runners",
"body": "Configure self-hosted Gitea Actions runners:\n- Docker-in-Docker support\n- Caching for npm/pip\n- Secrets management\n- Resource limits",
"status": "open",
"priority": "high",
"labels": ["infrastructure", "gitea", "cicd"],
"story_points": 5
},
{
"project_slug": "devops-automation",
"sprint_number": 1,
"type": "task",
"title": "Implement rollback detection system",
"body": "AI-assisted rollback detection:\n- Monitor deployment health metrics\n- Automatic rollback triggers\n- Notification system\n- Post-rollback analysis",
"status": "open",
"priority": "medium",
"labels": ["ai", "monitoring", "automation"],
"story_points": 8
}
]
}

View File

@@ -288,6 +288,7 @@ services:
     environment:
       - NODE_ENV=production
       - NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL}
+      - NEXT_PUBLIC_API_BASE_URL=http://backend:8000
     depends_on:
       backend:
         condition: service_healthy

View File

@@ -249,6 +249,7 @@ services:
     environment:
       - NODE_ENV=development
       - NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL}
+      - NEXT_PUBLIC_API_BASE_URL=http://backend:8000
     depends_on:
       backend:
         condition: service_healthy

View File

@@ -74,12 +74,14 @@ const nextConfig: NextConfig = {
     ];
   },
-  // Ensure we can connect to the backend in Docker
+  // Proxy API requests to backend
+  // Use NEXT_PUBLIC_API_BASE_URL for the destination (defaults to localhost for local dev)
   async rewrites() {
+    const backendUrl = process.env.NEXT_PUBLIC_API_BASE_URL || 'http://localhost:8000';
     return [
       {
         source: '/api/:path*',
-        destination: 'http://backend:8000/:path*',
+        destination: `${backendUrl}/api/:path*`,
       },
     ];
   },

View File

@@ -6,13 +6,15 @@
  * - Recent projects
  * - Pending approvals
  *
- * Uses mock data until backend endpoints are available.
+ * Fetches real data from the API.
  *
  * @see Issue #53
  */
 import { useQuery } from '@tanstack/react-query';
-import type { Project, ProjectStatus } from '@/components/projects/types';
+import { listProjects as listProjectsApi } from '@/lib/api/generated';
+import type { ProjectResponse } from '@/lib/api/generated';
+import type { AutonomyLevel, Project, ProjectStatus } from '@/components/projects/types';
 
 // ============================================================================
 // Types
@@ -52,118 +54,70 @@ export interface DashboardData {
 }
 
 // ============================================================================
-// Mock Data
+// Helpers
 // ============================================================================
 
-const mockStats: DashboardStats = {
-  activeProjects: 3,
-  runningAgents: 8,
-  openIssues: 24,
-  pendingApprovals: 2,
-};
-
-const mockProjects: DashboardProject[] = [
-  {
-    id: 'proj-001',
-    name: 'E-Commerce Platform Redesign',
-    description: 'Complete redesign of the e-commerce platform with modern UI/UX',
-    status: 'active' as ProjectStatus,
-    autonomy_level: 'milestone',
-    created_at: '2025-11-15T10:00:00Z',
-    updated_at: '2025-12-30T14:30:00Z',
-    owner_id: 'user-001',
-    progress: 67,
-    openIssues: 12,
-    activeAgents: 4,
-    currentSprint: 'Sprint 3',
-    lastActivity: '2 minutes ago',
-  },
-  {
-    id: 'proj-002',
-    name: 'Mobile Banking App',
-    description: 'Native mobile app for banking services with biometric authentication',
-    status: 'active' as ProjectStatus,
-    autonomy_level: 'autonomous',
-    created_at: '2025-11-20T09:00:00Z',
-    updated_at: '2025-12-30T12:00:00Z',
-    owner_id: 'user-001',
-    progress: 45,
-    openIssues: 8,
-    activeAgents: 5,
-    currentSprint: 'Sprint 2',
-    lastActivity: '15 minutes ago',
-  },
-  {
-    id: 'proj-003',
-    name: 'Internal HR Portal',
-    description: 'Employee self-service portal for HR operations',
-    status: 'paused' as ProjectStatus,
-    autonomy_level: 'full_control',
-    created_at: '2025-10-01T08:00:00Z',
-    updated_at: '2025-12-28T16:00:00Z',
-    owner_id: 'user-001',
-    progress: 23,
-    openIssues: 5,
-    activeAgents: 0,
-    currentSprint: 'Sprint 1',
-    lastActivity: '2 days ago',
-  },
-  {
-    id: 'proj-004',
-    name: 'API Gateway Modernization',
-    description: 'Migrate legacy API gateway to cloud-native architecture',
-    status: 'active' as ProjectStatus,
-    autonomy_level: 'milestone',
-    created_at: '2025-12-01T11:00:00Z',
-    updated_at: '2025-12-30T10:00:00Z',
-    owner_id: 'user-001',
-    progress: 82,
-    openIssues: 3,
-    activeAgents: 2,
-    currentSprint: 'Sprint 4',
-    lastActivity: '1 hour ago',
-  },
-  {
-    id: 'proj-005',
-    name: 'Customer Analytics Dashboard',
-    description: 'Real-time analytics dashboard for customer behavior insights',
-    status: 'completed' as ProjectStatus,
-    autonomy_level: 'autonomous',
-    created_at: '2025-09-01T10:00:00Z',
-    updated_at: '2025-12-15T17:00:00Z',
-    owner_id: 'user-001',
-    progress: 100,
-    openIssues: 0,
-    activeAgents: 0,
-    lastActivity: '2 weeks ago',
-  },
-  {
-    id: 'proj-006',
-    name: 'DevOps Pipeline Automation',
-    description: 'Automate CI/CD pipelines with AI-assisted deployments',
-    status: 'active' as ProjectStatus,
-    autonomy_level: 'milestone',
-    created_at: '2025-12-10T14:00:00Z',
-    updated_at: '2025-12-30T09:00:00Z',
-    owner_id: 'user-001',
-    progress: 35,
-    openIssues: 6,
-    activeAgents: 3,
-    currentSprint: 'Sprint 1',
-    lastActivity: '30 minutes ago',
-  },
-];
+/**
+ * Format a date string as relative time (e.g., "2 minutes ago")
+ */
+function formatRelativeTime(dateStr: string): string {
+  const date = new Date(dateStr);
+  const now = new Date();
+  const diffMs = now.getTime() - date.getTime();
+  const diffMins = Math.floor(diffMs / 60000);
+  const diffHours = Math.floor(diffMins / 60);
+  const diffDays = Math.floor(diffHours / 24);
+  const diffWeeks = Math.floor(diffDays / 7);
+  const diffMonths = Math.floor(diffDays / 30);
+
+  if (diffMins < 1) return 'Just now';
+  if (diffMins < 60) return `${diffMins} minute${diffMins > 1 ? 's' : ''} ago`;
+  if (diffHours < 24) return `${diffHours} hour${diffHours > 1 ? 's' : ''} ago`;
+  if (diffDays < 7) return `${diffDays} day${diffDays > 1 ? 's' : ''} ago`;
+  if (diffWeeks < 4) return `${diffWeeks} week${diffWeeks > 1 ? 's' : ''} ago`;
+  return `${diffMonths} month${diffMonths > 1 ? 's' : ''} ago`;
+}
+
+/**
+ * Maps API ProjectResponse to DashboardProject format
+ */
+function mapToDashboardProject(
+  project: ProjectResponse & Record<string, unknown>
+): DashboardProject {
+  const updatedAt = project.updated_at || project.created_at || new Date().toISOString();
+  const createdAt = project.created_at || new Date().toISOString();
+
+  return {
+    id: project.id,
+    name: project.name,
+    description: project.description || undefined,
+    status: project.status as ProjectStatus,
+    autonomy_level: (project.autonomy_level || 'milestone') as AutonomyLevel,
+    created_at: createdAt,
+    updated_at: updatedAt,
+    owner_id: project.owner_id || 'unknown',
+    progress: (project.progress as number) || 0,
+    openIssues: (project.openIssues as number) || project.issue_count || 0,
+    activeAgents: (project.activeAgents as number) || project.agent_count || 0,
+    currentSprint: project.active_sprint_name || undefined,
+    lastActivity: formatRelativeTime(updatedAt),
+  };
+}
+
+// ============================================================================
+// Mock Data (for pending approvals - no backend endpoint yet)
+// ============================================================================
 
 const mockApprovals: PendingApproval[] = [
   {
     id: 'approval-001',
     type: 'sprint_boundary',
-    title: 'Sprint 3 Completion Review',
-    description: 'Review sprint deliverables and approve transition to Sprint 4',
+    title: 'Sprint 1 Completion Review',
+    description: 'Review sprint deliverables and approve transition to Sprint 2',
     projectId: 'proj-001',
     projectName: 'E-Commerce Platform Redesign',
     requestedBy: 'Product Owner Agent',
-    requestedAt: '2025-12-30T14:00:00Z',
+    requestedAt: new Date().toISOString(),
     priority: 'high',
   },
   {
@@ -171,10 +125,10 @@ const mockApprovals: PendingApproval[] = [
     type: 'architecture_decision',
     title: 'Database Migration Strategy',
     description: 'Approve PostgreSQL to CockroachDB migration plan',
-    projectId: 'proj-004',
-    projectName: 'API Gateway Modernization',
+    projectId: 'proj-002',
+    projectName: 'Mobile Banking App',
     requestedBy: 'Architect Agent',
-    requestedAt: '2025-12-30T10:30:00Z',
+    requestedAt: new Date(Date.now() - 3600000).toISOString(),
     priority: 'medium',
   },
 ];
@@ -192,17 +146,41 @@ export function useDashboard() {
   return useQuery<DashboardData>({
     queryKey: ['dashboard'],
     queryFn: async () => {
-      // Simulate network delay
-      await new Promise((resolve) => setTimeout(resolve, 500));
-
-      // Return mock data
-      // TODO: Replace with actual API call when backend is ready
-      // const response = await apiClient.get('/api/v1/dashboard');
-      // return response.data;
+      // Fetch real projects from API
+      const response = await listProjectsApi({
+        query: {
+          limit: 6,
+        },
+      });
+
+      if (response.error) {
+        throw new Error('Failed to fetch dashboard data');
+      }
+
+      const projects = response.data.data.map((p) =>
+        mapToDashboardProject(p as ProjectResponse & Record<string, unknown>)
+      );
+
+      // Sort by updated_at (most recent first)
+      projects.sort(
+        (a, b) =>
+          new Date(b.updated_at || b.created_at).getTime() -
+          new Date(a.updated_at || a.created_at).getTime()
+      );
+
+      // Calculate stats from real data
+      const activeProjects = projects.filter((p) => p.status === 'active').length;
+      const runningAgents = projects.reduce((sum, p) => sum + p.activeAgents, 0);
+      const openIssues = projects.reduce((sum, p) => sum + p.openIssues, 0);
+
       return {
-        stats: mockStats,
-        recentProjects: mockProjects,
+        stats: {
+          activeProjects,
+          runningAgents,
+          openIssues,
+          pendingApprovals: mockApprovals.length,
+        },
+        recentProjects: projects,
         pendingApprovals: mockApprovals,
       };
     },
@@ -218,8 +196,24 @@ export function useDashboardStats() {
   return useQuery<DashboardStats>({
     queryKey: ['dashboard', 'stats'],
     queryFn: async () => {
-      await new Promise((resolve) => setTimeout(resolve, 300));
-      return mockStats;
+      const response = await listProjectsApi({
+        query: { limit: 100 },
+      });
+
+      if (response.error) {
+        throw new Error('Failed to fetch stats');
+      }
+
+      const projects = response.data.data.map((p) =>
+        mapToDashboardProject(p as ProjectResponse & Record<string, unknown>)
+      );
+
+      return {
+        activeProjects: projects.filter((p) => p.status === 'active').length,
+        runningAgents: projects.reduce((sum, p) => sum + p.activeAgents, 0),
+        openIssues: projects.reduce((sum, p) => sum + p.openIssues, 0),
+        pendingApprovals: mockApprovals.length,
+      };
     },
     staleTime: 30000,
     refetchInterval: 60000,
@@ -235,8 +229,26 @@ export function useRecentProjects(limit: number = 6) {
   return useQuery<DashboardProject[]>({
     queryKey: ['dashboard', 'recentProjects', limit],
     queryFn: async () => {
-      await new Promise((resolve) => setTimeout(resolve, 400));
-      return mockProjects.slice(0, limit);
+      const response = await listProjectsApi({
+        query: { limit },
+      });
+
+      if (response.error) {
+        throw new Error('Failed to fetch recent projects');
+      }
+
+      const projects = response.data.data.map((p) =>
+        mapToDashboardProject(p as ProjectResponse & Record<string, unknown>)
+      );
+
+      // Sort by updated_at (most recent first)
+      projects.sort(
+        (a, b) =>
+          new Date(b.updated_at || b.created_at).getTime() -
+          new Date(a.updated_at || a.created_at).getTime()
+      );
+
+      return projects;
     },
     staleTime: 30000,
   });
@@ -249,7 +261,7 @@ export function usePendingApprovals() {
   return useQuery<PendingApproval[]>({
     queryKey: ['dashboard', 'pendingApprovals'],
     queryFn: async () => {
-      await new Promise((resolve) => setTimeout(resolve, 300));
+      // TODO: Fetch from real API when endpoint exists
       return mockApprovals;
     },
     staleTime: 30000,
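The relative-time bucketing added in this hook can be sanity-checked with a Python port of the same thresholds — this is a sketch for verifying the bucket boundaries, not the shipped code (which is the TypeScript `formatRelativeTime` in the diff above):

```python
from datetime import datetime, timedelta, timezone

def format_relative_time(dt: datetime, now: datetime) -> str:
    """Port of the hook's formatRelativeTime bucketing."""
    mins = int((now - dt).total_seconds() // 60)
    hours = mins // 60
    days = hours // 24
    weeks = days // 7
    months = days // 30
    if mins < 1:
        return "Just now"
    if mins < 60:
        return f"{mins} minute{'s' if mins > 1 else ''} ago"
    if hours < 24:
        return f"{hours} hour{'s' if hours > 1 else ''} ago"
    if days < 7:
        return f"{days} day{'s' if days > 1 else ''} ago"
    if weeks < 4:
        return f"{weeks} week{'s' if weeks > 1 else ''} ago"
    return f"{months} month{'s' if months > 1 else ''} ago"

now = datetime(2026, 1, 6, 12, 0, tzinfo=timezone.utc)
assert format_relative_time(now - timedelta(seconds=30), now) == "Just now"
assert format_relative_time(now - timedelta(minutes=2), now) == "2 minutes ago"
assert format_relative_time(now - timedelta(hours=3), now) == "3 hours ago"
assert format_relative_time(now - timedelta(days=2), now) == "2 days ago"
assert format_relative_time(now - timedelta(days=14), now) == "2 weeks ago"
assert format_relative_time(now - timedelta(days=60), now) == "2 months ago"
```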

View File

@@ -57,6 +57,9 @@ class DatabaseManager:
     async def initialize(self) -> None:
         """Initialize connection pool and create schema."""
         try:
+            # First, create pgvector extension (required before register_vector in pool init)
+            await self._ensure_pgvector_extension()
+
             self._pool = await asyncpg.create_pool(
                 self._settings.database_url,
                 min_size=2,
@@ -66,7 +69,7 @@ class DatabaseManager:
             )
             logger.info("Database pool created successfully")
 
-            # Create schema
+            # Create schema (tables and indexes)
             await self._create_schema()
             logger.info("Database schema initialized")
@@ -77,6 +80,19 @@ class DatabaseManager:
                 cause=e,
             )
 
+    async def _ensure_pgvector_extension(self) -> None:
+        """Ensure pgvector extension exists before pool creation.
+
+        This must run before creating the connection pool because
+        register_vector() in _init_connection requires the extension to exist.
+        """
+        conn = await asyncpg.connect(self._settings.database_url)
+        try:
+            await conn.execute("CREATE EXTENSION IF NOT EXISTS vector")
+            logger.info("pgvector extension ensured")
+        finally:
+            await conn.close()
+
     async def _init_connection(self, conn: asyncpg.Connection) -> None:  # type: ignore[type-arg]
         """Initialize a connection with pgvector support."""
         await register_vector(conn)
@@ -84,8 +100,7 @@ class DatabaseManager:
     async def _create_schema(self) -> None:
         """Create database schema if not exists."""
         async with self.pool.acquire() as conn:
-            # Enable pgvector extension
-            await conn.execute("CREATE EXTENSION IF NOT EXISTS vector")
+            # Note: pgvector extension is created in _ensure_pgvector_extension()
 
             # Create main embeddings table
             await conn.execute("""
@@ -286,7 +301,14 @@ class DatabaseManager:
         try:
             async with self.acquire() as conn, conn.transaction():
                 # Wrap in transaction for all-or-nothing batch semantics
-                for project_id, collection, content, embedding, chunk_type, metadata in embeddings:
+                for (
+                    project_id,
+                    collection,
+                    content,
+                    embedding,
+                    chunk_type,
+                    metadata,
+                ) in embeddings:
                     content_hash = self.compute_content_hash(content)
                     source_path = metadata.get("source_path")
                     start_line = metadata.get("start_line")
@@ -397,7 +419,9 @@ class DatabaseManager:
source_path=row["source_path"], source_path=row["source_path"],
start_line=row["start_line"], start_line=row["start_line"],
end_line=row["end_line"], end_line=row["end_line"],
file_type=FileType(row["file_type"]) if row["file_type"] else None, file_type=FileType(row["file_type"])
if row["file_type"]
else None,
metadata=row["metadata"] or {}, metadata=row["metadata"] or {},
content_hash=row["content_hash"], content_hash=row["content_hash"],
created_at=row["created_at"], created_at=row["created_at"],
@@ -476,7 +500,9 @@ class DatabaseManager:
source_path=row["source_path"], source_path=row["source_path"],
start_line=row["start_line"], start_line=row["start_line"],
end_line=row["end_line"], end_line=row["end_line"],
file_type=FileType(row["file_type"]) if row["file_type"] else None, file_type=FileType(row["file_type"])
if row["file_type"]
else None,
metadata=row["metadata"] or {}, metadata=row["metadata"] or {},
content_hash=row["content_hash"], content_hash=row["content_hash"],
created_at=row["created_at"], created_at=row["created_at"],