Add SQLAlchemy models for the Agent Memory System:

- WorkingMemory: Key-value storage with TTL for active sessions
- Episode: Experiential memories from task executions
- Fact: Semantic knowledge triples with confidence scores
- Procedure: Learned skills and procedures with success tracking
- MemoryConsolidationLog: Tracks consolidation jobs between memory tiers

Create enums for the memory system:

- ScopeType: global, project, agent_type, agent_instance, session
- EpisodeOutcome: success, failure, partial
- ConsolidationType: working_to_episodic, episodic_to_semantic, etc.
- ConsolidationStatus: pending, running, completed, failed

Add Alembic migration (0005) for all memory tables with:

- Foreign key relationships to projects, agent_instances, agent_types
- Comprehensive indexes for query patterns
- Unique constraints for key lookups and triple uniqueness
- Vector embedding column placeholders (Text fallback until pgvector is enabled)

Fix timezone-naive datetime.now() in types.py TaskState (review feedback).

Includes 30 unit tests for models and enums.

Closes #88

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
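For orientation, here is a minimal sketch of the WorkingMemory shape described above, showing the Text fallback for the embedding column and a timezone-aware default timestamp (mirroring the datetime.now() fix). The table name, column names, and types are assumptions for illustration, not the PR's actual definitions:

# working_memory_sketch.py — illustrative only; the PR's real model may differ.
import uuid
from datetime import datetime, timezone

from sqlalchemy import DateTime, String, Text, UniqueConstraint
from sqlalchemy.dialects.postgresql import JSONB, UUID
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class WorkingMemory(Base):
    """Key-value storage with TTL for active sessions (column set assumed)."""

    __tablename__ = "working_memory"
    __table_args__ = (
        # One value per (session, key) pair, supporting fast key lookups.
        UniqueConstraint("session_id", "key", name="uq_working_memory_session_key"),
    )

    id: Mapped[uuid.UUID] = mapped_column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    session_id: Mapped[str] = mapped_column(String(255), index=True)
    key: Mapped[str] = mapped_column(String(255))
    value: Mapped[dict] = mapped_column(JSONB)
    # Placeholder embedding column: plain Text until pgvector is enabled.
    embedding: Mapped[str | None] = mapped_column(Text, nullable=True)
    # TTL: rows past expires_at are eligible for cleanup or consolidation.
    expires_at: Mapped[datetime] = mapped_column(DateTime(timezone=True))
    # Timezone-aware default, rather than a naive datetime.now().
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), default=lambda: datetime.now(timezone.utc)
    )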
# app/models/memory/enums.py
"""
Enums for Memory System database models.

These enums define the database-level constraints for memory types
and scoping levels.
"""

from enum import Enum as PyEnum


class ScopeType(str, PyEnum):
    """
    Memory scope levels matching the memory service types.

    GLOBAL: System-wide memories accessible by all
    PROJECT: Project-scoped memories
    AGENT_TYPE: Type-specific memories (shared by instances of same type)
    AGENT_INSTANCE: Instance-specific memories
    SESSION: Session-scoped ephemeral memories
    """

    GLOBAL = "global"
    PROJECT = "project"
    AGENT_TYPE = "agent_type"
    AGENT_INSTANCE = "agent_instance"
    SESSION = "session"


class EpisodeOutcome(str, PyEnum):
    """
    Outcome of an episode (task execution).

    SUCCESS: Task completed successfully
    FAILURE: Task failed
    PARTIAL: Task partially completed
    """

    SUCCESS = "success"
    FAILURE = "failure"
    PARTIAL = "partial"


class ConsolidationType(str, PyEnum):
    """
    Types of memory consolidation operations.

    WORKING_TO_EPISODIC: Transfer session state to episodic
    EPISODIC_TO_SEMANTIC: Extract facts from episodes
    EPISODIC_TO_PROCEDURAL: Extract procedures from episodes
    PRUNING: Remove low-value memories
    """

    WORKING_TO_EPISODIC = "working_to_episodic"
    EPISODIC_TO_SEMANTIC = "episodic_to_semantic"
    EPISODIC_TO_PROCEDURAL = "episodic_to_procedural"
    PRUNING = "pruning"


class ConsolidationStatus(str, PyEnum):
    """
    Status of a consolidation job.

    PENDING: Job is queued
    RUNNING: Job is currently executing
    COMPLETED: Job finished successfully
    FAILED: Job failed with errors
    """

    PENDING = "pending"
    RUNNING = "running"
    COMPLETED = "completed"
    FAILED = "failed"
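A short illustration of how these str-backed enums behave and how they might be wired into a model column. The MemoryConsolidationLog column subset shown here is assumed for the example and is not the PR's actual schema:

# enum_usage_sketch.py — illustrative only; column names and subset are assumed.
from sqlalchemy import Enum as SAEnum
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

from app.models.memory.enums import ConsolidationStatus, ConsolidationType

# The str mixin makes members compare equal to their raw values, which keeps
# JSON payloads and raw-SQL filters straightforward:
assert ConsolidationType.PRUNING == "pruning"
assert ConsolidationStatus("running") is ConsolidationStatus.RUNNING


class Base(DeclarativeBase):
    pass


class MemoryConsolidationLog(Base):
    __tablename__ = "memory_consolidation_log_sketch"

    id: Mapped[int] = mapped_column(primary_key=True)
    # values_callable persists the lowercase .value strings rather than member names.
    consolidation_type: Mapped[ConsolidationType] = mapped_column(
        SAEnum(ConsolidationType, values_callable=lambda e: [m.value for m in e])
    )
    status: Mapped[ConsolidationStatus] = mapped_column(
        SAEnum(ConsolidationStatus, values_callable=lambda e: [m.value for m in e]),
        default=ConsolidationStatus.PENDING,
    )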