fix(agents): properly initialize form with API data defaults

Root cause: The demo data's model_params was missing `top_p`, but the
Zod schema required all three fields (temperature, max_tokens, top_p).
This caused silent validation failures when editing agent types.

Fixes:
1. Add getInitialValues() that ensures all required fields have defaults
2. Handle nested validation errors in handleFormError (e.g., model_params.top_p)
3. Add useEffect to reset form when agentType changes
4. Add console.error logging for debugging validation failures
5. Update demo data to include top_p in all agent types
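Fix 1 can be sketched roughly as follows. This is an illustrative reconstruction, not the committed code: the `ModelParams`/`AgentType` shapes and the specific default values are assumptions based on the demo data above.

```typescript
// Hypothetical sketch of fix 1: merge the API response over safe defaults so
// every field the Zod schema requires is present before the form initializes.
interface ModelParams {
  temperature: number;
  max_tokens: number;
  top_p: number;
}

interface AgentType {
  model_params?: Partial<ModelParams>;
}

// Defaults chosen to match the demo data; the real values may differ.
const MODEL_PARAM_DEFAULTS: ModelParams = {
  temperature: 0.7,
  max_tokens: 4096,
  top_p: 0.95,
};

function getInitialValues(agentType: AgentType): { model_params: ModelParams } {
  return {
    // API values win; anything missing (e.g. top_p) falls back to a default.
    model_params: { ...MODEL_PARAM_DEFAULTS, ...agentType.model_params },
  };
}
```

With this shape, an agent type whose `model_params` lacks `top_p` still produces a schema-complete initial value instead of failing validation silently.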

The form now properly initializes with safe defaults for any missing
fields from the API response, preventing silent validation failures.
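Fix 2 (nested validation errors) can be sketched like this. The `ValidationIssue` shape mirrors Zod's `issue.path`/`issue.message` fields, but the function body and names are hypothetical, not the committed implementation:

```typescript
// Hypothetical sketch of fix 2: flatten nested issue paths such as
// ["model_params", "top_p"] into dotted keys ("model_params.top_p") so each
// error can be attached to the matching form field, and log it for debugging.
interface ValidationIssue {
  path: (string | number)[];
  message: string;
}

function handleFormError(issues: ValidationIssue[]): Record<string, string> {
  const fieldErrors: Record<string, string> = {};
  for (const issue of issues) {
    const key = issue.path.join(".");
    fieldErrors[key] = issue.message;
    // Fix 4: surface the failing field instead of swallowing the error.
    console.error(`validation failed for ${key}: ${issue.message}`);
  }
  return fieldErrors;
}
```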

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-06 11:54:45 +01:00
parent c9d0d079b3
commit 600657adc4
2 changed files with 90 additions and 38 deletions


@@ -9,7 +9,8 @@
   "fallback_models": ["claude-haiku-3-5-20241022"],
   "model_params": {
     "temperature": 0.7,
-    "max_tokens": 4096
+    "max_tokens": 4096,
+    "top_p": 0.95
   },
   "mcp_servers": ["gitea", "knowledge-base"],
   "tool_permissions": {
@@ -29,7 +30,8 @@
   "fallback_models": ["claude-haiku-3-5-20241022"],
   "model_params": {
     "temperature": 0.5,
-    "max_tokens": 8192
+    "max_tokens": 8192,
+    "top_p": 0.95
   },
   "mcp_servers": ["gitea", "knowledge-base"],
   "tool_permissions": {
@@ -49,7 +51,8 @@
   "fallback_models": ["claude-haiku-3-5-20241022"],
   "model_params": {
     "temperature": 0.6,
-    "max_tokens": 8192
+    "max_tokens": 8192,
+    "top_p": 0.95
   },
   "mcp_servers": ["gitea", "knowledge-base", "filesystem"],
   "tool_permissions": {
@@ -69,7 +72,8 @@
   "fallback_models": ["claude-haiku-3-5-20241022"],
   "model_params": {
     "temperature": 0.3,
-    "max_tokens": 16384
+    "max_tokens": 16384,
+    "top_p": 0.95
   },
   "mcp_servers": ["gitea", "knowledge-base", "filesystem"],
   "tool_permissions": {
@@ -89,7 +93,8 @@
   "fallback_models": ["claude-haiku-3-5-20241022"],
   "model_params": {
     "temperature": 0.4,
-    "max_tokens": 8192
+    "max_tokens": 8192,
+    "top_p": 0.95
   },
   "mcp_servers": ["gitea", "knowledge-base", "filesystem"],
   "tool_permissions": {
@@ -109,7 +114,8 @@
   "fallback_models": ["claude-haiku-3-5-20241022"],
   "model_params": {
     "temperature": 0.4,
-    "max_tokens": 8192
+    "max_tokens": 8192,
+    "top_p": 0.95
   },
   "mcp_servers": ["gitea", "knowledge-base", "filesystem"],
   "tool_permissions": {