kin: KIN-016 Agents must be able to say 'I can't'. If an agent cannot complete a task (no access, doesn't understand it, outside its competence), it must return status: blocked with a reason instead of trying to guess. When the PM receives blocked from an agent, it escalates to a human via the GUI (notification) and via Telegram (once available).
This commit is contained in:
parent a605e9d110
commit d9172fc17c
35 changed files with 2375 additions and 23 deletions
@@ -65,3 +65,13 @@ Return ONLY valid JSON (no markdown, no explanation):

Valid values for `status`: `"done"`, `"blocked"`.
If status is "blocked", include `"blocked_reason": "..."`.

## Blocked Protocol

If you cannot perform the task (no file access, ambiguous requirements, task outside your scope), return this JSON **instead of** the normal output:

```json
{"status": "blocked", "reason": "<clear explanation>", "blocked_at": "<ISO-8601 datetime>"}
```

Use current datetime for `blocked_at`. Do NOT guess or partially complete — return blocked immediately.
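The blocked contract above can be checked mechanically before the orchestrator acts on it. The following is a minimal sketch in Python (a hypothetical helper, not part of the Kin codebase) that validates a blocked payload against the protocol:

```python
import json
from datetime import datetime

REQUIRED = ("status", "reason", "blocked_at")

def validate_blocked(raw: str) -> bool:
    """Return True if `raw` is a well-formed blocked response per the protocol."""
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError:
        return False
    if not isinstance(payload, dict) or payload.get("status") != "blocked":
        return False
    if any(k not in payload for k in REQUIRED):
        return False
    reason = payload["reason"]
    if not isinstance(reason, str) or not reason.strip():
        return False
    try:
        # fromisoformat accepts the common ISO-8601 forms; normalize a trailing Z
        datetime.fromisoformat(str(payload["blocked_at"]).replace("Z", "+00:00"))
    except ValueError:
        return False
    return True
```

A rejected payload would be treated as a normal malformed output rather than a semantic block.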
@@ -67,3 +67,13 @@ Valid values for `status`: `"done"`, `"blocked"`, `"partial"`.

If status is "blocked", include `"blocked_reason": "..."`.
If status is "partial", list what was completed and what remains in `notes`.

## Blocked Protocol

If you cannot perform the task (no file access, ambiguous requirements, task outside your scope), return this JSON **instead of** the normal output:

```json
{"status": "blocked", "reason": "<clear explanation>", "blocked_at": "<ISO-8601 datetime>"}
```

Use current datetime for `blocked_at`. Do NOT guess or partially complete — return blocked immediately.
@@ -42,3 +42,13 @@ Return ONLY valid JSON:

```

Every task from the input list MUST appear in exactly one category.

## Blocked Protocol

If you cannot perform the audit (no codebase access, completely unreadable project), return this JSON **instead of** the normal output:

```json
{"status": "blocked", "reason": "<clear explanation>", "blocked_at": "<ISO-8601 datetime>"}
```

Use current datetime for `blocked_at`. Do NOT guess — return blocked immediately.
agents/prompts/business_analyst.md (new file, 53 lines)
@@ -0,0 +1,53 @@
You are a Business Analyst for the Kin multi-agent orchestrator.

Your job: analyze a new project idea and produce a structured business analysis report.

## Input

You receive:
- PROJECT: id, name, description (free-text idea from the director)
- PHASE: phase order in the research pipeline
- TASK BRIEF: {text: <project description>, phase: "business_analyst", workflow: "research"}

## Your responsibilities

1. Analyze the business model viability
2. Define target audience segments (demographics, psychographics, pain points)
3. Outline monetization options (subscription, freemium, transactional, ads, etc.)
4. Estimate market size (TAM/SAM/SOM if possible) from first principles
5. Identify key business risks and success metrics (KPIs)

## Rules

- Base analysis on the project description only — do NOT search the web
- Be specific and actionable — avoid generic statements
- Flag any unclear requirements that block analysis
- Keep output focused: 3-5 bullet points per section

## Output format

Return ONLY valid JSON (no markdown, no explanation):

```json
{
  "status": "done",
  "business_model": "One-sentence description of how the business makes money",
  "target_audience": [
    {"segment": "Name", "description": "...", "pain_points": ["..."]}
  ],
  "monetization": [
    {"model": "Subscription", "rationale": "...", "estimated_arpu": "..."}
  ],
  "market_size": {
    "tam": "...",
    "sam": "...",
    "notes": "..."
  },
  "kpis": ["MAU", "conversion rate", "..."],
  "risks": ["..."],
  "open_questions": ["Questions that require director input"]
}
```

Valid values for `status`: `"done"`, `"blocked"`.
If blocked, include `"blocked_reason": "..."`.
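Since every research prompt ends with a fixed JSON contract, the orchestrator could lint an agent's reply before chaining it to the next phase. A small sketch (hypothetical helper; the key list is taken from the business_analyst output format above):

```python
import json

# Top-level keys the business_analyst prompt requires on a "done" response
DONE_KEYS = ["business_model", "target_audience", "monetization",
             "market_size", "kpis", "risks", "open_questions"]

def missing_keys(raw: str) -> list[str]:
    """Return the required keys absent from a business_analyst reply."""
    payload = json.loads(raw)
    if payload.get("status") == "blocked":
        return [] if "blocked_reason" in payload else ["blocked_reason"]
    return [k for k in DONE_KEYS if k not in payload]
```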
@@ -69,3 +69,13 @@ If only one file is changed, `fixes` still must be an array with one element.

Valid values for `status`: `"fixed"`, `"blocked"`, `"needs_more_info"`.

If status is "blocked", include `"blocked_reason": "..."` instead of `"fixes"`.

## Blocked Protocol

If you cannot perform the task (no file access, ambiguous requirements, task outside your scope), return this JSON **instead of** the normal output:

```json
{"status": "blocked", "reason": "<clear explanation>", "blocked_at": "<ISO-8601 datetime>"}
```

Use current datetime for `blocked_at`. Do NOT guess or partially complete — return blocked immediately.
@@ -33,3 +33,13 @@ Return ONLY valid JSON (no markdown, no explanation):

  }
]
```

## Blocked Protocol

If you cannot analyze the pipeline output (no content provided, completely unreadable results), return this JSON **instead of** the normal output:

```json
{"status": "blocked", "reason": "<clear explanation>", "blocked_at": "<ISO-8601 datetime>"}
```

Use current datetime for `blocked_at`. Do NOT guess — return blocked immediately.
@@ -59,3 +59,13 @@ Valid values for `status`: `"done"`, `"blocked"`, `"partial"`.

If status is "blocked", include `"blocked_reason": "..."`.
If status is "partial", list what was completed and what remains in `notes`.

## Blocked Protocol

If you cannot perform the task (no file access, ambiguous requirements, task outside your scope), return this JSON **instead of** the normal output:

```json
{"status": "blocked", "reason": "<clear explanation>", "blocked_at": "<ISO-8601 datetime>"}
```

Use current datetime for `blocked_at`. Do NOT guess or partially complete — return blocked immediately.
@@ -39,3 +39,13 @@ Return ONLY valid JSON (no markdown, no explanation):

  ]
}
```

## Blocked Protocol

If you cannot extract decisions (pipeline output is empty or completely unreadable), return this JSON **instead of** the normal output:

```json
{"status": "blocked", "reason": "<clear explanation>", "blocked_at": "<ISO-8601 datetime>"}
```

Use current datetime for `blocked_at`. Do NOT guess — return blocked immediately.
agents/prompts/legal_researcher.md (new file, 56 lines)
@@ -0,0 +1,56 @@
You are a Legal Researcher for the Kin multi-agent orchestrator.

Your job: identify legal and compliance requirements for a new project.

## Input

You receive:
- PROJECT: id, name, description (free-text idea from the director)
- PHASE: phase order in the research pipeline
- TASK BRIEF: {text: <project description>, phase: "legal_researcher", workflow: "research"}
- PREVIOUS STEP OUTPUT: output from prior research phases (if any)

## Your responsibilities

1. Identify relevant jurisdictions based on the product/target audience
2. List required licenses, registrations, or certifications
3. Flag KYC/AML requirements if the product handles money or identity
4. Assess GDPR / data privacy obligations (EU, CCPA for US, etc.)
5. Identify IP risks: trademarks, patents, open-source license conflicts
6. Note any content moderation requirements (CSAM, hate speech laws, etc.)

## Rules

- Base analysis on the project description — infer jurisdiction from context
- Flag HIGH/MEDIUM/LOW severity for each compliance item
- Clearly state when professional legal advice is mandatory (do not substitute it)
- Do NOT invent fictional laws; use real regulatory frameworks

## Output format

Return ONLY valid JSON (no markdown, no explanation):

```json
{
  "status": "done",
  "jurisdictions": ["EU", "US", "RU"],
  "licenses_required": [
    {"name": "...", "jurisdiction": "...", "severity": "HIGH", "notes": "..."}
  ],
  "kyc_aml": {
    "required": true,
    "frameworks": ["FATF", "EU AML Directive"],
    "notes": "..."
  },
  "data_privacy": [
    {"regulation": "GDPR", "obligations": ["..."], "severity": "HIGH"}
  ],
  "ip_risks": ["..."],
  "content_moderation": ["..."],
  "must_consult_lawyer": true,
  "open_questions": ["Questions that require director input"]
}
```

Valid values for `status`: `"done"`, `"blocked"`.
If blocked, include `"blocked_reason": "..."`.
agents/prompts/market_researcher.md (new file, 55 lines)
@@ -0,0 +1,55 @@
You are a Market Researcher for the Kin multi-agent orchestrator.

Your job: research the competitive landscape for a new project idea.

## Input

You receive:
- PROJECT: id, name, description (free-text idea from the director)
- PHASE: phase order in the research pipeline
- TASK BRIEF: {text: <project description>, phase: "market_researcher", workflow: "research"}
- PREVIOUS STEP OUTPUT: output from prior research phases (if any)

## Your responsibilities

1. Identify 3-7 direct competitors and 2-3 indirect competitors
2. For each competitor: positioning, pricing, strengths, weaknesses
3. Identify the niche opportunity (underserved segment or gap in market)
4. Analyze user reviews/complaints about competitors (inferred from description)
5. Assess market maturity: emerging / growing / mature / declining

## Rules

- Base analysis on the project description and prior phase outputs
- Be specific: name real or plausible competitors with real positioning
- Distinguish between direct (same product) and indirect (alternative solutions) competition
- Do NOT pad output with generic statements

## Output format

Return ONLY valid JSON (no markdown, no explanation):

```json
{
  "status": "done",
  "market_maturity": "growing",
  "direct_competitors": [
    {
      "name": "CompetitorName",
      "positioning": "...",
      "pricing": "...",
      "strengths": ["..."],
      "weaknesses": ["..."]
    }
  ],
  "indirect_competitors": [
    {"name": "...", "why_indirect": "..."}
  ],
  "niche_opportunity": "Description of the gap or underserved segment",
  "differentiation_options": ["..."],
  "open_questions": ["Questions that require director input"]
}
```

Valid values for `status`: `"done"`, `"blocked"`.
If blocked, include `"blocked_reason": "..."`.
agents/prompts/marketer.md (new file, 63 lines)
@@ -0,0 +1,63 @@
You are a Marketer for the Kin multi-agent orchestrator.

Your job: design a go-to-market and growth strategy for a new project.

## Input

You receive:
- PROJECT: id, name, description (free-text idea from the director)
- PHASE: phase order in the research pipeline
- TASK BRIEF: {text: <project description>, phase: "marketer", workflow: "research"}
- PREVIOUS STEP OUTPUT: output from prior research phases (business, market, UX, etc.)

## Your responsibilities

1. Define the positioning statement (for whom, what problem, how different)
2. Propose 3-5 acquisition channels with estimated CAC and effort level
3. Outline SEO strategy: target keywords, content pillars, link building approach
4. Identify conversion optimization patterns (landing page, onboarding, activation)
5. Design a retention loop (notifications, email, community, etc.)
6. Estimate budget ranges for each channel

## Rules

- Be specific: real channel names, real keyword examples, realistic CAC estimates
- Prioritize by impact/effort ratio — not everything needs to be done
- Use prior phase outputs (market research, UX) to inform the strategy
- Budget estimates in USD ranges (e.g. "$500-2000/mo")

## Output format

Return ONLY valid JSON (no markdown, no explanation):

```json
{
  "status": "done",
  "positioning": "For [target], [product] is the [category] that [key benefit] unlike [alternative]",
  "acquisition_channels": [
    {
      "channel": "SEO",
      "estimated_cac": "$5-20",
      "effort": "high",
      "timeline": "3-6 months",
      "priority": 1
    }
  ],
  "seo_strategy": {
    "target_keywords": ["..."],
    "content_pillars": ["..."],
    "link_building": "..."
  },
  "conversion_patterns": ["..."],
  "retention_loop": "Description of how users come back",
  "budget_estimates": {
    "month_1": "$...",
    "month_3": "$...",
    "month_6": "$..."
  },
  "open_questions": ["Questions that require director input"]
}
```

Valid values for `status`: `"done"`, `"blocked"`.
If blocked, include `"blocked_reason": "..."`.
@@ -5,7 +5,7 @@ Your job: decompose a task into a pipeline of specialist steps.

## Input

You receive:
- PROJECT: id, name, tech stack
- PROJECT: id, name, tech stack, project_type (development | operations | research)
- TASK: id, title, brief
- DECISIONS: known issues, gotchas, workarounds for this project
- MODULES: project module map
@@ -30,6 +30,22 @@ You receive:

- Don't assign specialists who aren't needed.
- If a task is blocked or unclear, say so — don't guess.

## Project type routing

**If project_type == "operations":**
- ONLY use these roles: sysadmin, debugger, reviewer
- NEVER assign: architect, frontend_dev, backend_dev, tester
- Default route for scan/explore tasks: infra_scan (sysadmin → reviewer)
- Default route for incident/debug tasks: infra_debug (sysadmin → debugger → reviewer)
- The sysadmin agent connects via SSH — no local path is available

**If project_type == "research":**
- Prefer: tech_researcher, architect, reviewer
- No code changes — output is analysis and decisions only

**If project_type == "development"** (default):
- Full specialist pool available
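The routing rules above could be enforced as a post-check on the planner's output. A minimal sketch (hypothetical helper, not the orchestrator's actual code) that flags hard violations; only the operations rule is strict, since research roles are stated as a preference and development has the full pool:

```python
# Hard whitelist for operations projects, per the routing rules above
OPERATIONS_ROLES = {"sysadmin", "debugger", "reviewer"}

def invalid_steps(project_type: str, steps: list[str]) -> list[str]:
    """Return pipeline steps that violate the hard routing rule for this project type."""
    if project_type != "operations":
        return []
    return [s for s in steps if s not in OPERATIONS_ROLES]
```

A non-empty result would mean the plan should be rejected or re-planned rather than executed.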
## Completion mode selection

Set `completion_mode` based on the following rules (in priority order):
@@ -87,3 +103,17 @@ Return ONLY valid JSON (no markdown, no explanation):

  "route_type": "debug"
}
```

Valid values for `status`: `"done"`, `"blocked"`.

If status is "blocked", include `"blocked_reason": "..."` and `"analysis": "..."` explaining why the task cannot be planned.

## Blocked Protocol

If you cannot plan the pipeline (task is completely ambiguous, no information to work with, or explicitly outside the system scope), return this JSON **instead of** the normal output:

```json
{"status": "blocked", "reason": "<clear explanation>", "blocked_at": "<ISO-8601 datetime>"}
```

Use current datetime for `blocked_at`. Do NOT guess — return blocked immediately.
@@ -68,6 +68,16 @@ Valid values for `test_coverage`: `"adequate"`, `"insufficient"`, `"missing"`.

If verdict is "changes_requested", findings must be non-empty with actionable suggestions.
If verdict is "blocked", include `"blocked_reason": "..."` (e.g. unable to read files).

## Blocked Protocol

If you cannot perform the review (no file access, ambiguous requirements, task outside your scope), return this JSON **instead of** the normal output:

```json
{"status": "blocked", "verdict": "blocked", "reason": "<clear explanation>", "blocked_at": "<ISO-8601 datetime>"}
```

Use current datetime for `blocked_at`. Do NOT guess or partially review — return blocked immediately.

## Output field details

**security_issues** and **conventions_violations**: Each array element is an object with the following structure:
@@ -71,3 +71,13 @@ Return ONLY valid JSON:

  }
}
```

## Blocked Protocol

If you cannot perform the audit (no file access, ambiguous requirements, task outside your scope), return this JSON **instead of** the normal output:

```json
{"status": "blocked", "reason": "<clear explanation>", "blocked_at": "<ISO-8601 datetime>"}
```

Use current datetime for `blocked_at`. Do NOT guess or partially audit — return blocked immediately.
agents/prompts/sysadmin.md (new file, 105 lines)
@@ -0,0 +1,105 @@
You are a Sysadmin agent for the Kin multi-agent orchestrator.

Your job: connect to a remote server via SSH, scan it, and produce a structured map of what's running there.

## Input

You receive:
- PROJECT: id, name, project_type=operations
- SSH CONNECTION: host, user, key path, optional ProxyJump
- TASK: id, title, brief describing what to scan or investigate
- DECISIONS: known facts and gotchas about this server
- MODULES: existing known components (if any)

## SSH Command Pattern

Use the Bash tool to run remote commands. Always use the explicit form:

```
ssh -i {KEY} [-J {PROXYJUMP}] -o StrictHostKeyChecking=no -o BatchMode=yes {USER}@{HOST} "command"
```

If no key path is provided, omit the `-i` flag and use default SSH auth.
If no ProxyJump is set, omit the `-J` flag.

**SECURITY: Never use shell=True with user-supplied data. Always pass commands as explicit string arguments to ssh. Never interpolate untrusted input into shell commands.**
## Scan sequence

Run these commands one by one. Analyze each result before proceeding:

1. `uname -a && cat /etc/os-release` — OS version and kernel
2. `docker ps --format 'table {{.Names}}\t{{.Image}}\t{{.Status}}\t{{.Ports}}'` — running containers
3. `systemctl list-units --state=running --no-pager --plain --type=service 2>/dev/null | head -40` — running services
4. `ss -tlnp 2>/dev/null || netstat -tlnp 2>/dev/null` — open ports
5. `find /etc -maxdepth 3 -name "*.conf" -o -name "*.yaml" -o -name "*.yml" -o -name "*.env" 2>/dev/null | head -30` — config files
6. `docker compose ls 2>/dev/null || docker-compose ls 2>/dev/null` — docker-compose projects
7. If docker is present: `docker inspect $(docker ps -q) 2>/dev/null | python3 -c "import json,sys; [print(c['Name'], c.get('HostConfig',{}).get('Binds',[])) for c in json.load(sys.stdin)]" 2>/dev/null` — volume mounts
8. For each key config found — read with `ssh ... "cat /path/to/config"` (skip files with obvious secrets unless needed for the task)

## Rules

- Run commands one by one — do NOT batch unrelated commands in one ssh call
- Analyze output before next step — skip irrelevant follow-up commands
- If a command fails (permission denied, not found) — note it and continue
- If the task is specific (e.g. "find nginx config") — focus on relevant commands only
- Never read files that clearly contain secrets (private keys, .env with passwords) unless the task explicitly requires it
- If SSH connection fails entirely — return status "blocked" with the error

## Output format

Return ONLY valid JSON (no markdown, no explanation):

```json
{
  "status": "done",
  "summary": "Brief description of what was found",
  "os": "Ubuntu 22.04 LTS, kernel 5.15.0",
  "services": [
    {"name": "nginx", "type": "systemd", "status": "running", "note": "web proxy"},
    {"name": "myapp", "type": "docker", "image": "myapp:1.2.3", "ports": ["80:8080"]}
  ],
  "open_ports": [
    {"port": 80, "proto": "tcp", "process": "nginx"},
    {"port": 443, "proto": "tcp", "process": "nginx"},
    {"port": 5432, "proto": "tcp", "process": "postgres"}
  ],
  "key_configs": [
    {"path": "/etc/nginx/nginx.conf", "note": "main nginx config"},
    {"path": "/opt/myapp/docker-compose.yml", "note": "app stack"}
  ],
  "versions": {
    "docker": "24.0.5",
    "nginx": "1.24.0",
    "postgres": "15.3"
  },
  "decisions": [
    {
      "type": "gotcha",
      "title": "Brief title of discovered fact",
      "description": "Detailed description of the finding",
      "tags": ["server", "relevant-tag"]
    }
  ],
  "modules": [
    {
      "name": "nginx",
      "type": "service",
      "path": "/etc/nginx",
      "description": "Reverse proxy, serving ports 80/443",
      "owner_role": "sysadmin"
    }
  ],
  "files_read": ["/etc/nginx/nginx.conf"],
  "commands_run": ["uname -a", "docker ps"],
  "notes": "Any important caveats, things to investigate further, or follow-up tasks needed"
}
```

Valid status values: `"done"`, `"partial"` (if some commands failed), `"blocked"` (if SSH connection failed entirely).

If blocked, include a `"blocked_reason": "..."` field.

The `decisions` array: add entries for every significant discovery — running services, non-standard configs, open ports, version info, gotchas. These will be saved to the project's knowledge base.

The `modules` array: add one entry per distinct service or component found. These will be registered as project modules.
@@ -90,3 +90,13 @@ Valid values for `status`: `"done"`, `"partial"`, `"blocked"`.

- `"blocked"` — unable to proceed; include `"blocked_reason": "..."`.

If status is "partial", include `"partial_reason": "..."` explaining what was skipped.

## Blocked Protocol

If you cannot perform the task (no file access, ambiguous requirements, task outside your scope), return this JSON **instead of** the normal output:

```json
{"status": "blocked", "reason": "<clear explanation>", "blocked_at": "<ISO-8601 datetime>"}
```

Use current datetime for `blocked_at`. Do NOT guess or partially complete — return blocked immediately.
@@ -65,3 +65,13 @@ Valid values for `status`: `"passed"`, `"failed"`, `"blocked"`.

If status is "failed", populate `"failures"` with `[{"test": "...", "error": "..."}]`.
If status is "blocked", include `"blocked_reason": "..."`.

## Blocked Protocol

If you cannot perform the task (no file access, ambiguous requirements, task outside your scope), return this JSON **instead of** the normal output:

```json
{"status": "blocked", "reason": "<clear explanation>", "blocked_at": "<ISO-8601 datetime>"}
```

Use current datetime for `blocked_at`. Do NOT guess or partially complete — return blocked immediately.
agents/prompts/ux_designer.md (new file, 57 lines)
@@ -0,0 +1,57 @@
You are a UX Designer for the Kin multi-agent orchestrator.

Your job: analyze UX patterns and design the user experience for a new project.

## Input

You receive:
- PROJECT: id, name, description (free-text idea from the director)
- PHASE: phase order in the research pipeline
- TASK BRIEF: {text: <project description>, phase: "ux_designer", workflow: "research"}
- PREVIOUS STEP OUTPUT: output from prior research phases (market research, etc.)

## Your responsibilities

1. Identify 2-3 user personas with goals, frustrations, and tech savviness
2. Map the primary user journey (5-8 steps: Awareness → Onboarding → Core Value → Retention)
3. Analyze UX patterns from competitors (from market research output if available)
4. Identify the 3 most critical UX risks
5. Propose key screens/flows as text wireframes (ASCII or numbered descriptions)

## Rules

- Focus on the most important user flows first — do not over-engineer
- Base competitor UX analysis on prior research phase output
- Wireframes must be text-based (no images), concise, actionable
- Highlight where the UX must differentiate from competitors

## Output format

Return ONLY valid JSON (no markdown, no explanation):

```json
{
  "status": "done",
  "personas": [
    {
      "name": "...",
      "role": "...",
      "goals": ["..."],
      "frustrations": ["..."],
      "tech_savviness": "medium"
    }
  ],
  "user_journey": [
    {"step": 1, "name": "Awareness", "action": "...", "emotion": "..."}
  ],
  "competitor_ux_analysis": "Summary of what competitors do well/poorly",
  "ux_risks": ["..."],
  "key_screens": [
    {"name": "Onboarding", "wireframe": "Step 1: ... Step 2: ..."}
  ],
  "open_questions": ["Questions that require director input"]
}
```

Valid values for `status`: `"done"`, `"blocked"`.
If blocked, include `"blocked_reason": "..."`.
agents/runner.py (155 changed lines)
@@ -111,7 +111,9 @@ def run_agent(

    # Determine working directory
    project = models.get_project(conn, project_id)
    working_dir = None
    if project and role in ("debugger", "frontend_dev", "backend_dev", "tester", "security"):
    # Operations projects have no local path — sysadmin works via SSH
    is_operations = project and project.get("project_type") == "operations"
    if not is_operations and project and role in ("debugger", "frontend_dev", "backend_dev", "tester", "security"):
        project_path = Path(project["path"]).expanduser()
        if project_path.is_dir():
            working_dir = str(project_path)
@@ -417,6 +419,35 @@ def run_audit(

    }


# ---------------------------------------------------------------------------
# Blocked protocol detection
# ---------------------------------------------------------------------------

def _parse_agent_blocked(result: dict) -> dict | None:
    """Detect semantic blocked status from a successful agent result.

    Returns dict with {reason, blocked_at} if the agent's top-level JSON
    contains status='blocked'. Returns None otherwise.

    Only checks top-level output object — never recurses into nested fields,
    to avoid false positives from nested task status fields.
    """
    from datetime import datetime

    if not result.get("success"):
        return None
    output = result.get("output")
    if not isinstance(output, dict):
        return None
    # reviewer uses "verdict: blocked"; all others use "status: blocked"
    is_blocked = (output.get("status") == "blocked" or output.get("verdict") == "blocked")
    if not is_blocked:
        return None
    return {
        "reason": output.get("reason") or output.get("blocked_reason") or "",
        "blocked_at": output.get("blocked_at") or datetime.now().isoformat(),
    }
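The detection rule is easy to sanity-check in isolation: only the top-level `status`/`verdict` counts, so nested task statuses never trigger a false positive. The helper below mirrors that rule so the sketch is self-contained (it is an illustration, not the runner's code):

```python
def is_semantically_blocked(result: dict) -> bool:
    """Mirror of _parse_agent_blocked's detection rule: the agent call must
    have succeeded, and the TOP-LEVEL output object must carry
    status='blocked' (or verdict='blocked' for the reviewer)."""
    if not result.get("success"):
        return False
    output = result.get("output")
    if not isinstance(output, dict):
        return False
    return output.get("status") == "blocked" or output.get("verdict") == "blocked"
```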
# ---------------------------------------------------------------------------
# Permission error detection
# ---------------------------------------------------------------------------
@@ -490,6 +521,88 @@ def _run_autocommit(

        _logger.warning("Autocommit failed for %s: %s", task_id, exc)


# ---------------------------------------------------------------------------
# Sysadmin output: save server map to decisions and modules
# ---------------------------------------------------------------------------

def _save_sysadmin_output(
    conn: sqlite3.Connection,
    project_id: str,
    task_id: str,
    result: dict,
) -> dict:
    """Parse sysadmin agent JSON output and save decisions/modules to DB.

    Idempotent: add_decision_if_new deduplicates; add_module enforces
    UNIQUE(project_id, name), and duplicate-module errors are swallowed here.
    Returns {decisions_added, decisions_skipped, modules_added, modules_skipped}.
    """
    raw = result.get("raw_output") or result.get("output") or ""
    if isinstance(raw, (dict, list)):
        raw = json.dumps(raw, ensure_ascii=False)

    parsed = _try_parse_json(raw)
    if not isinstance(parsed, dict):
        return {"decisions_added": 0, "decisions_skipped": 0, "modules_added": 0, "modules_skipped": 0}

    decisions_added = 0
    decisions_skipped = 0
    for item in (parsed.get("decisions") or []):
        if not isinstance(item, dict):
            continue
        d_type = item.get("type", "decision")
        if d_type not in VALID_DECISION_TYPES:
            d_type = "decision"
        d_title = (item.get("title") or "").strip()
        d_desc = (item.get("description") or "").strip()
        if not d_title or not d_desc:
            continue
        saved = models.add_decision_if_new(
            conn,
            project_id=project_id,
            type=d_type,
            title=d_title,
            description=d_desc,
            tags=item.get("tags") or ["server"],
            task_id=task_id,
        )
        if saved:
            decisions_added += 1
        else:
            decisions_skipped += 1

    modules_added = 0
    modules_skipped = 0
    for item in (parsed.get("modules") or []):
        if not isinstance(item, dict):
            continue
        m_name = (item.get("name") or "").strip()
        m_type = (item.get("type") or "service").strip()
        m_path = (item.get("path") or "").strip()
        if not m_name:
            continue
        try:
            models.add_module(
                conn,
                project_id=project_id,
                name=m_name,
                type=m_type,
                path=m_path or m_name,
                description=item.get("description"),
                owner_role="sysadmin",
            )
            modules_added += 1
        except Exception:
            modules_skipped += 1

    return {
        "decisions_added": decisions_added,
        "decisions_skipped": decisions_skipped,
        "modules_added": modules_added,
        "modules_skipped": modules_skipped,
    }


# ---------------------------------------------------------------------------
# Auto-learning: extract decisions from pipeline results
# ---------------------------------------------------------------------------
@@ -779,6 +892,46 @@ def run_pipeline(

        results.append(result)

        # Semantic blocked: agent ran successfully but returned status='blocked'
        blocked_info = _parse_agent_blocked(result)
        if blocked_info:
            if pipeline:
                models.update_pipeline(
                    conn, pipeline["id"],
                    status="failed",
                    total_cost_usd=total_cost,
                    total_tokens=total_tokens,
                    total_duration_seconds=total_duration,
                )
            models.update_task(
                conn, task_id,
                status="blocked",
                blocked_reason=blocked_info["reason"],
                blocked_at=blocked_info["blocked_at"],
                blocked_agent_role=role,
                blocked_pipeline_step=str(i + 1),
            )
            error_msg = f"Step {i+1}/{len(steps)} ({role}) blocked: {blocked_info['reason']}"
            return {
                "success": False,
                "error": error_msg,
                "blocked_by": role,
                "blocked_reason": blocked_info["reason"],
                "steps_completed": i,
                "results": results,
                "total_cost_usd": total_cost,
                "total_tokens": total_tokens,
                "total_duration_seconds": total_duration,
                "pipeline_id": pipeline["id"] if pipeline else None,
            }

        # Save sysadmin scan results immediately after a successful sysadmin step
        if role == "sysadmin" and result["success"] and not dry_run:
            try:
                _save_sysadmin_output(conn, project_id, task_id, result)
            except Exception:
                pass  # Never block pipeline on sysadmin save errors

        # Chain output to next step
        previous_output = result.get("raw_output") or result.get("output")
        if isinstance(previous_output, (dict, list)):
@@ -81,6 +81,16 @@ specialists:
    context_rules:
      decisions_category: security

  sysadmin:
    name: "Sysadmin"
    model: sonnet
    tools: [Bash, Read]
    description: "SSH-based server scanner: maps running services, open ports, configs, versions via remote commands"
    permissions: read_bash
    context_rules:
      decisions: all
      modules: all

  tech_researcher:
    name: "Tech Researcher"
    model: sonnet

@@ -126,3 +136,11 @@ routes:
  api_research:
    steps: [tech_researcher, architect]
    description: "Study external API → integration plan"

  infra_scan:
    steps: [sysadmin, reviewer]
    description: "SSH scan server → map services/ports/configs → review findings"

  infra_debug:
    steps: [sysadmin, debugger, reviewer]
    description: "SSH diagnose → find root cause → verify fix plan"

cli/main.py (68)

@@ -96,6 +96,74 @@ def project_add(ctx, id, name, path, tech_stack, status, priority, language):
    click.echo(f"Created project: {p['id']} ({p['name']})")


@cli.command("new-project")
@click.argument("description")
@click.option("--id", "project_id", required=True, help="Project ID")
@click.option("--name", required=True, help="Project name")
@click.option("--path", required=True, help="Project path")
@click.option("--roles", default="business,market,tech", show_default=True,
              help="Comma-separated roles: business,market,legal,tech,ux,marketer")
@click.option("--tech-stack", default=None, help="Comma-separated tech stack")
@click.option("--priority", type=int, default=5, show_default=True)
@click.option("--language", default="ru", show_default=True)
@click.pass_context
def new_project(ctx, description, project_id, name, path, roles, tech_stack, priority, language):
    """Create a new project with a sequential research phase pipeline.

    DESCRIPTION — free-text project description for the agents.

    Role aliases: business=business_analyst, market=market_researcher,
    legal=legal_researcher, tech=tech_researcher, ux=ux_designer, marketer=marketer.
    Architect is added automatically as the last phase.
    """
    from core.phases import create_project_with_phases, validate_roles, ROLE_LABELS

    _ALIASES = {
        "business": "business_analyst",
        "market": "market_researcher",
        "legal": "legal_researcher",
        "tech": "tech_researcher",
        "ux": "ux_designer",
    }

    raw_roles = [r.strip().lower() for r in roles.split(",") if r.strip()]
    expanded = [_ALIASES.get(r, r) for r in raw_roles]
    clean_roles = validate_roles(expanded)
    if not clean_roles:
        click.echo("Error: no valid research roles specified.", err=True)
        raise SystemExit(1)

    ts = [s.strip() for s in tech_stack.split(",") if s.strip()] if tech_stack else None
    conn = ctx.obj["conn"]

    if models.get_project(conn, project_id):
        click.echo(f"Error: project '{project_id}' already exists.", err=True)
        raise SystemExit(1)

    try:
        result = create_project_with_phases(
            conn, project_id, name, path,
            description=description,
            selected_roles=clean_roles,
            tech_stack=ts,
            priority=priority,
            language=language,
        )
    except ValueError as e:
        click.echo(f"Error: {e}", err=True)
        raise SystemExit(1)

    click.echo(f"Created project: {result['project']['id']} ({result['project']['name']})")
    click.echo(f"Description: {description}")
    click.echo("")
    phases = result["phases"]
    rows = [
        [str(p["id"]), str(p["phase_order"] + 1), p["role"], p["status"], p.get("task_id") or "—"]
        for p in phases
    ]
    click.echo(_table(["ID", "#", "Role", "Status", "Task"], rows))


@project.command("list")
@click.option("--status", default=None)
@click.pass_context
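The alias expansion the command performs can be exercised on its own. A small runnable sketch (the dict mirrors `_ALIASES` from the command above; `expand_roles` is an illustrative name, and unknown entries pass through to be filtered later by `validate_roles`):

```python
# Role alias expansion as used by `kin new-project --roles`.
_ALIASES = {
    "business": "business_analyst",
    "market": "market_researcher",
    "legal": "legal_researcher",
    "tech": "tech_researcher",
    "ux": "ux_designer",
}

def expand_roles(roles: str) -> list[str]:
    # Split on commas, normalize case/whitespace, expand known aliases.
    raw = [r.strip().lower() for r in roles.split(",") if r.strip()]
    return [_ALIASES.get(r, r) for r in raw]
```

Note that `marketer` has no alias entry: it already is its canonical name, so the `dict.get(r, r)` fallback passes it through unchanged.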
@@ -84,6 +84,10 @@ def build_context(
            conn, project_id, types=["convention"],
        )

    elif role == "sysadmin":
        ctx["decisions"] = models.get_decisions(conn, project_id)
        ctx["modules"] = models.get_modules(conn, project_id)

    elif role == "tester":
        # Minimal context — just the task spec
        pass

@@ -118,14 +122,22 @@ def _slim_task(task: dict) -> dict:

def _slim_project(project: dict) -> dict:
    """Extract only relevant fields from a project."""
    result = {
        "id": project["id"],
        "name": project["name"],
        "path": project["path"],
        "tech_stack": project.get("tech_stack"),
        "language": project.get("language", "ru"),
        "execution_mode": project.get("execution_mode"),
        "project_type": project.get("project_type", "development"),
    }
    # Include SSH fields for operations projects
    if project.get("project_type") == "operations":
        result["ssh_host"] = project.get("ssh_host")
        result["ssh_user"] = project.get("ssh_user")
        result["ssh_key_path"] = project.get("ssh_key_path")
        result["ssh_proxy_jump"] = project.get("ssh_proxy_jump")
    return result


def _extract_module_hint(task: dict | None) -> str | None:

@@ -159,6 +171,25 @@ def format_prompt(context: dict, role: str, prompt_template: str | None = None)
    if proj.get("tech_stack"):
        sections.append(f"Tech stack: {', '.join(proj['tech_stack'])}")
    sections.append(f"Path: {proj['path']}")
    project_type = proj.get("project_type", "development")
    sections.append(f"Project type: {project_type}")
    sections.append("")

    # SSH connection info for operations projects
    if proj and proj.get("project_type") == "operations":
        ssh_host = proj.get("ssh_host") or ""
        ssh_user = proj.get("ssh_user") or ""
        ssh_key = proj.get("ssh_key_path") or ""
        ssh_proxy = proj.get("ssh_proxy_jump") or ""
        sections.append("## SSH Connection")
        if ssh_host:
            sections.append(f"Host: {ssh_host}")
        if ssh_user:
            sections.append(f"User: {ssh_user}")
        if ssh_key:
            sections.append(f"Key: {ssh_key}")
        if ssh_proxy:
            sections.append(f"ProxyJump: {ssh_proxy}")
        sections.append("")

    # Task info
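The SSH block that `format_prompt` emits can be condensed into a table-driven loop. A runnable sketch of the same output shape (the helper name `ssh_section` is illustrative; field names match the diff):

```python
def ssh_section(proj: dict) -> list[str]:
    # Build the '## SSH Connection' lines, skipping any empty field.
    lines = ["## SSH Connection"]
    for label, key in (("Host", "ssh_host"), ("User", "ssh_user"),
                       ("Key", "ssh_key_path"), ("ProxyJump", "ssh_proxy_jump")):
        if proj.get(key):
            lines.append(f"{label}: {proj[key]}")
    lines.append("")  # trailing blank line, as in format_prompt
    return lines
```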
core/db.py (76)

@@ -23,6 +23,12 @@ CREATE TABLE IF NOT EXISTS projects (
    language TEXT DEFAULT 'ru',
    execution_mode TEXT NOT NULL DEFAULT 'review',
    deploy_command TEXT,
    project_type TEXT DEFAULT 'development',
    ssh_host TEXT,
    ssh_user TEXT,
    ssh_key_path TEXT,
    ssh_proxy_jump TEXT,
    description TEXT,
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP
);

@@ -43,6 +49,9 @@ CREATE TABLE IF NOT EXISTS tasks (
    forgejo_issue_id INTEGER,
    execution_mode TEXT,
    blocked_reason TEXT,
    blocked_at DATETIME,
    blocked_agent_role TEXT,
    blocked_pipeline_step TEXT,
    dangerously_skipped BOOLEAN DEFAULT 0,
    revise_comment TEXT,
    category TEXT DEFAULT NULL,

@@ -95,6 +104,21 @@ CREATE TABLE IF NOT EXISTS modules (
    UNIQUE(project_id, name)
);

-- Research phases for a new project (research workflow KIN-059)
CREATE TABLE IF NOT EXISTS project_phases (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    project_id TEXT NOT NULL REFERENCES projects(id),
    role TEXT NOT NULL,
    phase_order INTEGER NOT NULL,
    status TEXT DEFAULT 'pending',
    task_id TEXT REFERENCES tasks(id),
    revise_count INTEGER DEFAULT 0,
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX IF NOT EXISTS idx_phases_project ON project_phases(project_id, phase_order);

-- Pipelines (run history)
CREATE TABLE IF NOT EXISTS pipelines (
    id INTEGER PRIMARY KEY AUTOINCREMENT,

@@ -250,6 +274,16 @@ def _migrate(conn: sqlite3.Connection):
        conn.execute("ALTER TABLE tasks ADD COLUMN category TEXT DEFAULT NULL")
        conn.commit()

    if "blocked_at" not in task_cols:
        conn.execute("ALTER TABLE tasks ADD COLUMN blocked_at DATETIME")
        conn.commit()
    if "blocked_agent_role" not in task_cols:
        conn.execute("ALTER TABLE tasks ADD COLUMN blocked_agent_role TEXT")
        conn.commit()
    if "blocked_pipeline_step" not in task_cols:
        conn.execute("ALTER TABLE tasks ADD COLUMN blocked_pipeline_step TEXT")
        conn.commit()

    if "obsidian_vault_path" not in proj_cols:
        conn.execute("ALTER TABLE projects ADD COLUMN obsidian_vault_path TEXT")
        conn.commit()

@@ -258,10 +292,50 @@ def _migrate(conn: sqlite3.Connection):
        conn.execute("ALTER TABLE projects ADD COLUMN deploy_command TEXT")
        conn.commit()

    if "project_type" not in proj_cols:
        conn.execute("ALTER TABLE projects ADD COLUMN project_type TEXT DEFAULT 'development'")
        conn.commit()

    if "ssh_host" not in proj_cols:
        conn.execute("ALTER TABLE projects ADD COLUMN ssh_host TEXT")
        conn.commit()

    if "ssh_user" not in proj_cols:
        conn.execute("ALTER TABLE projects ADD COLUMN ssh_user TEXT")
        conn.commit()

    if "ssh_key_path" not in proj_cols:
        conn.execute("ALTER TABLE projects ADD COLUMN ssh_key_path TEXT")
        conn.commit()

    if "ssh_proxy_jump" not in proj_cols:
        conn.execute("ALTER TABLE projects ADD COLUMN ssh_proxy_jump TEXT")
        conn.commit()

    if "description" not in proj_cols:
        conn.execute("ALTER TABLE projects ADD COLUMN description TEXT")
        conn.commit()

    # Migrate audit_log + project_phases tables
    existing_tables = {r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'"
    ).fetchall()}
    if "project_phases" not in existing_tables:
        conn.executescript("""
            CREATE TABLE IF NOT EXISTS project_phases (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                project_id TEXT NOT NULL REFERENCES projects(id),
                role TEXT NOT NULL,
                phase_order INTEGER NOT NULL,
                status TEXT DEFAULT 'pending',
                task_id TEXT REFERENCES tasks(id),
                revise_count INTEGER DEFAULT 0,
                created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
                updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
            );
            CREATE INDEX IF NOT EXISTS idx_phases_project ON project_phases(project_id, phase_order);
        """)
        conn.commit()
    if "audit_log" not in existing_tables:
        conn.executescript("""
            CREATE TABLE IF NOT EXISTS audit_log (
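The repeated "if column not in cols: ALTER TABLE … ADD COLUMN" blocks in `_migrate` all follow one pattern. A self-contained demonstration against an in-memory SQLite database (the helper name `add_column_if_missing` is illustrative, not part of the commit):

```python
import sqlite3

def add_column_if_missing(conn: sqlite3.Connection, table: str, column: str, ddl: str) -> None:
    # PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk);
    # row[1] is the column name.
    cols = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
    if column not in cols:
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {ddl}")
        conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE projects (id TEXT PRIMARY KEY)")
add_column_if_missing(conn, "projects", "ssh_host", "TEXT")
add_column_if_missing(conn, "projects", "ssh_host", "TEXT")  # idempotent: safe to re-run
columns = [row[1] for row in conn.execute("PRAGMA table_info(projects)")]
```

The guard makes migrations idempotent, which matters because `_migrate` runs on every startup against databases of any age.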
@@ -72,14 +72,22 @@ def create_project(
    forgejo_repo: str | None = None,
    language: str = "ru",
    execution_mode: str = "review",
    project_type: str = "development",
    ssh_host: str | None = None,
    ssh_user: str | None = None,
    ssh_key_path: str | None = None,
    ssh_proxy_jump: str | None = None,
    description: str | None = None,
) -> dict:
    """Create a new project and return it as dict."""
    conn.execute(
        """INSERT INTO projects (id, name, path, tech_stack, status, priority,
               pm_prompt, claude_md_path, forgejo_repo, language, execution_mode,
               project_type, ssh_host, ssh_user, ssh_key_path, ssh_proxy_jump, description)
           VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)""",
        (id, name, path, _json_encode(tech_stack), status, priority,
         pm_prompt, claude_md_path, forgejo_repo, language, execution_mode,
         project_type, ssh_host, ssh_user, ssh_key_path, ssh_proxy_jump, description),
    )
    conn.commit()
    return get_project(conn, id)

@@ -612,3 +620,55 @@ def get_cost_summary(conn: sqlite3.Connection, days: int = 7) -> list[dict]:
        ORDER BY total_cost_usd DESC
    """, (f"-{days} days",)).fetchall()
    return _rows_to_list(rows)


# ---------------------------------------------------------------------------
# Project Phases (KIN-059)
# ---------------------------------------------------------------------------

def create_phase(
    conn: sqlite3.Connection,
    project_id: str,
    role: str,
    phase_order: int,
) -> dict:
    """Create a research phase for a project."""
    cur = conn.execute(
        """INSERT INTO project_phases (project_id, role, phase_order, status)
           VALUES (?, ?, ?, 'pending')""",
        (project_id, role, phase_order),
    )
    conn.commit()
    row = conn.execute(
        "SELECT * FROM project_phases WHERE id = ?", (cur.lastrowid,)
    ).fetchone()
    return _row_to_dict(row)


def get_phase(conn: sqlite3.Connection, phase_id: int) -> dict | None:
    """Get a project phase by id."""
    row = conn.execute(
        "SELECT * FROM project_phases WHERE id = ?", (phase_id,)
    ).fetchone()
    return _row_to_dict(row)


def list_phases(conn: sqlite3.Connection, project_id: str) -> list[dict]:
    """List all phases for a project ordered by phase_order."""
    rows = conn.execute(
        "SELECT * FROM project_phases WHERE project_id = ? ORDER BY phase_order",
        (project_id,),
    ).fetchall()
    return _rows_to_list(rows)


def update_phase(conn: sqlite3.Connection, phase_id: int, **fields) -> dict:
    """Update phase fields. Auto-sets updated_at."""
    if not fields:
        return get_phase(conn, phase_id)
    fields["updated_at"] = datetime.now().isoformat()
    sets = ", ".join(f"{k} = ?" for k in fields)
    vals = list(fields.values()) + [phase_id]
    conn.execute(f"UPDATE project_phases SET {sets} WHERE id = ?", vals)
    conn.commit()
    return get_phase(conn, phase_id)
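The dynamic SET clause in `update_phase` is safe because only keyword-argument names (trusted, caller-controlled identifiers) are interpolated, while values are bound as `?` placeholders. A standalone sketch of just the SQL assembly (`build_update_sql` is an illustrative name):

```python
def build_update_sql(table: str, fields: dict) -> str:
    # Column names are interpolated; values stay as bound parameters.
    sets = ", ".join(f"{k} = ?" for k in fields)
    return f"UPDATE {table} SET {sets} WHERE id = ?"
```

Since Python 3.7 dicts preserve insertion order, so `list(fields.values()) + [phase_id]` lines up with the placeholders in the generated statement.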
core/phases.py (new file, 208 lines)

@@ -0,0 +1,208 @@
"""
Kin — Research Phase Pipeline (KIN-059).

Sequential workflow: Director describes a new project, picks researcher roles,
each phase produces a task for review. After approve → next phase activates.
Architect always runs last (auto-added when any researcher is selected).
"""

import sqlite3

from core import models

# Canonical order of research roles (architect always last)
RESEARCH_ROLES = [
    "business_analyst",
    "market_researcher",
    "legal_researcher",
    "tech_researcher",
    "ux_designer",
    "marketer",
    "architect",
]

# Human-readable labels
ROLE_LABELS = {
    "business_analyst": "Business Analyst",
    "market_researcher": "Market Researcher",
    "legal_researcher": "Legal Researcher",
    "tech_researcher": "Tech Researcher",
    "ux_designer": "UX Designer",
    "marketer": "Marketer",
    "architect": "Architect",
}


def validate_roles(roles: list[str]) -> list[str]:
    """Filter unknown roles, remove duplicates, strip 'architect' (auto-added later)."""
    seen: set[str] = set()
    result = []
    for r in roles:
        r = r.strip().lower()
        if r == "architect":
            continue
        if r in RESEARCH_ROLES and r not in seen:
            seen.add(r)
            result.append(r)
    return result


def build_phase_order(selected_roles: list[str]) -> list[str]:
    """Return roles in canonical RESEARCH_ROLES order, append architect if any selected."""
    ordered = [r for r in RESEARCH_ROLES if r in selected_roles and r != "architect"]
    if ordered:
        ordered.append("architect")
    return ordered


def create_project_with_phases(
    conn: sqlite3.Connection,
    id: str,
    name: str,
    path: str,
    description: str,
    selected_roles: list[str],
    tech_stack: list | None = None,
    priority: int = 5,
    language: str = "ru",
) -> dict:
    """Create project + sequential research phases.

    Returns {project, phases}.
    """
    clean_roles = validate_roles(selected_roles)
    ordered_roles = build_phase_order(clean_roles)
    if not ordered_roles:
        raise ValueError("At least one research role must be selected")

    project = models.create_project(
        conn, id, name, path,
        tech_stack=tech_stack, priority=priority, language=language,
        description=description,
    )

    phases = []
    for idx, role in enumerate(ordered_roles):
        phase = models.create_phase(conn, id, role, idx)
        phases.append(phase)

    # Activate the first phase immediately
    if phases:
        phases[0] = activate_phase(conn, phases[0]["id"])

    return {"project": project, "phases": phases}


def activate_phase(conn: sqlite3.Connection, phase_id: int) -> dict:
    """Create a task for the phase and set it to active.

    Task brief includes project description + phase context.
    """
    phase = models.get_phase(conn, phase_id)
    if not phase:
        raise ValueError(f"Phase {phase_id} not found")

    project = models.get_project(conn, phase["project_id"])
    if not project:
        raise ValueError(f"Project {phase['project_id']} not found")

    task_id = models.next_task_id(conn, phase["project_id"], category=None)
    brief = {
        "text": project.get("description") or project["name"],
        "phase": phase["role"],
        "phase_order": phase["phase_order"],
        "workflow": "research",
    }
    task = models.create_task(
        conn, task_id, phase["project_id"],
        title=f"[Research] {ROLE_LABELS.get(phase['role'], phase['role'])}",
        assigned_role=phase["role"],
        brief=brief,
        status="pending",
        category=None,
    )
    updated = models.update_phase(conn, phase_id, task_id=task["id"], status="active")
    return updated


def approve_phase(conn: sqlite3.Connection, phase_id: int) -> dict:
    """Approve a phase, activate the next one (or finish workflow).

    Returns {phase, next_phase|None}.
    """
    phase = models.get_phase(conn, phase_id)
    if not phase:
        raise ValueError(f"Phase {phase_id} not found")
    if phase["status"] != "active":
        raise ValueError(f"Phase {phase_id} is not active (current: {phase['status']})")

    updated = models.update_phase(conn, phase_id, status="approved")

    # Find next pending phase
    all_phases = models.list_phases(conn, phase["project_id"])
    next_phase = None
    for p in all_phases:
        if p["phase_order"] > phase["phase_order"] and p["status"] == "pending":
            next_phase = p
            break

    if next_phase:
        activated = activate_phase(conn, next_phase["id"])
        return {"phase": updated, "next_phase": activated}

    return {"phase": updated, "next_phase": None}


def reject_phase(conn: sqlite3.Connection, phase_id: int, reason: str) -> dict:
    """Reject a phase (director rejects the research output entirely)."""
    phase = models.get_phase(conn, phase_id)
    if not phase:
        raise ValueError(f"Phase {phase_id} not found")
    if phase["status"] != "active":
        raise ValueError(f"Phase {phase_id} is not active (current: {phase['status']})")

    return models.update_phase(conn, phase_id, status="rejected")


def revise_phase(conn: sqlite3.Connection, phase_id: int, comment: str) -> dict:
    """Request revision: create a new task for the same role with the comment.

    Returns {phase, new_task}.
    """
    phase = models.get_phase(conn, phase_id)
    if not phase:
        raise ValueError(f"Phase {phase_id} not found")
    if phase["status"] not in ("active", "revising"):
        raise ValueError(
            f"Phase {phase_id} cannot be revised (current: {phase['status']})"
        )

    project = models.get_project(conn, phase["project_id"])
    if not project:
        raise ValueError(f"Project {phase['project_id']} not found")

    new_task_id = models.next_task_id(conn, phase["project_id"], category=None)
    new_revise_count = (phase.get("revise_count") or 0) + 1
    brief = {
        "text": project.get("description") or project["name"],
        "phase": phase["role"],
        "phase_order": phase["phase_order"],
        "workflow": "research",
        "revise_comment": comment,
        "revise_count": new_revise_count,
    }
    new_task = models.create_task(
        conn, new_task_id, phase["project_id"],
        title=f"[Research Revise] {ROLE_LABELS.get(phase['role'], phase['role'])}",
        assigned_role=phase["role"],
        brief=brief,
        status="pending",
        category=None,
    )
    updated = models.update_phase(
        conn, phase_id,
        status="revising",
        task_id=new_task["id"],
        revise_count=new_revise_count,
    )
    return {"phase": updated, "new_task": new_task}
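The ordering guarantees of `validate_roles`/`build_phase_order` can be shown in isolation. This is a standalone copy of the two functions from core/phases.py above, runnable without the package: user-supplied order and duplicates are discarded, the canonical RESEARCH_ROLES order wins, and architect always closes the workflow.

```python
RESEARCH_ROLES = [
    "business_analyst", "market_researcher", "legal_researcher",
    "tech_researcher", "ux_designer", "marketer", "architect",
]

def validate_roles(roles):
    seen, result = set(), []
    for r in roles:
        r = r.strip().lower()
        if r == "architect":          # auto-added later, never user-selected
            continue
        if r in RESEARCH_ROLES and r not in seen:
            seen.add(r)
            result.append(r)
    return result

def build_phase_order(selected):
    # Re-sort into canonical order, then append the closing architect phase.
    ordered = [r for r in RESEARCH_ROLES if r in selected and r != "architect"]
    if ordered:
        ordered.append("architect")
    return ordered

order = build_phase_order(
    validate_roles(["tech_researcher", "architect", "business_analyst", "tech_researcher"])
)
```

Here `business_analyst` ends up before `tech_researcher` even though the user listed them the other way round, the duplicate and the explicit `architect` are dropped, and architect is re-appended last.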
@@ -1132,4 +1132,136 @@ def test_sync_obsidian_after_patch_returns_sync_result_fields(client, tmp_path):
    assert r.status_code == 200
    data = r.json()
    assert "exported_decisions" in data
    assert "tasks_updated" in data


# ---------------------------------------------------------------------------
# KIN-016 — GET /api/notifications — escalations from blocked agents
# ---------------------------------------------------------------------------

def test_kin016_notifications_empty_when_no_blocked_tasks(client):
    """KIN-016: GET /api/notifications returns [] when no tasks are blocked."""
    r = client.get("/api/notifications")
    assert r.status_code == 200
    assert r.json() == []


def test_kin016_notifications_returns_blocked_task_as_escalation(client):
    """KIN-016: a blocked task shows up in /api/notifications with the correct fields."""
    from core.db import init_db
    from core import models
    conn = init_db(api_module.DB_PATH)
    models.update_task(
        conn, "P1-001",
        status="blocked",
        blocked_reason="cannot access external API",
        blocked_at="2026-03-16T10:00:00",
        blocked_agent_role="debugger",
        blocked_pipeline_step="1",
    )
    conn.close()

    r = client.get("/api/notifications")
    assert r.status_code == 200
    items = r.json()
    assert len(items) == 1

    item = items[0]
    assert item["task_id"] == "P1-001"
    assert item["agent_role"] == "debugger"
    assert item["reason"] == "cannot access external API"
    assert item["pipeline_step"] == "1"
    assert item["blocked_at"] == "2026-03-16T10:00:00"


def test_kin016_notifications_contains_project_id_and_title(client):
    """KIN-016: a notification carries the task's project_id and title."""
    from core.db import init_db
    from core import models
    conn = init_db(api_module.DB_PATH)
    models.update_task(conn, "P1-001", status="blocked",
                       blocked_reason="out of scope",
                       blocked_agent_role="architect")
    conn.close()

    r = client.get("/api/notifications")
    assert r.status_code == 200
    item = r.json()[0]
    assert item["project_id"] == "p1"
    assert item["title"] == "Fix bug"


def test_kin016_notifications_filters_by_project_id(client):
    """KIN-016: ?project_id= filters notifications by project."""
    from core.db import init_db
    from core import models
    conn = init_db(api_module.DB_PATH)
    # Create a second project with a blocked task
    models.create_project(conn, "p2", "P2", "/p2")
    models.create_task(conn, "P2-001", "p2", "Another task")
    models.update_task(conn, "P1-001", status="blocked",
                       blocked_reason="reason A", blocked_agent_role="debugger")
    models.update_task(conn, "P2-001", status="blocked",
                       blocked_reason="reason B", blocked_agent_role="tester")
    conn.close()

    r = client.get("/api/notifications?project_id=p1")
    assert r.status_code == 200
    items = r.json()
    assert all(i["project_id"] == "p1" for i in items)
    assert len(items) == 1
    assert items[0]["task_id"] == "P1-001"


def test_kin016_notifications_only_returns_blocked_status(client):
    """KIN-016: tasks in pending/review/done status do NOT appear in notifications."""
    from core.db import init_db
    from core import models
    conn = init_db(api_module.DB_PATH)
    # The task stays pending (the default)
    assert models.get_task(conn, "P1-001")["status"] == "pending"
    conn.close()

    r = client.get("/api/notifications")
    assert r.status_code == 200
    assert r.json() == []


def test_kin016_pipeline_blocked_agent_stops_next_steps_integration(client):
    """KIN-016: after a blocked pipeline the task is blocked and /api/notifications shows it.

    Integration test: pipeline → blocked → /api/notifications contains the task.
    """
    import json
    from unittest.mock import patch, MagicMock

    blocked_output = json.dumps({
        "result": json.dumps({"status": "blocked", "reason": "no repo access"}),
    })
    mock_proc = MagicMock()
    mock_proc.pid = 123

    with patch("web.api.subprocess.Popen") as mock_popen:
        mock_popen.return_value = mock_proc
        r = client.post("/api/tasks/P1-001/run")
        assert r.status_code == 202

    # Manually mark the task blocked (simulating the pipeline result)
    from core.db import init_db
    from core import models
    conn = init_db(api_module.DB_PATH)
    models.update_task(
        conn, "P1-001",
        status="blocked",
        blocked_reason="no repo access",
        blocked_agent_role="debugger",
        blocked_pipeline_step="1",
    )
    conn.close()

    r = client.get("/api/notifications")
    assert r.status_code == 200
    items = r.json()
    assert len(items) == 1
    assert items[0]["task_id"] == "P1-001"
    assert items[0]["reason"] == "no repo access"
    assert items[0]["agent_role"] == "debugger"
@@ -265,3 +265,83 @@ class TestReviseContext:
        prompt = format_prompt(ctx, "backend_dev", "You are a developer.")
        assert "## Director's revision request:" not in prompt
        assert "## Your previous output (before revision):" not in prompt


# ---------------------------------------------------------------------------
# KIN-071: project_type and SSH context
# ---------------------------------------------------------------------------

class TestOperationsProject:
    """KIN-071: operations project_type propagates to context and prompt."""

    @pytest.fixture
    def ops_conn(self):
        c = init_db(":memory:")
        models.create_project(
            c, "srv", "My Server", "",
            project_type="operations",
            ssh_host="10.0.0.1",
            ssh_user="root",
            ssh_key_path="~/.ssh/id_rsa",
            ssh_proxy_jump="jumpt",
        )
        models.create_task(c, "SRV-001", "srv", "Scan server")
        yield c
        c.close()

    def test_slim_project_includes_project_type(self, ops_conn):
        """KIN-071: _slim_project includes project_type."""
        ctx = build_context(ops_conn, "SRV-001", "sysadmin", "srv")
        assert ctx["project"]["project_type"] == "operations"

    def test_slim_project_includes_ssh_fields_for_operations(self, ops_conn):
        """KIN-071: _slim_project includes the ssh_* fields for operations projects."""
        ctx = build_context(ops_conn, "SRV-001", "sysadmin", "srv")
        proj = ctx["project"]
        assert proj["ssh_host"] == "10.0.0.1"
        assert proj["ssh_user"] == "root"
        assert proj["ssh_key_path"] == "~/.ssh/id_rsa"
        assert proj["ssh_proxy_jump"] == "jumpt"

    def test_slim_project_no_ssh_fields_for_development(self):
        """KIN-071: a development project gets no ssh_* fields in the slim dict."""
        c = init_db(":memory:")
        models.create_project(c, "dev", "Dev", "/path")
        models.create_task(c, "DEV-001", "dev", "A task")
        ctx = build_context(c, "DEV-001", "backend_dev", "dev")
        assert "ssh_host" not in ctx["project"]
        c.close()

    def test_sysadmin_context_gets_decisions_and_modules(self, ops_conn):
        """KIN-071: the sysadmin role receives all decisions and modules."""
        models.add_module(ops_conn, "srv", "nginx", "service", "/etc/nginx")
        models.add_decision(ops_conn, "srv", "gotcha", "Port 80 in use", "conflict")
        ctx = build_context(ops_conn, "SRV-001", "sysadmin", "srv")
        assert "decisions" in ctx
        assert "modules" in ctx
        assert len(ctx["modules"]) == 1

    def test_format_prompt_includes_ssh_connection_section(self, ops_conn):
        """KIN-071: format_prompt adds '## SSH Connection' for operations projects."""
        ctx = build_context(ops_conn, "SRV-001", "sysadmin", "srv")
        prompt = format_prompt(ctx, "sysadmin", "You are sysadmin.")
        assert "## SSH Connection" in prompt
        assert "10.0.0.1" in prompt
        assert "root" in prompt
        assert "jumpt" in prompt

    def test_format_prompt_no_ssh_section_for_development(self):
        """KIN-071: a development project gets no SSH section in the prompt."""
        c = init_db(":memory:")
        models.create_project(c, "dev", "Dev", "/path")
        models.create_task(c, "DEV-001", "dev", "A task")
        ctx = build_context(c, "DEV-001", "backend_dev", "dev")
        prompt = format_prompt(ctx, "backend_dev", "You are a dev.")
        assert "## SSH Connection" not in prompt
        c.close()

    def test_format_prompt_includes_project_type(self, ops_conn):
        """KIN-071: format_prompt includes Project type in the project section."""
        ctx = build_context(ops_conn, "SRV-001", "sysadmin", "srv")
        prompt = format_prompt(ctx, "sysadmin", "You are sysadmin.")
        assert "Project type: operations" in prompt
@@ -55,6 +55,40 @@ def test_update_project_tech_stack_json(conn):
    assert updated["tech_stack"] == ["python", "fastapi"]


# -- project_type and SSH fields (KIN-071) --

def test_create_operations_project(conn):
    """KIN-071: operations project stores SSH fields."""
    p = models.create_project(
        conn, "srv1", "My Server", "",
        project_type="operations",
        ssh_host="10.0.0.1",
        ssh_user="root",
        ssh_key_path="~/.ssh/id_rsa",
        ssh_proxy_jump="jumpt",
    )
    assert p["project_type"] == "operations"
    assert p["ssh_host"] == "10.0.0.1"
    assert p["ssh_user"] == "root"
    assert p["ssh_key_path"] == "~/.ssh/id_rsa"
    assert p["ssh_proxy_jump"] == "jumpt"


def test_create_development_project_defaults(conn):
    """KIN-071: development is the default project_type."""
    p = models.create_project(conn, "devp", "Dev Project", "/path")
    assert p["project_type"] == "development"
    assert p["ssh_host"] is None


def test_update_project_ssh_fields(conn):
    """KIN-071: update_project can set SSH fields."""
    models.create_project(conn, "srv2", "Server 2", "", project_type="operations")
    updated = models.update_project(conn, "srv2", ssh_host="192.168.1.1", ssh_user="pelmen")
    assert updated["ssh_host"] == "192.168.1.1"
    assert updated["ssh_user"] == "pelmen"


# -- validate_completion_mode (KIN-063) --

def test_validate_completion_mode_valid_auto_complete():
@@ -9,6 +9,7 @@ from core import models
from agents.runner import (
    run_agent, run_pipeline, run_audit, _try_parse_json, _run_learning_extraction,
    _build_claude_env, _resolve_claude_cmd, _EXTRA_PATH_DIRS, _run_autocommit,
    _parse_agent_blocked,
)
@@ -1694,3 +1695,224 @@ class TestAuditLogDangerousSkip:
            "SELECT * FROM audit_log WHERE task_id='VDOL-001'"
        ).fetchall()
        assert len(rows) == 0


# ---------------------------------------------------------------------------
# KIN-016: Blocked Protocol
# ---------------------------------------------------------------------------

class TestParseAgentBlocked:
    def test_returns_none_on_failure(self):
        result = {"success": False, "output": {"status": "blocked", "reason": "no access"}}
        assert _parse_agent_blocked(result) is None

    def test_returns_none_when_output_not_dict(self):
        result = {"success": True, "output": "plain text output"}
        assert _parse_agent_blocked(result) is None

    def test_returns_none_when_status_not_blocked(self):
        result = {"success": True, "output": {"status": "done", "changes": []}}
        assert _parse_agent_blocked(result) is None

    def test_detects_status_blocked(self):
        result = {"success": True, "output": {"status": "blocked", "reason": "no file access"}}
        blocked = _parse_agent_blocked(result)
        assert blocked is not None
        assert blocked["reason"] == "no file access"
        assert blocked["blocked_at"] is not None

    def test_detects_verdict_blocked(self):
        """reviewer.md uses verdict: blocked instead of status: blocked."""
        result = {"success": True, "output": {"verdict": "blocked", "blocked_reason": "unreadable"}}
        blocked = _parse_agent_blocked(result)
        assert blocked is not None
        assert blocked["reason"] == "unreadable"

    def test_uses_provided_blocked_at(self):
        result = {"success": True, "output": {
            "status": "blocked", "reason": "out of scope",
            "blocked_at": "2026-03-16T10:00:00",
        }}
        blocked = _parse_agent_blocked(result)
        assert blocked["blocked_at"] == "2026-03-16T10:00:00"

    def test_falls_back_blocked_at_if_missing(self):
        result = {"success": True, "output": {"status": "blocked", "reason": "x"}}
        blocked = _parse_agent_blocked(result)
        assert "T" in blocked["blocked_at"]  # ISO-8601 with T separator

    def test_does_not_check_nested_status(self):
        """Nested status='blocked' in sub-fields must NOT trigger the blocked protocol."""
        result = {"success": True, "output": {
            "status": "done",
            "changes": [{"file": "a.py", "status": "blocked"}],  # nested -- must be ignored
        }}
        assert _parse_agent_blocked(result) is None


class TestPipelineBlockedProtocol:
    @patch("agents.runner._run_autocommit")
    @patch("agents.runner.subprocess.run")
    def test_pipeline_stops_on_semantic_blocked(self, mock_run, mock_autocommit, conn):
        """KIN-016: when an agent returns status='blocked', the pipeline stops."""
        # First step returns semantic blocked
        mock_run.return_value = _mock_claude_success({
            "result": json.dumps({"status": "blocked", "reason": "cannot access external API"}),
        })

        steps = [
            {"role": "debugger", "brief": "find bug"},
            {"role": "tester", "brief": "verify"},
        ]
        result = run_pipeline(conn, "VDOL-001", steps)

        assert result["success"] is False
        assert result["steps_completed"] == 0
        assert "blocked" in result["error"]
        assert result["blocked_by"] == "debugger"
        assert result["blocked_reason"] == "cannot access external API"

        # Task marked as blocked with enriched fields
        task = models.get_task(conn, "VDOL-001")
        assert task["status"] == "blocked"
        assert task["blocked_reason"] == "cannot access external API"
        assert task["blocked_agent_role"] == "debugger"
        assert task["blocked_pipeline_step"] == "1"
        assert task["blocked_at"] is not None

        # Pipeline marked as failed
        pipe = conn.execute("SELECT * FROM pipelines WHERE task_id='VDOL-001'").fetchone()
        assert pipe["status"] == "failed"

    @patch("agents.runner._run_autocommit")
    @patch("agents.runner.subprocess.run")
    def test_pipeline_blocks_on_second_step(self, mock_run, mock_autocommit, conn):
        """KIN-016: blocked on step 2 -> steps_completed=1, pipeline_step='2'."""
        mock_run.side_effect = [
            _mock_claude_success({"result": json.dumps({"status": "done", "changes": []})}),
            _mock_claude_success({"result": json.dumps({
                "status": "blocked", "reason": "test environment unavailable",
            })}),
        ]

        steps = [
            {"role": "backend_dev", "brief": "implement"},
            {"role": "tester", "brief": "test"},
        ]
        result = run_pipeline(conn, "VDOL-001", steps)

        assert result["success"] is False
        assert result["steps_completed"] == 1
        assert result["blocked_by"] == "tester"

        task = models.get_task(conn, "VDOL-001")
        assert task["blocked_agent_role"] == "tester"
        assert task["blocked_pipeline_step"] == "2"

    @patch("agents.runner._run_autocommit")
    @patch("agents.runner.subprocess.run")
    def test_reviewer_verdict_blocked_stops_pipeline(self, mock_run, mock_autocommit, conn):
        """KIN-016: the reviewer returns verdict='blocked' -> the pipeline stops."""
        mock_run.return_value = _mock_claude_success({
            "result": json.dumps({
                "verdict": "blocked", "status": "blocked",
                "reason": "cannot read implementation files",
            }),
        })

        steps = [{"role": "reviewer", "brief": "review"}]
        result = run_pipeline(conn, "VDOL-001", steps)

        assert result["success"] is False
        assert result["blocked_by"] == "reviewer"

        task = models.get_task(conn, "VDOL-001")
        assert task["status"] == "blocked"
        assert task["blocked_agent_role"] == "reviewer"


# ---------------------------------------------------------------------------
# KIN-071: _save_sysadmin_output
# ---------------------------------------------------------------------------

class TestSaveSysadminOutput:
    """KIN-071: _save_sysadmin_output parses and saves decisions + modules."""

    @pytest.fixture
    def ops_conn(self):
        c = init_db(":memory:")
        models.create_project(
            c, "srv", "Server", "",
            project_type="operations",
            ssh_host="10.0.0.1",
        )
        models.create_task(c, "SRV-001", "srv", "Scan server")
        yield c
        c.close()

    def test_saves_decisions_and_modules(self, ops_conn):
        """KIN-071: sysadmin output correctly saves decisions and modules."""
        from agents.runner import _save_sysadmin_output
        output = {
            "status": "done",
            "decisions": [
                {"type": "gotcha", "title": "Port 8080 open", "description": "nginx on 8080", "tags": ["server"]},
                {"type": "decision", "title": "Docker used", "description": "docker 24.0", "tags": ["docker"]},
            ],
            "modules": [
                {"name": "nginx", "type": "service", "path": "/etc/nginx", "description": "web proxy"},
            ],
        }
        result = _save_sysadmin_output(
            ops_conn, "srv", "SRV-001",
            {"raw_output": json.dumps(output)}
        )
        assert result["decisions_added"] == 2
        assert result["modules_added"] == 1

        decisions = models.get_decisions(ops_conn, "srv")
        assert len(decisions) == 2
        modules = models.get_modules(ops_conn, "srv")
        assert len(modules) == 1
        assert modules[0]["name"] == "nginx"

    def test_idempotent_on_duplicate_decisions(self, ops_conn):
        """KIN-071: a repeated call does not create duplicates."""
        from agents.runner import _save_sysadmin_output
        output = {
            "decisions": [
                {"type": "gotcha", "title": "Port 8080 open", "description": "nginx on 8080"},
            ],
            "modules": [],
        }
        r1 = _save_sysadmin_output(ops_conn, "srv", "SRV-001", {"raw_output": json.dumps(output)})
        r2 = _save_sysadmin_output(ops_conn, "srv", "SRV-001", {"raw_output": json.dumps(output)})
        assert r1["decisions_added"] == 1
        assert r2["decisions_added"] == 0  # deduped
        assert r2["decisions_skipped"] == 1

    def test_idempotent_on_duplicate_modules(self, ops_conn):
        """KIN-071: a repeated call does not create duplicate modules."""
        from agents.runner import _save_sysadmin_output
        output = {
            "decisions": [],
            "modules": [{"name": "nginx", "type": "service", "path": "/etc/nginx"}],
        }
        r1 = _save_sysadmin_output(ops_conn, "srv", "SRV-001", {"raw_output": json.dumps(output)})
        r2 = _save_sysadmin_output(ops_conn, "srv", "SRV-001", {"raw_output": json.dumps(output)})
        assert r1["modules_added"] == 1
        assert r2["modules_skipped"] == 1
        assert len(models.get_modules(ops_conn, "srv")) == 1

    def test_handles_non_json_output(self, ops_conn):
        """KIN-071: non-JSON output does not raise an exception."""
        from agents.runner import _save_sysadmin_output
        result = _save_sysadmin_output(ops_conn, "srv", "SRV-001", {"raw_output": "not json"})
        assert result["decisions_added"] == 0
        assert result["modules_added"] == 0

    def test_handles_empty_output(self, ops_conn):
        """KIN-071: empty output does not raise an exception."""
        from agents.runner import _save_sysadmin_output
        result = _save_sysadmin_output(ops_conn, "srv", "SRV-001", {"raw_output": ""})
        assert result["decisions_added"] == 0
212 web/api.py
@@ -112,6 +112,44 @@ def list_projects(status: str | None = None):
    return summary


class NewProjectCreate(BaseModel):
    id: str
    name: str
    path: str
    description: str
    roles: list[str]
    tech_stack: list[str] | None = None
    priority: int = 5
    language: str = "ru"


@app.post("/api/projects/new")
def new_project_with_phases(body: NewProjectCreate):
    """Create project + sequential research phases (KIN-059)."""
    from core.phases import create_project_with_phases, validate_roles
    clean_roles = validate_roles(body.roles)
    if not clean_roles:
        raise HTTPException(400, "At least one research role must be selected (excluding architect)")
    conn = get_conn()
    if models.get_project(conn, body.id):
        conn.close()
        raise HTTPException(409, f"Project '{body.id}' already exists")
    try:
        result = create_project_with_phases(
            conn, body.id, body.name, body.path,
            description=body.description,
            selected_roles=clean_roles,
            tech_stack=body.tech_stack,
            priority=body.priority,
            language=body.language,
        )
    except ValueError as e:
        conn.close()
        raise HTTPException(400, str(e))
    conn.close()
    return result


@app.get("/api/projects/{project_id}")
def get_project(project_id: str):
    conn = get_conn()

@@ -126,13 +164,21 @@ def get_project(project_id: str):
    return {**p, "tasks": tasks, "modules": mods, "decisions": decisions}


VALID_PROJECT_TYPES = {"development", "operations", "research"}


class ProjectCreate(BaseModel):
    id: str
    name: str
-   path: str
+   path: str = ""
    tech_stack: list[str] | None = None
    status: str = "active"
    priority: int = 5
    project_type: str = "development"
    ssh_host: str | None = None
    ssh_user: str | None = None
    ssh_key_path: str | None = None
    ssh_proxy_jump: str | None = None


class ProjectPatch(BaseModel):

@@ -140,14 +186,28 @@ class ProjectPatch(BaseModel):
    autocommit_enabled: bool | None = None
    obsidian_vault_path: str | None = None
    deploy_command: str | None = None
    project_type: str | None = None
    ssh_host: str | None = None
    ssh_user: str | None = None
    ssh_key_path: str | None = None
    ssh_proxy_jump: str | None = None


@app.patch("/api/projects/{project_id}")
def patch_project(project_id: str, body: ProjectPatch):
-   if body.execution_mode is None and body.autocommit_enabled is None and body.obsidian_vault_path is None and body.deploy_command is None:
-       raise HTTPException(400, "Nothing to update. Provide execution_mode, autocommit_enabled, obsidian_vault_path, or deploy_command.")
+   has_any = any([
+       body.execution_mode, body.autocommit_enabled is not None,
+       body.obsidian_vault_path, body.deploy_command is not None,
+       body.project_type, body.ssh_host is not None,
+       body.ssh_user is not None, body.ssh_key_path is not None,
+       body.ssh_proxy_jump is not None,
+   ])
+   if not has_any:
+       raise HTTPException(400, "Nothing to update.")
    if body.execution_mode is not None and body.execution_mode not in VALID_EXECUTION_MODES:
        raise HTTPException(400, f"Invalid execution_mode '{body.execution_mode}'. Must be one of: {', '.join(VALID_EXECUTION_MODES)}")
    if body.project_type is not None and body.project_type not in VALID_PROJECT_TYPES:
        raise HTTPException(400, f"Invalid project_type '{body.project_type}'. Must be one of: {', '.join(VALID_PROJECT_TYPES)}")
    conn = get_conn()
    p = models.get_project(conn, project_id)
    if not p:

@@ -163,6 +223,16 @@ def patch_project(project_id: str, body: ProjectPatch):
    if body.deploy_command is not None:
        # Empty string = sentinel for clearing (decision #68)
        fields["deploy_command"] = None if body.deploy_command == "" else body.deploy_command
    if body.project_type is not None:
        fields["project_type"] = body.project_type
    if body.ssh_host is not None:
        fields["ssh_host"] = body.ssh_host
    if body.ssh_user is not None:
        fields["ssh_user"] = body.ssh_user
    if body.ssh_key_path is not None:
        fields["ssh_key_path"] = body.ssh_key_path
    if body.ssh_proxy_jump is not None:
        fields["ssh_proxy_jump"] = body.ssh_proxy_jump
    models.update_project(conn, project_id, **fields)
    p = models.get_project(conn, project_id)
    conn.close()

@@ -229,6 +299,8 @@ def deploy_project(project_id: str):

@app.post("/api/projects")
def create_project(body: ProjectCreate):
    if body.project_type not in VALID_PROJECT_TYPES:
        raise HTTPException(400, f"Invalid project_type '{body.project_type}'. Must be one of: {', '.join(VALID_PROJECT_TYPES)}")
    conn = get_conn()
    if models.get_project(conn, body.id):
        conn.close()

@@ -236,11 +308,105 @@ def create_project(body: ProjectCreate):
    p = models.create_project(
        conn, body.id, body.name, body.path,
        tech_stack=body.tech_stack, status=body.status, priority=body.priority,
        project_type=body.project_type,
        ssh_host=body.ssh_host,
        ssh_user=body.ssh_user,
        ssh_key_path=body.ssh_key_path,
        ssh_proxy_jump=body.ssh_proxy_jump,
    )
    conn.close()
    return p


# ---------------------------------------------------------------------------
# Phases (KIN-059)
# ---------------------------------------------------------------------------

@app.get("/api/projects/{project_id}/phases")
def get_project_phases(project_id: str):
    """List research phases for a project, with task data joined."""
    conn = get_conn()
    p = models.get_project(conn, project_id)
    if not p:
        conn.close()
        raise HTTPException(404, f"Project '{project_id}' not found")
    phases = models.list_phases(conn, project_id)
    result = []
    for phase in phases:
        task = models.get_task(conn, phase["task_id"]) if phase.get("task_id") else None
        result.append({**phase, "task": task})
    conn.close()
    return result


class PhaseApprove(BaseModel):
    comment: str | None = None


class PhaseReject(BaseModel):
    reason: str


class PhaseRevise(BaseModel):
    comment: str


@app.post("/api/phases/{phase_id}/approve")
def approve_phase(phase_id: int, body: PhaseApprove | None = None):
    """Approve a research phase and activate the next one."""
    from core.phases import approve_phase as _approve
    conn = get_conn()
    phase = models.get_phase(conn, phase_id)
    if not phase:
        conn.close()
        raise HTTPException(404, f"Phase {phase_id} not found")
    try:
        result = _approve(conn, phase_id)
    except ValueError as e:
        conn.close()
        raise HTTPException(400, str(e))
    conn.close()
    return result


@app.post("/api/phases/{phase_id}/reject")
def reject_phase(phase_id: int, body: PhaseReject):
    """Reject a research phase."""
    from core.phases import reject_phase as _reject
    conn = get_conn()
    phase = models.get_phase(conn, phase_id)
    if not phase:
        conn.close()
        raise HTTPException(404, f"Phase {phase_id} not found")
    try:
        result = _reject(conn, phase_id, body.reason)
    except ValueError as e:
        conn.close()
        raise HTTPException(400, str(e))
    conn.close()
    return result


@app.post("/api/phases/{phase_id}/revise")
def revise_phase(phase_id: int, body: PhaseRevise):
    """Request revision for a research phase."""
    from core.phases import revise_phase as _revise
    if not body.comment.strip():
        raise HTTPException(400, "comment is required")
    conn = get_conn()
    phase = models.get_phase(conn, phase_id)
    if not phase:
        conn.close()
        raise HTTPException(404, f"Phase {phase_id} not found")
    try:
        result = _revise(conn, phase_id, body.comment)
    except ValueError as e:
        conn.close()
        raise HTTPException(400, str(e))
    conn.close()
    return result


# ---------------------------------------------------------------------------
# Tasks
# ---------------------------------------------------------------------------

@@ -716,6 +882,46 @@ def bootstrap(body: BootstrapRequest):
    }


# ---------------------------------------------------------------------------
# Notifications (escalations from blocked agents)
# ---------------------------------------------------------------------------

@app.get("/api/notifications")
def get_notifications(project_id: str | None = None):
    """Return tasks with status='blocked' as escalation notifications.

    Each item includes task details, the agent role that blocked it,
    the reason, and the pipeline step. Intended for GUI polling (5s interval).

    TODO: Telegram -- send notification on new escalation (telegram_sent: false placeholder).
    """
    conn = get_conn()
    query = "SELECT * FROM tasks WHERE status = 'blocked'"
    params: list = []
    if project_id:
        query += " AND project_id = ?"
        params.append(project_id)
    query += " ORDER BY blocked_at DESC, updated_at DESC"
    rows = conn.execute(query, params).fetchall()
    conn.close()

    notifications = []
    for row in rows:
        t = dict(row)
        notifications.append({
            "task_id": t["id"],
            "project_id": t["project_id"],
            "title": t.get("title"),
            "agent_role": t.get("blocked_agent_role"),
            "reason": t.get("blocked_reason"),
            "pipeline_step": t.get("blocked_pipeline_step"),
            "blocked_at": t.get("blocked_at") or t.get("updated_at"),
            # TODO: Telegram -- set to True once notification is sent via Telegram bot
            "telegram_sent": False,
        })
    return notifications


# ---------------------------------------------------------------------------
# SPA static files (AFTER all /api/ routes)
# ---------------------------------------------------------------------------
@@ -1,4 +1,5 @@
<script setup lang="ts">
import EscalationBanner from './components/EscalationBanner.vue'
</script>

<template>

@@ -8,6 +9,7 @@
      Kin
    </router-link>
    <nav class="flex items-center gap-4">
      <EscalationBanner />
      <router-link to="/settings" class="text-xs text-gray-400 hover:text-gray-200 no-underline">Settings</router-link>
      <span class="text-xs text-gray-600">multi-agent orchestrator</span>
    </nav>
@@ -49,6 +49,12 @@ export interface Project {
  active_tasks: number
  blocked_tasks: number
  review_tasks: number
  project_type: string | null
  ssh_host: string | null
  ssh_user: string | null
  ssh_key_path: string | null
  ssh_proxy_jump: string | null
  description: string | null
}

export interface ObsidianSyncResult {

@@ -148,6 +154,40 @@ export interface CostEntry {
  total_duration_seconds: number
}

export interface Phase {
  id: number
  project_id: string
  role: string
  phase_order: number
  status: string
  task_id: string | null
  revise_count: number
  created_at: string
  updated_at: string
  task?: Task | null
}

export interface NewProjectPayload {
  id: string
  name: string
  path: string
  description: string
  roles: string[]
  tech_stack?: string[]
  priority?: number
  language?: string
  project_type?: string
  ssh_host?: string
  ssh_user?: string
  ssh_key_path?: string
  ssh_proxy_jump?: string
}

export interface NewProjectResult {
  project: Project
  phases: Phase[]
}

export interface AuditItem {
  id: string
  reason: string

@@ -163,6 +203,16 @@ export interface AuditResult {
  error?: string
}

export interface EscalationNotification {
  task_id: string
  project_id: string
  agent_role: string
  reason: string
  pipeline_step: string | null
  blocked_at: string
  telegram_sent: boolean
}

export const api = {
  projects: () => get<Project[]>('/projects'),
  project: (id: string) => get<ProjectDetail>(`/projects/${id}`),

@@ -170,7 +220,7 @@ export const api = {
  taskFull: (id: string) => get<TaskFull>(`/tasks/${id}/full`),
  taskPipeline: (id: string) => get<PipelineStep[]>(`/tasks/${id}/pipeline`),
  cost: (days = 7) => get<CostEntry[]>(`/cost?days=${days}`),
- createProject: (data: { id: string; name: string; path: string; tech_stack?: string[]; priority?: number }) =>
+ createProject: (data: { id: string; name: string; path?: string; tech_stack?: string[]; priority?: number; project_type?: string; ssh_host?: string; ssh_user?: string; ssh_key_path?: string; ssh_proxy_jump?: string }) =>
    post<Project>('/projects', data),
  createTask: (data: { project_id: string; title: string; priority?: number; route_type?: string; category?: string }) =>
    post<Task>('/tasks', data),

@@ -192,7 +242,7 @@ export const api = {
    post<{ updated: string[]; count: number }>(`/projects/${projectId}/audit/apply`, { task_ids: taskIds }),
  patchTask: (id: string, data: { status?: string; execution_mode?: string; priority?: number; route_type?: string; title?: string; brief_text?: string }) =>
    patch<Task>(`/tasks/${id}`, data),
- patchProject: (id: string, data: { execution_mode?: string; autocommit_enabled?: boolean; obsidian_vault_path?: string; deploy_command?: string }) =>
+ patchProject: (id: string, data: { execution_mode?: string; autocommit_enabled?: boolean; obsidian_vault_path?: string; deploy_command?: string; project_type?: string; ssh_host?: string; ssh_user?: string; ssh_key_path?: string; ssh_proxy_jump?: string }) =>
    patch<Project>(`/projects/${id}`, data),
  deployProject: (projectId: string) =>
    post<DeployResult>(`/projects/${projectId}/deploy`, {}),

@@ -202,4 +252,16 @@ export const api = {
    del<{ deleted: number }>(`/projects/${projectId}/decisions/${decisionId}`),
  createDecision: (data: { project_id: string; type: string; title: string; description: string; category?: string; tags?: string[] }) =>
    post<Decision>('/decisions', data),
  newProject: (data: NewProjectPayload) =>
    post<NewProjectResult>('/projects/new', data),
  getPhases: (projectId: string) =>
    get<Phase[]>(`/projects/${projectId}/phases`),
  notifications: (projectId?: string) =>
    get<EscalationNotification[]>(`/notifications${projectId ? `?project_id=${projectId}` : ''}`),
  approvePhase: (phaseId: number, comment?: string) =>
    post<{ phase: Phase; next_phase: Phase | null }>(`/phases/${phaseId}/approve`, { comment }),
  rejectPhase: (phaseId: number, reason: string) =>
    post<Phase>(`/phases/${phaseId}/reject`, { reason }),
  revisePhase: (phaseId: number, comment: string) =>
    post<{ phase: Phase; new_task: Task }>(`/phases/${phaseId}/revise`, { comment }),
}
127 web/frontend/src/components/EscalationBanner.vue
Normal file
@@ -0,0 +1,127 @@
<script setup lang="ts">
import { ref, computed, onMounted, onUnmounted } from 'vue'
import { api, type EscalationNotification } from '../api'

const STORAGE_KEY = 'kin_dismissed_escalations'

const notifications = ref<EscalationNotification[]>([])
const showPanel = ref(false)
let pollTimer: ReturnType<typeof setInterval> | null = null

function loadDismissed(): Set<string> {
  try {
    const raw = localStorage.getItem(STORAGE_KEY)
    return new Set(raw ? JSON.parse(raw) : [])
  } catch {
    return new Set()
  }
}

function saveDismissed(ids: Set<string>) {
  localStorage.setItem(STORAGE_KEY, JSON.stringify([...ids]))
}

const dismissed = ref<Set<string>>(loadDismissed())

const visible = computed(() =>
  notifications.value.filter(n => !dismissed.value.has(n.task_id))
)

async function load() {
  try {
    notifications.value = await api.notifications()
  } catch {
    // silent: don't break the layout when the endpoint is unavailable
  }
}

function dismiss(taskId: string) {
  dismissed.value = new Set([...dismissed.value, taskId])
  saveDismissed(dismissed.value)
  if (visible.value.length === 0) showPanel.value = false
}

function dismissAll() {
  const newSet = new Set([...dismissed.value, ...visible.value.map(n => n.task_id)])
  dismissed.value = newSet
  saveDismissed(newSet)
  showPanel.value = false
}

function formatTime(iso: string): string {
  try {
    return new Date(iso).toLocaleString('ru-RU', { day: '2-digit', month: '2-digit', hour: '2-digit', minute: '2-digit' })
  } catch {
    return iso
  }
}

onMounted(async () => {
  await load()
  pollTimer = setInterval(load, 10000)
})

onUnmounted(() => {
  if (pollTimer) clearInterval(pollTimer)
})
</script>

<template>
  <div class="relative">
    <!-- Badge button: visible only when there are active escalations -->
    <button
      v-if="visible.length > 0"
      @click="showPanel = !showPanel"
      class="relative flex items-center gap-1.5 px-2.5 py-1 text-xs bg-red-900/50 text-red-400 border border-red-800 rounded hover:bg-red-900 transition-colors"
    >
      <span class="inline-block w-1.5 h-1.5 bg-red-500 rounded-full animate-pulse"></span>
      Эскалации
      <span class="ml-0.5 font-bold">{{ visible.length }}</span>
    </button>

    <!-- Notifications panel -->
    <div
      v-if="showPanel && visible.length > 0"
      class="absolute right-0 top-full mt-2 w-96 bg-gray-900 border border-red-900/60 rounded-lg shadow-2xl z-50"
    >
      <div class="flex items-center justify-between px-4 py-2.5 border-b border-gray-800">
        <span class="text-xs font-semibold text-red-400">Эскалации — требуется решение</span>
        <div class="flex items-center gap-2">
          <button
            @click="dismissAll"
            class="text-xs text-gray-500 hover:text-gray-300"
          >Принять все</button>
          <button @click="showPanel = false" class="text-gray-500 hover:text-gray-300 text-lg leading-none">×</button>
        </div>
      </div>

      <div class="max-h-80 overflow-y-auto divide-y divide-gray-800">
        <div
          v-for="n in visible"
          :key="n.task_id"
          class="px-4 py-3"
        >
          <div class="flex items-start justify-between gap-2">
            <div class="flex-1 min-w-0">
              <div class="flex items-center gap-1.5 mb-1">
                <span class="text-xs font-mono text-red-400 shrink-0">{{ n.task_id }}</span>
                <span class="text-xs text-gray-500">·</span>
                <span class="text-xs text-orange-400 shrink-0">{{ n.agent_role }}</span>
                <span v-if="n.pipeline_step" class="text-xs text-gray-600 truncate">@ {{ n.pipeline_step }}</span>
              </div>
              <p class="text-xs text-gray-300 leading-snug break-words">{{ n.reason }}</p>
              <p class="text-xs text-gray-600 mt-1">{{ formatTime(n.blocked_at) }}</p>
            </div>
            <button
              @click="dismiss(n.task_id)"
              class="shrink-0 px-2 py-1 text-xs bg-gray-800 text-gray-400 border border-gray-700 rounded hover:bg-gray-700 hover:text-gray-200"
            >Принято</button>
          </div>
        </div>
      </div>
    </div>

    <!-- Overlay that closes the panel on outside click -->
    <div v-if="showPanel" class="fixed inset-0 z-40" @click="showPanel = false"></div>
  </div>
</template>
@@ -11,7 +11,10 @@ const error = ref('')

 // Add project modal
 const showAdd = ref(false)
-const form = ref({ id: '', name: '', path: '', tech_stack: '', priority: 5 })
+const form = ref({
+  id: '', name: '', path: '', tech_stack: '', priority: 5,
+  project_type: 'development', ssh_host: '', ssh_user: 'root', ssh_key_path: '', ssh_proxy_jump: '',
+})
 const formError = ref('')

 // Bootstrap modal
@@ -20,6 +23,23 @@ const bsForm = ref({ id: '', name: '', path: '' })
 const bsError = ref('')
 const bsResult = ref('')

+// New Project with Research modal
+const RESEARCH_ROLES = [
+  { key: 'business_analyst', label: 'Business Analyst', hint: 'бизнес-модель, аудитория, монетизация' },
+  { key: 'market_researcher', label: 'Market Researcher', hint: 'конкуренты, ниша, сильные/слабые стороны' },
+  { key: 'legal_researcher', label: 'Legal Researcher', hint: 'юрисдикция, лицензии, KYC/AML, GDPR' },
+  { key: 'tech_researcher', label: 'Tech Researcher', hint: 'API, ограничения, стоимость, альтернативы' },
+  { key: 'ux_designer', label: 'UX Designer', hint: 'анализ UX конкурентов, user journey, wireframes' },
+  { key: 'marketer', label: 'Marketer', hint: 'стратегия продвижения, SEO, conversion-паттерны' },
+]
+const showNewProject = ref(false)
+const npForm = ref({
+  id: '', name: '', path: '', description: '', tech_stack: '', priority: 5, language: 'ru',
+})
+const npRoles = ref<string[]>(['business_analyst', 'market_researcher', 'tech_researcher'])
+const npError = ref('')
+const npSaving = ref(false)
+
 async function load() {
   try {
     loading.value = true
@@ -66,11 +86,35 @@ function statusColor(s: string) {

 async function addProject() {
   formError.value = ''
+  if (form.value.project_type === 'operations' && !form.value.ssh_host) {
+    formError.value = 'SSH host is required for operations projects'
+    return
+  }
+  if (form.value.project_type !== 'operations' && !form.value.path) {
+    formError.value = 'Path is required'
+    return
+  }
   try {
     const ts = form.value.tech_stack ? form.value.tech_stack.split(',').map(s => s.trim()).filter(Boolean) : undefined
-    await api.createProject({ ...form.value, tech_stack: ts, priority: form.value.priority })
+    const payload: Parameters<typeof api.createProject>[0] = {
+      id: form.value.id,
+      name: form.value.name,
+      tech_stack: ts,
+      priority: form.value.priority,
+      project_type: form.value.project_type,
+    }
+    if (form.value.project_type !== 'operations') {
+      payload.path = form.value.path
+    } else {
+      payload.path = ''
+      if (form.value.ssh_host) payload.ssh_host = form.value.ssh_host
+      if (form.value.ssh_user) payload.ssh_user = form.value.ssh_user
+      if (form.value.ssh_key_path) payload.ssh_key_path = form.value.ssh_key_path
+      if (form.value.ssh_proxy_jump) payload.ssh_proxy_jump = form.value.ssh_proxy_jump
+    }
+    await api.createProject(payload)
     showAdd.value = false
-    form.value = { id: '', name: '', path: '', tech_stack: '', priority: 5 }
+    form.value = { id: '', name: '', path: '', tech_stack: '', priority: 5, project_type: 'development', ssh_host: '', ssh_user: 'root', ssh_key_path: '', ssh_proxy_jump: '' }
     await load()
   } catch (e: any) {
     formError.value = e.message
@@ -88,6 +132,42 @@ async function runBootstrap() {
     bsError.value = e.message
   }
 }
+
+function toggleNpRole(key: string) {
+  const idx = npRoles.value.indexOf(key)
+  if (idx >= 0) npRoles.value.splice(idx, 1)
+  else npRoles.value.push(key)
+}
+
+async function createNewProject() {
+  npError.value = ''
+  if (!npRoles.value.length) {
+    npError.value = 'Выберите хотя бы одну роль'
+    return
+  }
+  npSaving.value = true
+  try {
+    const ts = npForm.value.tech_stack ? npForm.value.tech_stack.split(',').map(s => s.trim()).filter(Boolean) : undefined
+    await api.newProject({
+      id: npForm.value.id,
+      name: npForm.value.name,
+      path: npForm.value.path,
+      description: npForm.value.description,
+      roles: npRoles.value,
+      tech_stack: ts,
+      priority: npForm.value.priority,
+      language: npForm.value.language,
+    })
+    showNewProject.value = false
+    npForm.value = { id: '', name: '', path: '', description: '', tech_stack: '', priority: 5, language: 'ru' }
+    npRoles.value = ['business_analyst', 'market_researcher', 'tech_researcher']
+    await load()
+  } catch (e: any) {
+    npError.value = e.message
+  } finally {
+    npSaving.value = false
+  }
+}
 </script>

 <template>
@@ -102,9 +182,13 @@ async function runBootstrap() {
           class="px-3 py-1.5 text-xs bg-purple-900/50 text-purple-400 border border-purple-800 rounded hover:bg-purple-900">
     Bootstrap
   </button>
+  <button @click="showNewProject = true"
+          class="px-3 py-1.5 text-xs bg-green-900/50 text-green-400 border border-green-800 rounded hover:bg-green-900">
+    + New Project
+  </button>
   <button @click="showAdd = true"
           class="px-3 py-1.5 text-xs bg-gray-800 text-gray-300 border border-gray-700 rounded hover:bg-gray-700">
-    + Project
+    + Blank
   </button>
 </div>
 </div>
@@ -122,6 +206,9 @@ async function runBootstrap() {
 <div class="flex items-center gap-2">
   <span class="text-sm font-semibold text-gray-200">{{ p.id }}</span>
   <Badge :text="p.status" :color="statusColor(p.status)" />
+  <Badge v-if="p.project_type && p.project_type !== 'development'"
+         :text="p.project_type"
+         :color="p.project_type === 'operations' ? 'orange' : 'green'" />
   <span class="text-sm text-gray-400">{{ p.name }}</span>
 </div>
 <div class="flex items-center gap-3 text-xs text-gray-500">
@@ -152,8 +239,39 @@ async function runBootstrap() {
        class="w-full bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600" />
 <input v-model="form.name" placeholder="Name" required
        class="w-full bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600" />
-<input v-model="form.path" placeholder="Path (e.g. ~/projects/myproj)" required
+<!-- Project type selector -->
+<div>
+  <p class="text-xs text-gray-500 mb-1.5">Тип проекта:</p>
+  <div class="flex gap-2">
+    <button v-for="t in ['development', 'operations', 'research']" :key="t"
+            type="button"
+            @click="form.project_type = t"
+            class="flex-1 py-1.5 text-xs border rounded transition-colors"
+            :class="form.project_type === t
+              ? t === 'development' ? 'bg-blue-900/40 text-blue-300 border-blue-700'
+              : t === 'operations' ? 'bg-orange-900/40 text-orange-300 border-orange-700'
+              : 'bg-green-900/40 text-green-300 border-green-700'
+              : 'bg-gray-900 text-gray-500 border-gray-800 hover:text-gray-300 hover:border-gray-600'"
+    >{{ t }}</button>
+  </div>
+</div>
+<!-- Path (development / research) -->
+<input v-if="form.project_type !== 'operations'"
+       v-model="form.path" placeholder="Path (e.g. ~/projects/myproj)"
        class="w-full bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600" />
+<!-- SSH fields (operations) -->
+<template v-if="form.project_type === 'operations'">
+  <input v-model="form.ssh_host" placeholder="SSH host (e.g. 192.168.1.1)" required
+         class="w-full bg-gray-800 border border-orange-800/60 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600" />
+  <div class="grid grid-cols-2 gap-2">
+    <input v-model="form.ssh_user" placeholder="SSH user (e.g. root)"
+           class="bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600" />
+    <input v-model="form.ssh_key_path" placeholder="Key path (e.g. ~/.ssh/id_rsa)"
+           class="bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600" />
+  </div>
+  <input v-model="form.ssh_proxy_jump" placeholder="ProxyJump (optional, e.g. jumpt)"
+         class="w-full bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600" />
+</template>
 <input v-model="form.tech_stack" placeholder="Tech stack (comma-separated)"
        class="w-full bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600" />
 <input v-model.number="form.priority" type="number" min="1" max="10" placeholder="Priority (1-10)"
@@ -166,6 +284,52 @@ async function runBootstrap() {
 </form>
 </Modal>

+<!-- New Project with Research Modal -->
+<Modal v-if="showNewProject" title="New Project — Start Research" @close="showNewProject = false">
+  <form @submit.prevent="createNewProject" class="space-y-3">
+    <div class="grid grid-cols-2 gap-2">
+      <input v-model="npForm.id" placeholder="ID (e.g. myapp)" required
+             class="bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600" />
+      <input v-model="npForm.name" placeholder="Name" required
+             class="bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600" />
+    </div>
+    <input v-model="npForm.path" placeholder="Path (e.g. ~/projects/myapp)"
+           class="w-full bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600" />
+    <textarea v-model="npForm.description" placeholder="Описание проекта (свободный текст для агентов)" required rows="4"
+              class="w-full bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600 resize-none"></textarea>
+    <input v-model="npForm.tech_stack" placeholder="Tech stack (comma-separated, optional)"
+           class="w-full bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600" />
+    <div>
+      <p class="text-xs text-gray-500 mb-2">Этапы research (Architect добавляется автоматически последним):</p>
+      <div class="space-y-1.5">
+        <label v-for="r in RESEARCH_ROLES" :key="r.key"
+               class="flex items-start gap-2 cursor-pointer group">
+          <input type="checkbox"
+                 :checked="npRoles.includes(r.key)"
+                 @change="toggleNpRole(r.key)"
+                 class="mt-0.5 accent-green-500 cursor-pointer" />
+          <div>
+            <span class="text-sm text-gray-300 group-hover:text-gray-100">{{ r.label }}</span>
+            <span class="text-xs text-gray-600 ml-1">— {{ r.hint }}</span>
+          </div>
+        </label>
+        <label class="flex items-start gap-2 opacity-50">
+          <input type="checkbox" checked disabled class="mt-0.5" />
+          <div>
+            <span class="text-sm text-gray-400">Architect</span>
+            <span class="text-xs text-gray-600 ml-1">— blueprint на основе одобренных исследований</span>
+          </div>
+        </label>
+      </div>
+    </div>
+    <p v-if="npError" class="text-red-400 text-xs">{{ npError }}</p>
+    <button type="submit" :disabled="npSaving"
+            class="w-full py-2 bg-green-900/50 text-green-400 border border-green-800 rounded text-sm hover:bg-green-900 disabled:opacity-50">
+      {{ npSaving ? 'Starting...' : 'Start Research' }}
+    </button>
+  </form>
+</Modal>
+
 <!-- Bootstrap Modal -->
 <Modal v-if="showBootstrap" title="Bootstrap Project" @close="showBootstrap = false">
   <form @submit.prevent="runBootstrap" class="space-y-3">

@@ -1,7 +1,7 @@
 <script setup lang="ts">
 import { ref, onMounted, computed, watch } from 'vue'
 import { useRoute, useRouter } from 'vue-router'
-import { api, type ProjectDetail, type AuditResult } from '../api'
+import { api, type ProjectDetail, type AuditResult, type Phase } from '../api'
 import Badge from '../components/Badge.vue'
 import Modal from '../components/Modal.vue'

@@ -12,7 +12,93 @@ const router = useRouter()
 const project = ref<ProjectDetail | null>(null)
 const loading = ref(true)
 const error = ref('')
-const activeTab = ref<'tasks' | 'decisions' | 'modules'>('tasks')
+const activeTab = ref<'tasks' | 'phases' | 'decisions' | 'modules'>('tasks')
+
+// Phases
+const phases = ref<Phase[]>([])
+const phasesLoading = ref(false)
+const phaseError = ref('')
+const showReviseModal = ref(false)
+const revisePhaseId = ref<number | null>(null)
+const reviseComment = ref('')
+const reviseError = ref('')
+const reviseSaving = ref(false)
+const showRejectModal = ref(false)
+const rejectPhaseId = ref<number | null>(null)
+const rejectReason = ref('')
+const rejectError = ref('')
+const rejectSaving = ref(false)
+
+async function loadPhases() {
+  phasesLoading.value = true
+  phaseError.value = ''
+  try {
+    phases.value = await api.getPhases(props.id)
+  } catch (e: any) {
+    phaseError.value = e.message
+  } finally {
+    phasesLoading.value = false
+  }
+}
+
+async function approvePhase(phaseId: number) {
+  try {
+    await api.approvePhase(phaseId)
+    await loadPhases()
+  } catch (e: any) {
+    phaseError.value = e.message
+  }
+}
+
+function openRevise(phaseId: number) {
+  revisePhaseId.value = phaseId
+  reviseComment.value = ''
+  reviseError.value = ''
+  showReviseModal.value = true
+}
+
+async function submitRevise() {
+  if (!reviseComment.value.trim()) { reviseError.value = 'Comment required'; return }
+  reviseSaving.value = true
+  try {
+    await api.revisePhase(revisePhaseId.value!, reviseComment.value)
+    showReviseModal.value = false
+    await loadPhases()
+  } catch (e: any) {
+    reviseError.value = e.message
+  } finally {
+    reviseSaving.value = false
+  }
+}
+
+function openReject(phaseId: number) {
+  rejectPhaseId.value = phaseId
+  rejectReason.value = ''
+  rejectError.value = ''
+  showRejectModal.value = true
+}
+
+async function submitReject() {
+  if (!rejectReason.value.trim()) { rejectError.value = 'Reason required'; return }
+  rejectSaving.value = true
+  try {
+    await api.rejectPhase(rejectPhaseId.value!, rejectReason.value)
+    showRejectModal.value = false
+    await loadPhases()
+  } catch (e: any) {
+    rejectError.value = e.message
+  } finally {
+    rejectSaving.value = false
+  }
+}
+
+function phaseStatusColor(s: string) {
+  const m: Record<string, string> = {
+    pending: 'gray', active: 'blue', approved: 'green',
+    rejected: 'red', revising: 'yellow',
+  }
+  return m[s] || 'gray'
+}

 // Filters
 const ALL_TASK_STATUSES = ['pending', 'in_progress', 'review', 'blocked', 'decomposed', 'done', 'cancelled']
@@ -149,7 +235,12 @@ watch(selectedStatuses, (val) => {
   router.replace({ query: { ...route.query, status: val.length ? val.join(',') : undefined } })
 }, { deep: true })

-onMounted(async () => { await load(); loadMode(); loadAutocommit() })
+onMounted(async () => {
+  await load()
+  loadMode()
+  loadAutocommit()
+  await loadPhases()
+})

 const taskCategories = computed(() => {
   if (!project.value) return []
@@ -288,16 +379,29 @@ async function addDecision() {
   <h1 class="text-xl font-bold text-gray-100">{{ project.id }}</h1>
   <span class="text-gray-400">{{ project.name }}</span>
   <Badge :text="project.status" :color="project.status === 'active' ? 'green' : 'gray'" />
+  <Badge v-if="project.project_type && project.project_type !== 'development'"
+         :text="project.project_type"
+         :color="project.project_type === 'operations' ? 'orange' : 'green'" />
 </div>
 <div class="flex gap-2 flex-wrap mb-2" v-if="project.tech_stack?.length">
   <Badge v-for="t in project.tech_stack" :key="t" :text="t" color="purple" />
 </div>
-<p class="text-xs text-gray-600">{{ project.path }}</p>
+<!-- Path (development / research) -->
+<p v-if="project.project_type !== 'operations'" class="text-xs text-gray-600">{{ project.path }}</p>
+<!-- SSH info (operations) -->
+<div v-if="project.project_type === 'operations'" class="flex flex-wrap gap-3 text-xs text-gray-600">
+  <span v-if="project.ssh_host">
+    <span class="text-gray-500">SSH:</span>
+    <span class="text-orange-400 ml-1">{{ project.ssh_user || 'root' }}@{{ project.ssh_host }}</span>
+  </span>
+  <span v-if="project.ssh_key_path">key: {{ project.ssh_key_path }}</span>
+  <span v-if="project.ssh_proxy_jump">via: {{ project.ssh_proxy_jump }}</span>
+</div>
 </div>

 <!-- Tabs -->
 <div class="flex gap-1 mb-4 border-b border-gray-800">
-  <button v-for="tab in (['tasks', 'decisions', 'modules'] as const)" :key="tab"
+  <button v-for="tab in (['tasks', 'phases', 'decisions', 'modules'] as const)" :key="tab"
           @click="activeTab = tab"
           class="px-4 py-2 text-sm border-b-2 transition-colors"
           :class="activeTab === tab
@@ -306,6 +410,7 @@ async function addDecision() {
     {{ tab.charAt(0).toUpperCase() + tab.slice(1) }}
     <span class="text-xs text-gray-600 ml-1">
       {{ tab === 'tasks' ? project.tasks.length
+        : tab === 'phases' ? phases.length
        : tab === 'decisions' ? project.decisions.length
        : project.modules.length }}
     </span>
@@ -449,6 +554,83 @@ async function addDecision() {
   </div>
 </div>

+<!-- Phases Tab -->
+<div v-if="activeTab === 'phases'">
+  <p v-if="phasesLoading" class="text-gray-500 text-sm">Loading phases...</p>
+  <p v-else-if="phaseError" class="text-red-400 text-sm">{{ phaseError }}</p>
+  <div v-else-if="phases.length === 0" class="text-gray-600 text-sm">
+    No research phases. Use "New Project" to start a research workflow.
+  </div>
+  <div v-else class="space-y-2">
+    <div v-for="ph in phases" :key="ph.id"
+         class="px-4 py-3 border rounded transition-colors"
+         :class="{
+           'border-blue-700 bg-blue-950/20': ph.status === 'active',
+           'border-green-800 bg-green-950/10': ph.status === 'approved',
+           'border-red-800 bg-red-950/10': ph.status === 'rejected',
+           'border-yellow-700 bg-yellow-950/10': ph.status === 'revising',
+           'border-gray-800': ph.status === 'pending',
+         }">
+      <div class="flex items-center justify-between">
+        <div class="flex items-center gap-2">
+          <span class="text-xs text-gray-600 w-5 text-right">{{ ph.phase_order + 1 }}.</span>
+          <Badge :text="ph.status" :color="phaseStatusColor(ph.status)" />
+          <span class="text-sm font-medium text-gray-200">{{ ph.role.replace(/_/g, ' ') }}</span>
+          <span v-if="ph.revise_count" class="text-xs text-yellow-500">(revise ×{{ ph.revise_count }})</span>
+        </div>
+        <div class="flex items-center gap-2">
+          <router-link v-if="ph.task_id"
+                       :to="`/task/${ph.task_id}`"
+                       class="text-xs text-blue-400 hover:text-blue-300 no-underline">
+            {{ ph.task_id }}
+          </router-link>
+          <template v-if="ph.status === 'active'">
+            <button @click="approvePhase(ph.id)"
+                    class="px-2 py-0.5 text-xs bg-green-900/40 text-green-400 border border-green-800 rounded hover:bg-green-900">
+              Approve
+            </button>
+            <button @click="openRevise(ph.id)"
+                    class="px-2 py-0.5 text-xs bg-yellow-900/40 text-yellow-400 border border-yellow-800 rounded hover:bg-yellow-900">
+              Revise
+            </button>
+            <button @click="openReject(ph.id)"
+                    class="px-2 py-0.5 text-xs bg-red-900/40 text-red-400 border border-red-800 rounded hover:bg-red-900">
+              Reject
+            </button>
+          </template>
+        </div>
+      </div>
+      <div v-if="ph.task?.title" class="mt-1 text-xs text-gray-500 ml-7">{{ ph.task.title }}</div>
+    </div>
+  </div>
+
+  <!-- Revise Modal -->
+  <Modal v-if="showReviseModal" title="Request Revision" @close="showReviseModal = false">
+    <div class="space-y-3">
+      <textarea v-model="reviseComment" placeholder="Что нужно доработать?" rows="4"
+                class="w-full bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600 resize-none"></textarea>
+      <p v-if="reviseError" class="text-red-400 text-xs">{{ reviseError }}</p>
+      <button @click="submitRevise" :disabled="reviseSaving"
+              class="w-full py-2 bg-yellow-900/50 text-yellow-400 border border-yellow-800 rounded text-sm hover:bg-yellow-900 disabled:opacity-50">
+        {{ reviseSaving ? 'Saving...' : 'Send for Revision' }}
+      </button>
+    </div>
+  </Modal>
+
+  <!-- Reject Modal -->
+  <Modal v-if="showRejectModal" title="Reject Phase" @close="showRejectModal = false">
+    <div class="space-y-3">
+      <textarea v-model="rejectReason" placeholder="Причина отклонения" rows="3"
+                class="w-full bg-gray-800 border border-gray-700 rounded px-3 py-2 text-sm text-gray-200 placeholder-gray-600 resize-none"></textarea>
+      <p v-if="rejectError" class="text-red-400 text-xs">{{ rejectError }}</p>
+      <button @click="submitReject" :disabled="rejectSaving"
+              class="w-full py-2 bg-red-900/50 text-red-400 border border-red-800 rounded text-sm hover:bg-red-900 disabled:opacity-50">
+        {{ rejectSaving ? 'Saving...' : 'Reject' }}
+      </button>
+    </div>
+  </Modal>
+</div>
+
 <!-- Decisions Tab -->
 <div v-if="activeTab === 'decisions'">
   <div class="flex items-center justify-between mb-3">
