From 09483c9f071da2320fda2223a66f5cae48659181 Mon Sep 17 00:00:00 2001 From: catlog22 Date: Sun, 21 Dec 2025 21:10:55 +0800 Subject: [PATCH] =?UTF-8?q?feat:=20=E6=B7=BB=E5=8A=A0=E6=99=BA=E8=83=BD?= =?UTF-8?q?=E4=BB=A3=E7=A0=81=E6=B8=85=E7=90=86=E5=91=BD=E4=BB=A4=EF=BC=8C?= =?UTF-8?q?=E6=94=AF=E6=8C=81=E4=B8=BB=E7=BA=BF=E6=A3=80=E6=B5=8B=E5=92=8C?= =?UTF-8?q?=E8=BF=87=E6=97=B6=E5=B7=A5=E4=BB=B6=E5=8F=91=E7=8E=B0?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit --- .claude/commands/clean.md | 516 ++++++++++++++++ .claude/commands/workflow/session/complete.md | 574 ++++-------------- 2 files changed, 629 insertions(+), 461 deletions(-) create mode 100644 .claude/commands/clean.md diff --git a/.claude/commands/clean.md b/.claude/commands/clean.md new file mode 100644 index 00000000..99b079f5 --- /dev/null +++ b/.claude/commands/clean.md @@ -0,0 +1,516 @@ +--- +name: clean +description: Intelligent code cleanup with mainline detection, stale artifact discovery, and safe execution +argument-hint: "[--dry-run] [\"focus area\"]" +allowed-tools: TodoWrite(*), Task(*), AskUserQuestion(*), Read(*), Glob(*), Bash(*), Write(*) +--- + +# Clean Command (/clean) + +## Overview + +Intelligent cleanup command that explores the codebase to identify the development mainline, discovers artifacts that have drifted from it, and safely removes stale sessions, abandoned documents, and dead code. + +**Core capabilities:** +- Mainline detection: Identify active development branches and core modules +- Drift analysis: Find sessions, documents, and code that deviate from mainline +- Intelligent discovery: cli-explore-agent based artifact scanning +- Safe execution: Confirmation-based cleanup with dry-run preview + +## Usage + +```bash +/clean # Full intelligent cleanup (explore → analyze → confirm → execute) +/clean --dry-run # Explore and analyze only, no execution +/clean "auth module" # Focus cleanup on specific area +``` + +## Execution Process + +``` +Phase 1: Mainline Detection + ├─ Analyze git history for development trends + ├─ Identify core modules (high commit frequency) + ├─ Map active vs stale branches + └─ Build mainline profile + +Phase 2: Drift Discovery (cli-explore-agent) + ├─ Scan workflow sessions for orphaned artifacts + ├─ Identify documents drifted from mainline + ├─ Detect dead code and unused exports + └─ Generate cleanup manifest + +Phase 3: Confirmation + ├─ Display cleanup summary by category + ├─ Show impact analysis (files, size, risk) + └─ AskUserQuestion: Select categories to clean + +Phase 4: Execution (unless --dry-run) + ├─ Execute cleanup by category + ├─ Update manifests and indexes + └─ Report results +``` + +## Implementation + +### Phase 1: Mainline Detection + +**Session Setup**: +```javascript +const getUtc8ISOString = () => new Date(Date.now() + 8 * 60 * 60 * 1000).toISOString() + +const dateStr = getUtc8ISOString().substring(0, 10) +const sessionId = `clean-${dateStr}` +const sessionFolder = `.workflow/.clean/${sessionId}` + +Bash(`mkdir -p ${sessionFolder}`) +``` + +**Step 1.1: Git History Analysis** +```bash +# Get commit frequency by directory (last 30 days) +bash(git log --since="30 days ago" --name-only --pretty=format: | grep -v "^$" | cut -d/ -f1-2 | sort | uniq -c | sort -rn | head -20) + +# Get recent active branches +bash(git for-each-ref --sort=-committerdate refs/heads/ --format='%(refname:short) %(committerdate:relative)' | head -10) + +# Get files with most recent changes +bash(git log --since="7 days ago" --name-only 
--pretty=format: | grep -v "^$" | sort | uniq -c | sort -rn | head -30) +``` + +**Step 1.2: Build Mainline Profile** +```javascript +const mainlineProfile = { + coreModules: [], // High-frequency directories + activeFiles: [], // Recently modified files + activeBranches: [], // Branches with recent commits + staleThreshold: { + sessions: 7, // Days + branches: 30, + documents: 14 + }, + timestamp: getUtc8ISOString() +} + +// Parse git log output to identify core modules +// Modules with >5 commits in last 30 days = core +// Modules with 0 commits in last 30 days = potentially stale + +Write(`${sessionFolder}/mainline-profile.json`, JSON.stringify(mainlineProfile, null, 2)) +``` + +--- + +### Phase 2: Drift Discovery + +**Launch cli-explore-agent for intelligent artifact scanning**: + +```javascript +Task( + subagent_type="cli-explore-agent", + run_in_background=false, + description="Discover stale artifacts", + prompt=` +## Task Objective +Discover artifacts that have drifted from the development mainline. Identify stale sessions, abandoned documents, and dead code for cleanup. + +## Context +- **Session Folder**: ${sessionFolder} +- **Mainline Profile**: ${sessionFolder}/mainline-profile.json +- **Focus Area**: ${focusArea || "全项目"} + +## Discovery Categories + +### Category 1: Stale Workflow Sessions +Scan and analyze workflow session directories: + +**Locations to scan**: +- .workflow/active/WFS-* (active sessions) +- .workflow/archives/WFS-* (archived sessions) +- .workflow/.lite-plan/* (lite-plan sessions) +- .workflow/.debug/DBG-* (debug sessions) + +**Staleness criteria**: +- Active sessions: No modification >7 days + no related git commits +- Archives: >30 days old + no feature references in project.json +- Lite-plan: >7 days old + plan.json not executed +- Debug: >3 days old + issue not in recent commits + +**Analysis steps**: +1. List all session directories with modification times +2. Cross-reference with git log (are session topics in recent commits?) +3. Check manifest.json for orphan entries +4. Identify sessions with .archiving marker (interrupted) + +### Category 2: Drifted Documents +Scan documentation that no longer aligns with code: + +**Locations to scan**: +- .claude/rules/tech/* (generated tech rules) +- .workflow/.scratchpad/* (temporary notes) +- **/CLAUDE.md (module documentation) +- **/README.md (outdated descriptions) + +**Drift criteria**: +- Tech rules: Referenced files no longer exist +- Scratchpad: Any file (always temporary) +- Module docs: Describe functions/classes that were removed +- READMEs: Reference deleted directories + +**Analysis steps**: +1. Parse document content for file/function references +2. Verify referenced entities still exist in codebase +3. Flag documents with >30% broken references + +### Category 3: Dead Code +Identify code that is no longer used: + +**Scan patterns**: +- Unused exports (exported but never imported) +- Orphan files (not imported anywhere) +- Commented-out code blocks (>10 lines) +- TODO/FIXME comments >90 days old + +**Analysis steps**: +1. Build import graph using rg/grep +2. Identify exports with no importers +3. Find files not in import graph +4. 
Scan for large comment blocks + +## Output Format + +Write to: ${sessionFolder}/cleanup-manifest.json + +\`\`\`json +{ + "generated_at": "ISO timestamp", + "mainline_summary": { + "core_modules": ["src/core", "src/api"], + "active_branches": ["main", "feature/auth"], + "health_score": 0.85 + }, + "discoveries": { + "stale_sessions": [ + { + "path": ".workflow/active/WFS-old-feature", + "type": "active", + "age_days": 15, + "reason": "No related commits in 15 days", + "size_kb": 1024, + "risk": "low" + } + ], + "drifted_documents": [ + { + "path": ".claude/rules/tech/deprecated-lib", + "type": "tech_rules", + "broken_references": 5, + "total_references": 6, + "drift_percentage": 83, + "reason": "Referenced library removed", + "risk": "low" + } + ], + "dead_code": [ + { + "path": "src/utils/legacy.ts", + "type": "orphan_file", + "reason": "Not imported by any file", + "last_modified": "2025-10-01", + "risk": "medium" + } + ] + }, + "summary": { + "total_items": 12, + "total_size_mb": 45.2, + "by_category": { + "stale_sessions": 5, + "drifted_documents": 4, + "dead_code": 3 + }, + "by_risk": { + "low": 8, + "medium": 3, + "high": 1 + } + } +} +\`\`\` + +## Execution Commands + +\`\`\`bash +# Session directories +find .workflow -type d -name "WFS-*" -o -name "DBG-*" 2>/dev/null + +# Check modification times (Linux/Mac) +stat -c "%Y %n" .workflow/active/WFS-* 2>/dev/null + +# Check modification times (Windows PowerShell via bash) +powershell -Command "Get-ChildItem '.workflow/active/WFS-*' | ForEach-Object { Write-Output \"$($_.LastWriteTime) $($_.FullName)\" }" + +# Find orphan exports (TypeScript) +rg "export (const|function|class|interface|type)" --type ts -l + +# Find imports +rg "import.*from" --type ts + +# Find large comment blocks +rg "^\\s*/\\*" -A 10 --type ts + +# Find old TODOs +rg "TODO|FIXME" --type ts -n +\`\`\` + +## Success Criteria +- [ ] All session directories scanned with age calculation +- [ ] Documents cross-referenced with existing code +- [ ] Dead code detection via import graph analysis +- [ ] cleanup-manifest.json written with complete data +- [ ] Each item has risk level and cleanup reason +` +) +``` + +--- + +### Phase 3: Confirmation + +**Step 3.1: Display Summary** +```javascript +const manifest = JSON.parse(Read(`${sessionFolder}/cleanup-manifest.json`)) + +console.log(` +## Cleanup Discovery Report + +**Mainline Health**: ${Math.round(manifest.mainline_summary.health_score * 100)}% +**Core Modules**: ${manifest.mainline_summary.core_modules.join(', ')} + +### Summary +| Category | Count | Size | Risk | +|----------|-------|------|------| +| Stale Sessions | ${manifest.summary.by_category.stale_sessions} | - | ${getRiskSummary('sessions')} | +| Drifted Documents | ${manifest.summary.by_category.drifted_documents} | - | ${getRiskSummary('documents')} | +| Dead Code | ${manifest.summary.by_category.dead_code} | - | ${getRiskSummary('code')} | + +**Total**: ${manifest.summary.total_items} items, ~${manifest.summary.total_size_mb} MB + +### Stale Sessions +${manifest.discoveries.stale_sessions.map(s => + `- ${s.path} (${s.age_days}d, ${s.risk}): ${s.reason}` +).join('\n')} + +### Drifted Documents +${manifest.discoveries.drifted_documents.map(d => + `- ${d.path} (${d.drift_percentage}% broken, ${d.risk}): ${d.reason}` +).join('\n')} + +### Dead Code +${manifest.discoveries.dead_code.map(c => + `- ${c.path} (${c.type}, ${c.risk}): ${c.reason}` +).join('\n')} +`) +``` + +**Step 3.2: Dry-Run Exit** +```javascript +if (flags.includes('--dry-run')) { + console.log(` +--- 
+**Dry-run mode**: No changes made. +Manifest saved to: ${sessionFolder}/cleanup-manifest.json + +To execute cleanup: /clean +`) + return +} +``` + +**Step 3.3: User Confirmation** +```javascript +AskUserQuestion({ + questions: [ + { + question: "Which categories to clean?", + header: "Categories", + multiSelect: true, + options: [ + { + label: "Sessions", + description: `${manifest.summary.by_category.stale_sessions} stale workflow sessions` + }, + { + label: "Documents", + description: `${manifest.summary.by_category.drifted_documents} drifted documents` + }, + { + label: "Dead Code", + description: `${manifest.summary.by_category.dead_code} unused code files` + } + ] + }, + { + question: "Risk level to include?", + header: "Risk", + multiSelect: false, + options: [ + { label: "Low only", description: "Safest - only obviously stale items" }, + { label: "Low + Medium", description: "Recommended - includes likely unused items" }, + { label: "All", description: "Aggressive - includes high-risk items" } + ] + } + ] +}) +``` + +--- + +### Phase 4: Execution + +**Step 4.1: Filter Items by Selection** +```javascript +const selectedCategories = userSelection.categories // ['Sessions', 'Documents', ...] +const riskLevel = userSelection.risk // 'Low only', 'Low + Medium', 'All' + +const riskFilter = { + 'Low only': ['low'], + 'Low + Medium': ['low', 'medium'], + 'All': ['low', 'medium', 'high'] +}[riskLevel] + +const itemsToClean = [] + +if (selectedCategories.includes('Sessions')) { + itemsToClean.push(...manifest.discoveries.stale_sessions.filter(s => riskFilter.includes(s.risk))) +} +if (selectedCategories.includes('Documents')) { + itemsToClean.push(...manifest.discoveries.drifted_documents.filter(d => riskFilter.includes(d.risk))) +} +if (selectedCategories.includes('Dead Code')) { + itemsToClean.push(...manifest.discoveries.dead_code.filter(c => riskFilter.includes(c.risk))) +} + +TodoWrite({ + todos: itemsToClean.map(item => ({ + content: `Clean: ${item.path}`, + status: "pending", + activeForm: `Cleaning ${item.path}` + })) +}) +``` + +**Step 4.2: Execute Cleanup** +```javascript +const results = { deleted: [], failed: [], skipped: [] } + +for (const item of itemsToClean) { + TodoWrite({ todos: [...] }) // Mark current as in_progress + + try { + if (item.type === 'orphan_file' || item.type === 'dead_export') { + // Dead code: Delete file or remove export + Bash({ command: `rm -rf "${item.path}"` }) + } else { + // Sessions and documents: Delete directory/file + Bash({ command: `rm -rf "${item.path}"` }) + } + + results.deleted.push(item.path) + TodoWrite({ todos: [...] 
}) // Mark as completed + } catch (error) { + results.failed.push({ path: item.path, error: error.message }) + } +} +``` + +**Step 4.3: Update Manifests** +```javascript +// Update archives manifest if sessions were deleted +if (selectedCategories.includes('Sessions')) { + const archiveManifestPath = '.workflow/archives/manifest.json' + if (fileExists(archiveManifestPath)) { + const archiveManifest = JSON.parse(Read(archiveManifestPath)) + const deletedSessionIds = results.deleted + .filter(p => p.includes('WFS-')) + .map(p => p.split('/').pop()) + + const updatedManifest = archiveManifest.filter(entry => + !deletedSessionIds.includes(entry.session_id) + ) + + Write(archiveManifestPath, JSON.stringify(updatedManifest, null, 2)) + } +} + +// Update project.json if features referenced deleted sessions +const projectPath = '.workflow/project.json' +if (fileExists(projectPath)) { + const project = JSON.parse(Read(projectPath)) + const deletedPaths = new Set(results.deleted) + + project.features = project.features.filter(f => + !deletedPaths.has(f.traceability?.archive_path) + ) + + project.statistics.total_features = project.features.length + project.statistics.last_updated = getUtc8ISOString() + + Write(projectPath, JSON.stringify(project, null, 2)) +} +``` + +**Step 4.4: Report Results** +```javascript +console.log(` +## Cleanup Complete + +**Deleted**: ${results.deleted.length} items +**Failed**: ${results.failed.length} items +**Skipped**: ${results.skipped.length} items + +### Deleted Items +${results.deleted.map(p => `- ${p}`).join('\n')} + +${results.failed.length > 0 ? ` +### Failed Items +${results.failed.map(f => `- ${f.path}: ${f.error}`).join('\n')} +` : ''} + +Cleanup manifest archived to: ${sessionFolder}/cleanup-manifest.json +`) +``` + +--- + +## Session Folder Structure + +``` +.workflow/.clean/{YYYY-MM-DD}/ +├── mainline-profile.json # Git history analysis +└── cleanup-manifest.json # Discovery results +``` + +## Risk Level Definitions + +| Risk | Description | Examples | +|------|-------------|----------| +| **Low** | Safe to delete, no dependencies | Empty sessions, scratchpad files, 100% broken docs | +| **Medium** | Likely unused, verify before delete | Orphan files, old archives, partially broken docs | +| **High** | May have hidden dependencies | Files with some imports, recent modifications | + +## Error Handling + +| Situation | Action | +|-----------|--------| +| No git repository | Skip mainline detection, use file timestamps only | +| Session in use (.archiving) | Skip with warning | +| Permission denied | Report error, continue with others | +| Manifest parse error | Regenerate from filesystem scan | +| Empty discovery | Report "codebase is clean" | + +## Related Commands + +- `/workflow:session:complete` - Properly archive active sessions +- `/memory:compact` - Save session memory before cleanup +- `/workflow:status` - View current workflow state diff --git a/.claude/commands/workflow/session/complete.md b/.claude/commands/workflow/session/complete.md index 29fa143e..fe3d5b28 100644 --- a/.claude/commands/workflow/session/complete.md +++ b/.claude/commands/workflow/session/complete.md @@ -8,494 +8,146 @@ examples: # Complete Workflow Session (/workflow:session:complete) -## Overview -Mark the currently active workflow session as complete, analyze it for lessons learned, move it to the archive directory, and remove the active flag marker. +Mark the currently active workflow session as complete, archive it, and update manifests. 
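
The manifest update mentioned above is a plain Read → Append → Write cycle on `.workflow/archives/manifest.json`. A minimal sketch, assuming the `Read`/`Write` tool helpers used throughout these command docs and an `archiveEntry` object built during Phase 2:

```javascript
// Minimal sketch of the manifest update (Read → Append → Write).
// Assumes: Read/Write tool helpers and an archiveEntry prepared in Phase 2.
let manifest = []
try {
  // Existing manifest is a JSON array of archive entries
  manifest = JSON.parse(Read('.workflow/archives/manifest.json'))
} catch {
  manifest = [] // First archive: manifest.json does not exist yet
}

manifest.push(archiveEntry)

Write('.workflow/archives/manifest.json', JSON.stringify(manifest, null, 2))
```

Phase 3 below performs the same step alongside the pre-defined bash commands.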
-## Usage -```bash -/workflow:session:complete # Complete current active session -/workflow:session:complete --detailed # Show detailed completion summary -``` - -## Implementation Flow - -### Phase 1: Pre-Archival Preparation (Transactional Setup) - -**Purpose**: Find active session, create archiving marker to prevent concurrent operations. Session remains in active location for agent processing. - -#### Step 1.1: Find Active Session and Get Name -```bash -# Find active session directory -bash(find .workflow/active/ -name "WFS-*" -type d | head -1) - -# Extract session name from directory path -bash(basename .workflow/active/WFS-session-name) -``` -**Output**: Session name `WFS-session-name` - -#### Step 1.2: Check for Existing Archiving Marker (Resume Detection) -```bash -# Check if session is already being archived -bash(test -f .workflow/active/WFS-session-name/.archiving && echo "RESUMING" || echo "NEW") -``` - -**If RESUMING**: -- Previous archival attempt was interrupted -- Skip to Phase 2 to resume agent analysis - -**If NEW**: -- Continue to Step 1.3 - -#### Step 1.3: Create Archiving Marker -```bash -# Mark session as "archiving in progress" -bash(touch .workflow/active/WFS-session-name/.archiving) -``` -**Purpose**: -- Prevents concurrent operations on this session -- Enables recovery if archival fails -- Session remains in `.workflow/active/` for agent analysis - -**Result**: Session still at `.workflow/active/WFS-session-name/` with `.archiving` marker - -### Phase 2: Agent Analysis (In-Place Processing) - -**Purpose**: Agent analyzes session WHILE STILL IN ACTIVE LOCATION. Generates metadata but does NOT move files or update manifest. - -#### Agent Invocation - -Invoke `universal-executor` agent to analyze session and prepare archive metadata. - -**Agent Task**: -``` -Task( - subagent_type="universal-executor", - run_in_background=false, - description="Analyze session for archival", - prompt=` -Analyze workflow session for archival preparation. Session is STILL in active location. - -## Context -- Session: .workflow/active/WFS-session-name/ -- Status: Marked as archiving (.archiving marker present) -- Location: Active sessions directory (NOT archived yet) - -## Tasks - -1. **Extract session data** from workflow-session.json - - session_id, description/topic, started_at, completed_at, status - - If status != "completed", update it with timestamp - -2. **Count files**: tasks (.task/*.json) and summaries (.summaries/*.md) - -3. **Extract review data** (if .review/ exists): - - Count dimension results: .review/dimensions/*.json - - Count deep-dive results: .review/iterations/*.json - - Extract findings summary from dimension JSONs (total, critical, high, medium, low) - - Check fix results if .review/fixes/ exists (fixed_count, failed_count) - - Build review_metrics: {dimensions_analyzed, total_findings, severity_distribution, fix_success_rate} - -4. **Generate lessons**: Use gemini with ~/.claude/workflows/cli-templates/prompts/archive/analysis-simple.txt - - Return: {successes, challenges, watch_patterns} - - If review data exists, include review-specific lessons (common issue patterns, effective fixes) - -5. **Build archive entry**: - - Calculate: duration_hours, success_rate, tags (3-5 keywords) - - Construct complete JSON with session_id, description, archived_at, metrics, tags, lessons - - Include archive_path: ".workflow/archives/WFS-session-name" (future location) - - If review data exists, include review_metrics in metrics object - -6. 
**Extract feature metadata** (for Phase 4): - - Parse IMPL_PLAN.md for title (first # heading) - - Extract description (first paragraph, max 200 chars) - - Generate feature tags (3-5 keywords from content) - -7. **Return result**: Complete metadata package for atomic commit - { - "status": "success", - "session_id": "WFS-session-name", - "archive_entry": { - "session_id": "...", - "description": "...", - "archived_at": "...", - "archive_path": ".workflow/archives/WFS-session-name", - "metrics": { - "duration_hours": 2.5, - "tasks_completed": 5, - "summaries_generated": 3, - "review_metrics": { // Optional, only if .review/ exists - "dimensions_analyzed": 4, - "total_findings": 15, - "severity_distribution": {"critical": 1, "high": 3, "medium": 8, "low": 3}, - "fix_success_rate": 0.87 // Optional, only if .review/fixes/ exists - } - }, - "tags": [...], - "lessons": {...} - }, - "feature_metadata": { - "title": "...", - "description": "...", - "tags": [...] - } - } - -## Important Constraints -- DO NOT move or delete any files -- DO NOT update manifest.json yet -- Session remains in .workflow/active/ during analysis -- Return complete metadata package for orchestrator to commit atomically - -## Error Handling -- On failure: return {"status": "error", "task": "...", "message": "..."} -- Do NOT modify any files on error - ` -) -``` - -**Expected Output**: -- Agent returns complete metadata package -- Session remains in `.workflow/active/` with `.archiving` marker -- No files moved or manifests updated yet - -### Phase 3: Atomic Commit (Transactional File Operations) - -**Purpose**: Atomically commit all changes. Only execute if Phase 2 succeeds. - -#### Step 3.1: Create Archive Directory -```bash -bash(mkdir -p .workflow/archives/) -``` - -#### Step 3.2: Move Session to Archive -```bash -bash(mv .workflow/active/WFS-session-name .workflow/archives/WFS-session-name) -``` -**Result**: Session now at `.workflow/archives/WFS-session-name/` - -#### Step 3.3: Update Manifest -```bash -# Read current manifest (or create empty array if not exists) -bash(test -f .workflow/archives/manifest.json && cat .workflow/archives/manifest.json || echo "[]") -``` - -**JSON Update Logic**: -```javascript -// Read agent result from Phase 2 -const agentResult = JSON.parse(agentOutput); -const archiveEntry = agentResult.archive_entry; - -// Read existing manifest -let manifest = []; -try { - const manifestContent = Read('.workflow/archives/manifest.json'); - manifest = JSON.parse(manifestContent); -} catch { - manifest = []; // Initialize if not exists -} - -// Append new entry -manifest.push(archiveEntry); - -// Write back -Write('.workflow/archives/manifest.json', JSON.stringify(manifest, null, 2)); -``` - -#### Step 3.4: Remove Archiving Marker -```bash -bash(rm .workflow/archives/WFS-session-name/.archiving) -``` -**Result**: Clean archived session without temporary markers - -**Output Confirmation**: -``` -✓ Session "${sessionId}" archived successfully - Location: .workflow/archives/WFS-session-name/ - Lessons: ${archiveEntry.lessons.successes.length} successes, ${archiveEntry.lessons.challenges.length} challenges - Manifest: Updated with ${manifest.length} total sessions - ${reviewMetrics ? `Review: ${reviewMetrics.total_findings} findings across ${reviewMetrics.dimensions_analyzed} dimensions, ${Math.round(reviewMetrics.fix_success_rate * 100)}% fixed` : ''} -``` - -### Phase 4: Update Project Feature Registry - -**Purpose**: Record completed session as a project feature in `.workflow/project.json`. 
- -**Execution**: Uses feature metadata from Phase 2 agent result to update project registry. - -#### Step 4.1: Check Project State Exists -```bash -bash(test -f .workflow/project.json && echo "EXISTS" || echo "SKIP") -``` - -**If SKIP**: Output warning and skip Phase 4 -``` -WARNING: No project.json found. Run /workflow:session:start to initialize. -``` - -#### Step 4.2: Extract Feature Information from Agent Result - -**Data Processing** (Uses Phase 2 agent output): -```javascript -// Extract feature metadata from agent result -const agentResult = JSON.parse(agentOutput); -const featureMeta = agentResult.feature_metadata; - -// Data already prepared by agent: -const title = featureMeta.title; -const description = featureMeta.description; -const tags = featureMeta.tags; - -// Create feature ID (lowercase slug) -const featureId = title.toLowerCase().replace(/[^a-z0-9]+/g, '-').substring(0, 50); -``` - -#### Step 4.3: Update project.json +## Pre-defined Commands ```bash -# Read current project state -bash(cat .workflow/project.json) +# Phase 1: Find active session +SESSION_PATH=$(find .workflow/active/ -maxdepth 1 -name "WFS-*" -type d | head -1) +SESSION_ID=$(basename "$SESSION_PATH") + +# Phase 3: Move to archive +mkdir -p .workflow/archives/ +mv .workflow/active/$SESSION_ID .workflow/archives/$SESSION_ID + +# Cleanup marker +rm -f .workflow/archives/$SESSION_ID/.archiving ``` -**JSON Update Logic**: -```javascript -// Read existing project.json (created by /workflow:init) -// Note: overview field is managed by /workflow:init, not modified here -const projectMeta = JSON.parse(Read('.workflow/project.json')); -const currentTimestamp = new Date().toISOString(); -const currentDate = currentTimestamp.split('T')[0]; // YYYY-MM-DD +## Key Files to Read -// Extract tags from IMPL_PLAN.md (simple keyword extraction) -const tags = extractTags(planContent); // e.g., ["auth", "security"] +**For manifest.json generation**, read ONLY these files: -// Build feature object with complete metadata -const newFeature = { - id: featureId, - title: title, - description: description, - status: "completed", - tags: tags, - timeline: { - created_at: currentTimestamp, - implemented_at: currentDate, - updated_at: currentTimestamp - }, - traceability: { - session_id: sessionId, - archive_path: archivePath, // e.g., ".workflow/archives/WFS-auth-system" - commit_hash: getLatestCommitHash() || "" // Optional: git rev-parse HEAD - }, - docs: [], // Placeholder for future doc links - relations: [] // Placeholder for feature dependencies -}; +| File | Extract | +|------|---------| +| `$SESSION_PATH/workflow-session.json` | session_id, description, started_at, status | +| `$SESSION_PATH/IMPL_PLAN.md` | title (first # heading), description (first paragraph) | +| `$SESSION_PATH/.tasks/*.json` | count files | +| `$SESSION_PATH/.summaries/*.md` | count files | +| `$SESSION_PATH/.review/dimensions/*.json` | count + findings summary (optional) | -// Add new feature to array -projectMeta.features.push(newFeature); +## Execution Flow -// Update statistics -projectMeta.statistics.total_features = projectMeta.features.length; -projectMeta.statistics.total_sessions += 1; -projectMeta.statistics.last_updated = currentTimestamp; +### Phase 1: Find Session (2 commands) -// Write back -Write('.workflow/project.json', JSON.stringify(projectMeta, null, 2)); +```bash +# 1. Find and extract session +SESSION_PATH=$(find .workflow/active/ -maxdepth 1 -name "WFS-*" -type d | head -1) +SESSION_ID=$(basename "$SESSION_PATH") + +# 2. 
Check/create archiving marker +test -f "$SESSION_PATH/.archiving" && echo "RESUMING" || touch "$SESSION_PATH/.archiving" ``` -**Helper Functions**: -```javascript -// Extract tags from IMPL_PLAN.md content -function extractTags(planContent) { - const tags = []; +**Output**: `SESSION_ID` = e.g., `WFS-auth-feature` - // Look for common keywords - const keywords = { - 'auth': /authentication|login|oauth|jwt/i, - 'security': /security|encrypt|hash|token/i, - 'api': /api|endpoint|rest|graphql/i, - 'ui': /component|page|interface|frontend/i, - 'database': /database|schema|migration|sql/i, - 'test': /test|testing|spec|coverage/i - }; +### Phase 2: Generate Manifest Entry (Read-only) - for (const [tag, pattern] of Object.entries(keywords)) { - if (pattern.test(planContent)) { - tags.push(tag); +Read the key files above, then build this structure: + +```json +{ + "session_id": "", + "description": "", + "archived_at": "", + "archive_path": ".workflow/archives/", + "metrics": { + "duration_hours": "<(completed_at - started_at) / 3600000>", + "tasks_completed": "", + "summaries_generated": "", + "review_metrics": { + "dimensions_analyzed": "", + "total_findings": "" } - } - - return tags.slice(0, 5); // Max 5 tags -} - -// Get latest git commit hash (optional) -function getLatestCommitHash() { - try { - const result = Bash({ - command: "git rev-parse --short HEAD 2>/dev/null", - description: "Get latest commit hash" - }); - return result.trim(); - } catch { - return ""; + }, + "tags": ["<3-5 keywords from IMPL_PLAN.md>"], + "lessons": { + "successes": [""], + "challenges": [""], + "watch_patterns": [""] } } ``` -#### Step 4.4: Output Confirmation +**Lessons Generation**: Use gemini with `~/.claude/workflows/cli-templates/prompts/archive/analysis-simple.txt` -``` -✓ Feature "${title}" added to project registry - ID: ${featureId} - Session: ${sessionId} - Location: .workflow/project.json +### Phase 3: Atomic Commit (4 commands) + +```bash +# 1. Create archive directory +mkdir -p .workflow/archives/ + +# 2. Move session +mv .workflow/active/$SESSION_ID .workflow/archives/$SESSION_ID + +# 3. Update manifest.json (Read → Append → Write) +# Read: .workflow/archives/manifest.json (or []) +# Append: archive_entry from Phase 2 +# Write: updated JSON + +# 4. 
Remove marker +rm -f .workflow/archives/$SESSION_ID/.archiving ``` -**Error Handling**: -- If project.json malformed: Output error, skip update -- If feature_metadata missing from agent result: Skip Phase 4 -- If extraction fails: Use minimal defaults +**Output**: +``` +✓ Session "$SESSION_ID" archived successfully + Location: .workflow/archives/$SESSION_ID/ + Manifest: Updated with N total sessions +``` -**Phase 4 Total Commands**: 1 bash read + JSON manipulation +### Phase 4: Update project.json (Optional) + +**Skip if**: `.workflow/project.json` doesn't exist + +```bash +# Check +test -f .workflow/project.json || echo "SKIP" +``` + +**If exists**, add feature entry: + +```json +{ + "id": "", + "title": "", + "status": "completed", + "tags": [""], + "timeline": { "implemented_at": "" }, + "traceability": { "session_id": "", "archive_path": "" } +} +``` + +**Output**: +``` +✓ Feature added to project registry +``` ## Error Recovery -### If Agent Fails (Phase 2) +| Phase | Symptom | Recovery | +|-------|---------|----------| +| 1 | No active session | `No active session found` | +| 2 | Analysis fails | Remove marker: `rm $SESSION_PATH/.archiving`, retry | +| 3 | Move fails | Session safe in active/, fix issue, retry | +| 3 | Manifest fails | Session in archives/, manually add entry, remove marker | -**Symptoms**: -- Agent returns `{"status": "error", ...}` -- Agent crashes or times out -- Analysis incomplete +## Quick Reference -**Recovery Steps**: -```bash -# Session still in .workflow/active/WFS-session-name -# Remove archiving marker -bash(rm .workflow/active/WFS-session-name/.archiving) ``` - -**User Notification**: +Phase 1: find session → create .archiving marker +Phase 2: read key files → build manifest entry (no writes) +Phase 3: mkdir → mv → update manifest.json → rm marker +Phase 4: update project.json features array (optional) ``` -ERROR: Session archival failed during analysis phase -Reason: [error message from agent] -Session remains active in: .workflow/active/WFS-session-name - -Recovery: -1. Fix any issues identified in error message -2. Retry: /workflow:session:complete - -Session state: SAFE (no changes committed) -``` - -### If Move Fails (Phase 3) - -**Symptoms**: -- `mv` command fails -- Permission denied -- Disk full - -**Recovery Steps**: -```bash -# Archiving marker still present -# Session still in .workflow/active/ (move failed) -# No manifest updated yet -``` - -**User Notification**: -``` -ERROR: Session archival failed during move operation -Reason: [mv error message] -Session remains in: .workflow/active/WFS-session-name - -Recovery: -1. Fix filesystem issues (permissions, disk space) -2. Retry: /workflow:session:complete - - System will detect .archiving marker - - Will resume from Phase 2 (agent analysis) - -Session state: SAFE (analysis complete, ready to retry move) -``` - -### If Manifest Update Fails (Phase 3) - -**Symptoms**: -- JSON parsing error -- Write permission denied -- Session moved but manifest not updated - -**Recovery Steps**: -```bash -# Session moved to .workflow/archives/WFS-session-name -# Manifest NOT updated -# Archiving marker still present in archived location -``` - -**User Notification**: -``` -ERROR: Session archived but manifest update failed -Reason: [error message] -Session location: .workflow/archives/WFS-session-name - -Recovery: -1. Fix manifest.json issues (syntax, permissions) -2. 
Manual manifest update: - - Add archive entry from agent output - - Remove .archiving marker: rm .workflow/archives/WFS-session-name/.archiving - -Session state: PARTIALLY COMPLETE (session archived, manifest needs update) -``` - -## Workflow Execution Strategy - -### Transactional Four-Phase Approach - -**Phase 1: Pre-Archival Preparation** (Marker creation) -- Find active session and extract name -- Check for existing `.archiving` marker (resume detection) -- Create `.archiving` marker if new -- **No data processing** - just state tracking -- **Total**: 2-3 bash commands (find + marker check/create) - -**Phase 2: Agent Analysis** (Read-only data processing) -- Extract all session data from active location -- Count tasks and summaries -- Extract review data if .review/ exists (dimension results, findings, fix results) -- Generate lessons learned analysis (including review-specific lessons if applicable) -- Extract feature metadata from IMPL_PLAN.md -- Build complete archive + feature metadata package (with review_metrics if applicable) -- **No file modifications** - pure analysis -- **Total**: 1 agent invocation - -**Phase 3: Atomic Commit** (Transactional file operations) -- Create archive directory -- Move session to archive location -- Update manifest.json with archive entry -- Remove `.archiving` marker -- **All-or-nothing**: Either all succeed or session remains in safe state -- **Total**: 4 bash commands + JSON manipulation - -**Phase 4: Project Registry Update** (Optional feature tracking) -- Check project.json exists -- Use feature metadata from Phase 2 agent result -- Build feature object with complete traceability -- Update project statistics -- **Independent**: Can fail without affecting archival -- **Total**: 1 bash read + JSON manipulation - -### Transactional Guarantees - -**State Consistency**: -- Session NEVER in inconsistent state -- `.archiving` marker enables safe resume -- Agent failure leaves session in recoverable state -- Move/manifest operations grouped in Phase 3 - -**Failure Isolation**: -- Phase 1 failure: No changes made -- Phase 2 failure: Session still active, can retry -- Phase 3 failure: Clear error state, manual recovery documented -- Phase 4 failure: Does not affect archival success - -**Resume Capability**: -- Detect interrupted archival via `.archiving` marker -- Resume from Phase 2 (skip marker creation) -- Idempotent operations (safe to retry) - -