refactor(ui-design): optimize workflow phases and token refinement strategy

**generate.md - Fix CSS variable naming mismatch**:
- Move token conversion to Phase 1.6 (before agent generation)
- Add Phase 1.7 to extract actual variable names from tokens.css
- Update Phase 2a agent prompt to read tokens.css directly
- Eliminate variable name guessing; the agent references the actual file
- Remove redundant token conversion from Phase 2b

Benefits:
- Eliminates CSS variable name mismatches
- Single source of truth (tokens.css)
- More reliable agent generation
- Easier debugging

**consolidate.md - Replace MCP research with philosophy-driven refinement**:
- Remove variant-specific MCP trend research (4 queries per variant)
- Use design_attributes from extraction phase as refinement rules
- Apply philosophy-driven token generation (no external calls)
- Preserve variant divergence through anti_keywords constraints
- Faster execution (~30-60s saved per variant)

Benefits:
- Better divergence preservation
- Faster workflow execution
- Pure AI-driven refinement
- Consistent with extraction phase philosophy

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Author: catlog22
Date: 2025-10-10 13:24:48 +08:00
Parent: e3f8283386
Commit: cb6e44efde
2 changed files with 159 additions and 132 deletions

**consolidate.md**

@@ -89,7 +89,7 @@ IF exists(design_space_path):
design_space_analysis = Read(design_space_path)
REPORT: "📊 Loaded design space analysis with {len(design_space_analysis.divergent_directions)} variant directions"
ELSE:
REPORT: "⚠️ No design space analysis found - trend research will be skipped"
REPORT: "⚠️ No design space analysis found - will refine tokens from proposed_tokens only"
```
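For orientation, each of the loaded divergent_directions is expected to expose the fields this phase references later (philosophy_name, design_attributes, search_keywords, anti_keywords). A hypothetical entry, with illustrative values only, might look like this:

```javascript
// Hypothetical shape of one divergent_directions entry. Only the field names
// are taken from the workflow text; every value here is invented for illustration.
const exampleDirection = {
  philosophy_name: "Quiet Brutalism",
  design_attributes: {
    saturation: "monochrome",       // color saturation
    visual_weight: "bold",
    formality: "formal",
    organic_geometric: "geometric",
    innovation: "experimental",
    density: "compact",
  },
  search_keywords: ["raw concrete textures", "monospaced grids"],
  anti_keywords: ["pastel", "rounded", "playful"],
};
```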
### Phase 4: Design System Synthesis (Agent Execution)
@@ -107,11 +107,11 @@ agent_task_prompt = """
CRITICAL: You MUST use Write() tool to create files. DO NOT return file contents as text.
## Task Summary
Generate {variants_count} independent design systems with MCP trend research and WRITE files directly.
Generate {variants_count} independent design systems using philosophy-driven refinement and WRITE files directly (NO MCP calls).
## Context
SESSION: {session_id}
MODE: Separate design system generation with MCP trend research
MODE: Separate design system generation with philosophy-driven refinement (NO MCP)
BASE_PATH: {base_path}
VARIANTS TO PROCESS: {variants_count}
@@ -128,80 +128,74 @@ VARIANTS TO PROCESS: {variants_count}
{IF design_context: DESIGN CONTEXT (from brainstorming): {design_context}}
{IF design_space_analysis:
## Design Space Analysis (for MCP Research)
## Design Space Analysis (for Philosophy-Driven Refinement)
{JSON.stringify(design_space_analysis, null, 2)}
Note: Each variant has search_keywords and anti_keywords for trend research.
Note: Each variant has design_attributes and anti_keywords for token refinement.
Use philosophy_name and design_attributes to guide token generation WITHOUT external research.
}
## Task
For EACH variant (1 to {variants_count}):
1. **Perform Variant-Specific Trend Research** (if design space analysis available)
2. **Refine design tokens** using research insights
1. **Load variant's design philosophy and attributes** (from design_space_analysis)
2. **Refine design tokens** using philosophy-driven strategy (NO external research)
3. **Generate and WRITE 2 files** to the file system
## Step 1: Variant-Specific Trend Research (MCP)
## Step 1: Load Design Philosophy (No MCP Calls)
IF design_space_analysis is provided, FOR EACH variant:
IF design_space_analysis is provided:
FOR EACH variant:
1. **Extract Design Direction**: Load philosophy_name, design_attributes, search_keywords, anti_keywords
2. **Use as Refinement Guide**: Apply philosophy and attributes to token generation
3. **Enforce Constraints**: Avoid characteristics listed in anti_keywords
4. **Maintain Divergence**: Ensure tokens differ from other variants based on attributes
1. **Extract Research Parameters** from design space analysis:
- philosophy_name: The variant's design philosophy
- search_keywords: Keywords for trend research
- anti_keywords: Patterns to avoid
ELSE:
Refine tokens based solely on variant's proposed_tokens and design_philosophy from style-cards.json
2. **Build Variant-Specific Queries**:
```javascript
queries = [
`{philosophy_name} UI design color palettes {search_keywords[:3]} 2024 2025`,
`{philosophy_name} typography trends {search_keywords[:3]} web design 2024`,
`{philosophy_name} layout patterns {search_keywords[:3]} design systems 2024`,
`design systems {philosophy_name} NOT {anti_keywords[:2]}`
]
```
## Philosophy-Driven Refinement Strategy
3. **Execute MCP Searches**:
```javascript
trend_research = {
colors: mcp__exa__web_search_exa(query=queries[0], numResults=5),
typography: mcp__exa__web_search_exa(query=queries[1], numResults=5),
layout: mcp__exa__web_search_exa(query=queries[2], numResults=5),
contrast: mcp__exa__web_search_exa(query=queries[3], numResults=5)
}
```
**Core Principles**:
- Use variant's design_attributes as primary guide (color saturation, visual weight, formality, organic/geometric, innovation, density)
- Apply anti_keywords as explicit constraints during token selection
- Ensure WCAG AA accessibility using built-in AI knowledge (4.5:1 text, 3:1 UI)
- Preserve maximum contrast between variants from extraction phase
4. **Shared Accessibility Research** (execute once, apply to all variants):
```javascript
accessibility_guidelines = mcp__exa__web_search_exa(
query="WCAG 2.2 accessibility color contrast ARIA best practices 2024",
numResults=3
)
```
5. **Use Research Results** to inform token refinement:
- Color token refinement guided by `trend_research.colors`
- Typography refinement guided by `trend_research.typography`
- Layout spacing informed by `trend_research.layout`
- Contrast validation using `trend_research.contrast` and `accessibility_guidelines`
IF design_space_analysis is NOT provided:
- Skip trend research
- Refine tokens based solely on variant's existing philosophy and proposed tokens
**Refinement Process** (Apply to each variant):
1. **Colors**: Generate palette based on saturation attribute
- "monochrome" → low chroma values (oklch L 0.00-0.02 H)
- "vibrant" → high chroma values (oklch L 0.20-0.30 H)
2. **Typography**: Select font families matching formality level
- "playful" → rounded, friendly fonts
- "luxury" → serif, elegant fonts
3. **Spacing**: Apply density attribute
- "spacious" → larger spacing scale (e.g., "4": "1.5rem")
- "compact" → smaller spacing scale (e.g., "4": "0.75rem")
4. **Shadows**: Match visual weight
- "minimal" → subtle shadows with low opacity
- "bold" → strong shadows with higher spread
5. **Border Radius**: Align with organic/geometric attribute
- "organic" → larger radius values (xl: "1.5rem")
- "brutalist" → minimal radius (xl: "0.125rem")
6. **Innovation**: Influence overall token adventurousness
- "timeless" → conservative, proven values
- "experimental" → unconventional token combinations
## Step 2: Refinement Rules (apply to each variant)
1. **Complete Token Coverage**: Ensure all categories present (colors, typography, spacing, etc.)
2. **Fill Gaps**: Generate missing tokens based on variant's philosophy and trend research
2. **Fill Gaps**: Generate missing tokens based on variant's philosophy and design_attributes
3. **Maintain Style Identity**: Preserve unique characteristics from proposed tokens
4. **Semantic Naming**: Use clear names (e.g., "brand-primary" not "color-1")
5. **Accessibility**: Validate WCAG AA contrast using accessibility guidelines (4.5:1 text, 3:1 UI)
5. **Accessibility**: Validate WCAG AA contrast using built-in AI knowledge (4.5:1 text, 3:1 UI)
6. **OKLCH Format**: All colors use oklch(L C H / A) format
7. **Design Philosophy**: Expand variant's design philosophy using trend insights
8. **Trend Integration**: Incorporate modern trends from MCP research while maintaining variant identity
7. **Design Philosophy**: Expand variant's design philosophy based on its attributes
8. **Divergence Preservation**: Apply anti_keywords to prevent convergence with other variants
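The 4.5:1 and 3:1 thresholds in rule 5 follow the standard WCAG relative-luminance formula. A minimal check is sketched below; it operates on sRGB triples, so OKLCH tokens would first need a color-space conversion, which is omitted here.

```javascript
// WCAG 2.x contrast ratio between two sRGB colors given as [r, g, b] in 0-255.
function relativeLuminance([r, g, b]) {
  const lin = (c) => {
    const s = c / 255;
    return s <= 0.04045 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

function contrastRatio(fg, bg) {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Usage: body text must reach 4.5:1, large text and UI components 3:1.
const ratio = contrastRatio([51, 51, 51], [255, 255, 255]); // ~12.6
const passesText = ratio >= 4.5;
const passesUI = ratio >= 3;
```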
## Step 3: FILE WRITE OPERATIONS (CRITICAL)
**EXECUTION MODEL**: For EACH variant (1 to {variants_count}):
1. Perform MCP research
2. Refine tokens
1. Load design philosophy and attributes
2. Refine tokens using philosophy-driven strategy
3. **IMMEDIATELY Write() files - DO NOT accumulate, DO NOT return as text**
### Required Write Operations Per Variant
@@ -259,9 +253,9 @@ For variant {variant_id}, execute these Write() operations:
### Execution Checklist (Per Variant)
For each variant from 1 to {variants_count}:
- [ ] Extract variant data and design space keywords
- [ ] Execute 4 MCP queries (colors, typography, layout, contrast)
- [ ] Refine tokens using research + proposed_tokens
- [ ] Extract variant's philosophy, design_attributes, and anti_keywords
- [ ] Apply philosophy-driven refinement strategy to proposed_tokens
- [ ] Generate complete token set following refinement rules
- [ ] **EXECUTE**: `Write("{base_path}/style-consolidation/style-{variant_id}/design-tokens.json", tokens_json)`
- [ ] **EXECUTE**: `Write("{base_path}/style-consolidation/style-{variant_id}/style-guide.md", guide_content)`
- [ ] Verify both files written successfully
@@ -284,7 +278,7 @@ After completing all {variants_count} variants, report:
- style-guide.md: 7.9 KB
... (for all variants)
Summary: {variants_count} design systems generated with {total_mcp_queries} MCP queries
Summary: {variants_count} design systems generated with philosophy-driven refinement (zero MCP calls)
```
## KEY REMINDERS (CRITICAL)
@@ -294,7 +288,8 @@ Summary: {variants_count} design systems generated with {total_mcp_queries} MCP
- Write files immediately after generating content for each variant
- Verify each Write() operation succeeds before proceeding
- Use exact paths provided: `{base_path}/style-consolidation/style-{variant_id}/...`
- Execute MCP research independently for each variant
- Apply philosophy-driven refinement strategy for each variant
- Maintain variant divergence using design_attributes and anti_keywords
- Report completion with file paths and sizes
**NEVER:**
@@ -302,7 +297,7 @@ Summary: {variants_count} design systems generated with {total_mcp_queries} MCP
- Accumulate all content and try to output at once
- Skip Write() operations and expect orchestrator to write files
- Use relative paths or modify provided paths
- Skip MCP research if design_space_analysis is provided
- Use external research or MCP calls (pure AI refinement only)
- Generate variant N+1 before completing variant N writes
"""
@@ -358,7 +353,7 @@ TodoWrite({todos: [
{content: "Load session and style cards", status: "completed", activeForm: "Loading style cards"},
{content: "Select variants for consolidation", status: "completed", activeForm: "Selecting variants"},
{content: "Load design context and space analysis", status: "completed", activeForm: "Loading context"},
{content: "Perform variant-specific trend research", status: "completed", activeForm: "Researching design trends"},
{content: "Apply philosophy-driven refinement", status: "completed", activeForm: "Refining design tokens"},
{content: "Generate design systems via agent", status: "completed", activeForm: "Generating design systems"},
{content: "Process agent results and write files", status: "completed", activeForm: "Writing output files"}
]});
@@ -369,10 +364,11 @@ TodoWrite({todos: [
✅ Design system consolidation complete for session: {session_id}
{IF design_space_analysis:
🔍 Trend Research Performed:
- {variants_count} × 4 variant-specific MCP queries ({variants_count * 4} total)
- 1 shared accessibility research query
- Each variant refined with independent trend guidance
🎨 Philosophy-Driven Refinement:
- {variants_count} design systems generated from AI-analyzed philosophies
- Zero MCP calls (pure AI token refinement)
- Divergence preserved from extraction phase design_attributes
- Each variant maintains unique style identity via anti_keywords
}
Generated {variants_count} independent design systems:
@@ -428,22 +424,22 @@ Layout planning is now handled in the generate phase for each specific target.
## Key Features
1. **Variant-Specific Trend Research** 🆕 - Agent performs independent MCP queries for each variant (4 queries per variant); Uses design space analysis keywords from extraction phase; Each variant researches its specific design philosophy; Shared accessibility research applied to all variants; Eliminates convergence by maintaining variant-specific research
2. **Agent-Driven Architecture** - Uses ui-design-agent for multi-file generation and MCP research; Parallel generation of N independent design systems with external trend integration; Structured output parsing with labeled sections; Agent handles both research and synthesis
3. **Separate Design Systems (Matrix-Ready)** - Generates N independent design systems (one per variant); Each variant maintains unique style identity enhanced by trend research; Provides style foundation for style × layout matrix exploration in generate phase
4. **Token Refinement with Trend Integration** 🆕 - Reads `proposed_tokens` from style cards; Loads design space analysis for research parameters; Agent performs MCP trend research per variant; Refines tokens using research insights while maintaining style identity
5. **Complete Design System Output** - design-tokens.json (CSS tokens per variant); style-guide.md (documentation per variant with trend insights)
6. **Production-Ready Quality** - WCAG AA accessibility validation with MCP research; OKLCH color format for perceptual uniformity; Semantic token naming; Complete token coverage; Modern trends integration
7. **Streamlined Workflow** - Sequential phases with clear responsibilities; Agent handles MCP research, token refinement, and file generation; Reproducible with deterministic structure; Context-aware (integrates brainstorming and design space analysis)
8. **Clear Separation of Concerns** - Consolidation focuses on style systems with trend research; Extraction focuses on Claude-native analysis; Layout planning delegated to generate phase for target-specific optimization
1. **Philosophy-Driven Refinement** - Pure AI token refinement based on design_space_analysis from extraction phase; Uses variant-specific philosophies and design_attributes as refinement rules; Preserves maximum contrast without external trend pollution; Zero MCP calls = faster execution + better divergence preservation
2. **Agent-Driven Architecture** - Uses ui-design-agent for multi-file generation; Processes N variants with philosophy-guided synthesis; Structured output with deterministic token generation; Agent applies design attributes directly to token values
3. **Separate Design Systems (Matrix-Ready)** - Generates N independent design systems (one per variant); Each variant maintains unique style identity from extraction phase; Provides style foundation for style × layout matrix exploration in generate phase
4. **Token Refinement with AI Guidance** - Reads `proposed_tokens` from style cards; Loads design_space_analysis for philosophy and attributes; Applies attributes to token generation (saturation → chroma, density → spacing, etc.); Refines tokens while maintaining variant divergence through anti_keywords
5. **Complete Design System Output** - design-tokens.json (CSS tokens per variant); style-guide.md (documentation per variant with philosophy explanation)
6. **Production-Ready Quality** - WCAG AA accessibility validation using built-in AI knowledge (4.5:1 text, 3:1 UI); OKLCH color format for perceptual uniformity; Semantic token naming; Complete token coverage
7. **Streamlined Workflow** - Sequential phases with clear responsibilities; Agent handles philosophy-driven token refinement and file generation; Reproducible with deterministic structure; Context-aware (integrates brainstorming and design space analysis); ~30-60s faster without MCP overhead
8. **Divergence Preservation** - Strictly follows design_space_analysis constraints from extraction; Applies anti_keywords to prevent variant convergence; Maintains maximum variant contrast through attribute-driven generation; No external research = pure philosophical consistency
## Integration Points
- **Input**:
- `style-cards.json` from `/workflow:ui-design:extract` (with `proposed_tokens`)
- `design-space-analysis.json` from extraction phase (with search keywords for MCP research)
- `design-space-analysis.json` from extraction phase (with philosophy and design_attributes)
- `--variants` parameter (default: all variants)
- **Output**: Style Systems: `style-consolidation/style-{n}/design-tokens.json` and `style-guide.md` for each variant (enhanced with trend research)
- **Output**: Style Systems: `style-consolidation/style-{n}/design-tokens.json` and `style-guide.md` for each variant (refined with philosophy-driven approach)
- **Context**: Optional `synthesis-specification.md` or `ui-designer/analysis.md`
- **Auto Integration**: Automatically triggered by `/workflow:ui-design:explore-auto` workflow
- **Next Command**: `/workflow:ui-design:generate --style-variants N --targets "..." --layout-variants M` performs target-specific layout planning

**generate.md**

@@ -211,33 +211,63 @@ FOR target IN target_list:
REPORT: f"✅ Phase 1.5 complete: Verified {len(target_list) × layout_variants} target-specific layout files"
```
### Phase 1.6: Token Variable Name Extraction
### Phase 1.6: Convert Design Tokens to CSS
```bash
REPORT: "📋 Extracting design token variable names..."
tokens_json_path = "{base_path}/style-consolidation/style-1/design-tokens.json"
VERIFY: exists(tokens_json_path), "Design tokens not found. Run /workflow:ui-design:consolidate first."
REPORT: "🎨 Converting design tokens to CSS variables..."
design_tokens = Read(tokens_json_path)
# Check for jq dependency
IF NOT command_exists("jq"):
ERROR: "jq is not installed or not in PATH. The conversion script requires jq."
REPORT: "Please install jq: macOS: brew install jq | Linux: apt-get install jq | Windows: https://stedolan.github.io/jq/download/"
EXIT 1
# Extract all available token categories and variable names
token_reference = {
"colors": {"brand": list(keys), "surface": list(keys), "semantic": list(keys), "text": list(keys), "border": list(keys)},
"typography": {"font_family": list(keys), "font_size": list(keys), "font_weight": list(keys), "line_height": list(keys), "letter_spacing": list(keys)},
"spacing": list(keys), "border_radius": list(keys), "shadows": list(keys), "breakpoints": list(keys)
}
# Convert design tokens to CSS for each style variant
FOR style_id IN range(1, style_variants + 1):
tokens_json_path = "{base_path}/style-consolidation/style-${style_id}/design-tokens.json"
tokens_css_path = "{base_path}/style-consolidation/style-${style_id}/tokens.css"
script_path = "~/.claude/scripts/convert_tokens_to_css.sh"
# Generate complete variable name lists for Agent prompt
color_vars = []; FOR category, keys: FOR key: color_vars.append(f"--color-{category}-{key}")
typography_vars = []; FOR category, keys: FOR key: typography_vars.append(f"--{category.replace('_', '-')}-{key}")
spacing_vars = [f"--spacing-{key}" for key in token_reference.spacing]
radius_vars = [f"--border-radius-{key}" for key in token_reference.border_radius]
shadow_vars = [f"--shadow-{key}" for key in token_reference.shadows]
breakpoint_vars = [f"--breakpoint-{key}" for key in token_reference.breakpoints]
# Verify input file exists
VERIFY: exists(tokens_json_path), f"Design tokens not found for style-{style_id}"
all_token_vars = color_vars + typography_vars + spacing_vars + radius_vars + shadow_vars + breakpoint_vars
# Execute conversion: cat input.json | script.sh > output.css
Bash(cat "${tokens_json_path}" | "${script_path}" > "${tokens_css_path}")
REPORT: f"✅ Extracted {len(all_token_vars)} design token variables from design-tokens.json"
# Verify output was generated
IF exit_code == 0 AND exists(tokens_css_path):
file_size = get_file_size(tokens_css_path)
REPORT: f" ✓ Generated tokens.css for style-{style_id} ({file_size} KB)"
ELSE:
ERROR: f"Failed to generate tokens.css for style-{style_id}"
EXIT 1
REPORT: f"✅ Phase 1.6 complete: Converted {style_variants} design token files to CSS"
```
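For readers unfamiliar with the conversion, `convert_tokens_to_css.sh` essentially flattens the nested token JSON into `:root`-scoped custom properties. The JavaScript below is a rough, assumed equivalent, not the script's actual implementation; in particular, the exact name mapping (e.g. `colors.brand.primary` becoming `--color-brand-primary`) may differ in detail, which is exactly why Phase 1.7 reads the generated file instead of guessing names.

```javascript
// Assumed behavior of the jq-based conversion: flatten nested token keys into
// custom properties under :root. Naming details are a guess, not the real output.
const fs = require("fs");

function flatten(obj, prefix = []) {
  return Object.entries(obj).flatMap(([key, value]) =>
    typeof value === "object" && value !== null
      ? flatten(value, [...prefix, key])
      : [`  --${[...prefix, key].join("-").replace(/_/g, "-")}: ${value};`]
  );
}

const tokens = JSON.parse(fs.readFileSync("design-tokens.json", "utf8"));
fs.writeFileSync("tokens.css", `:root {\n${flatten(tokens).join("\n")}\n}\n`);
```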
### Phase 1.7: Extract Token Variable Names from CSS
```bash
REPORT: "📋 Extracting actual CSS variable names from tokens.css..."
tokens_css_path = "{base_path}/style-consolidation/style-1/tokens.css"
VERIFY: exists(tokens_css_path), "tokens.css not found. Phase 1.6 may have failed."
tokens_css_content = Read(tokens_css_path)
# Extract all CSS variable names from the generated file
# Pattern: --variable-name: value;
all_token_vars = extract_css_variables(tokens_css_content) # Regex: r'--([a-z0-9-_]+):'
# Categorize variables for better Agent understanding
color_vars = [v for v in all_token_vars if v.startswith('--color-')]
typography_vars = [v for v in all_token_vars if v.startswith(('--font-', '--line-height-', '--letter-spacing-'))]
spacing_vars = [v for v in all_token_vars if v.startswith('--spacing-')]
radius_vars = [v for v in all_token_vars if v.startswith('--border-radius-')]
shadow_vars = [v for v in all_token_vars if v.startswith('--shadow-')]
breakpoint_vars = [v for v in all_token_vars if v.startswith('--breakpoint-')]
REPORT: f"✅ Extracted {len(all_token_vars)} actual CSS variables from tokens.css"
REPORT: f" Colors: {len(color_vars)} | Typography: {len(typography_vars)} | Spacing: {len(spacing_vars)}"
```
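A minimal sketch of the `extract_css_variables` step above, using the same regex noted in the pseudocode:

```javascript
// Collect the custom property names actually declared in tokens.css.
const fs = require("fs");

function extractCssVariables(css) {
  const names = new Set();
  for (const match of css.matchAll(/--([a-z0-9_-]+):/g)) {
    names.add(`--${match[1]}`);
  }
  return [...names];
}

// Path shown relative to the session's base_path, as in Phase 1.7.
const tokensCss = fs.readFileSync("style-consolidation/style-1/tokens.css", "utf8");
const allTokenVars = extractCssVariables(tokensCss);
const colorVars = allTokenVars.filter((v) => v.startsWith("--color-"));
const spacingVars = allTokenVars.filter((v) => v.startsWith("--spacing-"));
console.log(`Extracted ${allTokenVars.length} variables (${colorVars.length} colors, ${spacingVars.length} spacing)`);
```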
### Phase 2: Optimized Matrix UI Generation
@@ -290,16 +320,36 @@ FOR layout_id IN range(1, layout_variants + 1):
<link rel=\"stylesheet\" href=\"{{STRUCTURAL_CSS}}\">
<link rel=\"stylesheet\" href=\"{{TOKEN_CSS}}\">
## CSS Requirements
- Token-driven: ALL values use var() (zero hardcoded values)
## CSS Requirements - TOKEN REFERENCE (CRITICAL)
**STEP 1: Read the actual tokens.css file FIRST**
Read(\"{base_path}/style-consolidation/style-1/tokens.css\")
**STEP 2: Extract ALL CSS variable names**
- Pattern: Lines starting with \" --\" inside \":root {}\"
- Example: \" --color-brand-primary: oklch(...)\" → use \"--color-brand-primary\"
**STEP 3: Use ONLY variables that exist in tokens.css**
- ✅ DO: Copy exact variable names from tokens.css
- ✅ DO: Use var(--exact-name-from-file)
- ❌ DON'T: Invent or guess variable names
- ❌ DON'T: Use hardcoded values (colors, fonts, spacing)
**Available Token Categories** (extracted from actual file):
- Colors: {', '.join(color_vars[:5])}... ({len(color_vars)} total)
- Typography: {', '.join(typography_vars[:5])}... ({len(typography_vars)} total)
- Spacing: {', '.join(spacing_vars[:5])}... ({len(spacing_vars)} total)
- Radius: {', '.join(radius_vars[:3])}... ({len(radius_vars)} total)
- Shadows: {', '.join(shadow_vars)}
**Example Workflow**:
1. Read tokens.css → find \"--color-brand-primary: oklch(0.55 0.12 45);\"
2. Use in CSS → \"background: var(--color-brand-primary);\"
**CSS Rules**:
- Token-driven: ALL stylistic values use var() (zero hardcoded values)
- Mobile-first responsive design
- Available tokens ({len(all_token_vars)} variables):
* Colors: --color-brand-primary, --color-surface-background, --color-text-primary, ...
* Typography: --font-family-heading, --font-size-base, --font-weight-bold, ...
* Spacing: --spacing-0 through --spacing-24
* Radius: --border-radius-none, --border-radius-sm, ..., --border-radius-full
* Shadows: --shadow-sm, --shadow-md, --shadow-lg, --shadow-xl
* Breakpoints: --breakpoint-sm, --breakpoint-md, --breakpoint-lg
- Structural layout only (Flexbox, Grid, positioning)
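The "use only variables that exist in tokens.css" rule lends itself to a mechanical check. The sketch below is not part of the workflow, just an illustration of how a generated stylesheet could be validated against the declared token names:

```javascript
// Return var() references in generated CSS that are not declared in tokens.css.
function undeclaredVariables(generatedCss, tokensCss) {
  const declared = new Set(
    [...tokensCss.matchAll(/--([a-z0-9_-]+):/g)].map((m) => `--${m[1]}`)
  );
  const referenced = [...generatedCss.matchAll(/var\(\s*(--[a-z0-9_-]+)/g)].map((m) => m[1]);
  return [...new Set(referenced)].filter((name) => !declared.has(name));
}
// An empty result means every var() in the layout CSS resolves to a real token.
```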
## Write Operations
Write(\"{base_path}/prototypes/_templates/{target}-layout-{layout_id}.html\", html_content)
@@ -355,35 +405,16 @@ REPORT: "✅ Phase 2a.5 complete: Verified {layout_variants * len(target_list) *
```bash
REPORT: "🚀 Phase 2b: Instantiating prototypes from templates..."
# Step 1: Convert design tokens to CSS for each style
REPORT: " Converting design tokens to CSS variables..."
# Check for jq dependency
IF NOT command_exists("jq"):
ERROR: "jq is not installed or not in PATH. The conversion script requires jq."
REPORT: "Please install jq: macOS: brew install jq | Linux: apt-get install jq | Windows: https://stedolan.github.io/jq/download/"
EXIT 1
# Convert design tokens to CSS for each style variant
# Verify tokens.css files exist (should be created in Phase 1.6)
FOR style_id IN range(1, style_variants + 1):
tokens_json_path = "{base_path}/style-consolidation/style-${style_id}/design-tokens.json"
tokens_css_path = "{base_path}/style-consolidation/style-${style_id}/tokens.css"
script_path = "~/.claude/scripts/convert_tokens_to_css.sh"
VERIFY: exists(tokens_css_path), f"tokens.css missing for style-{style_id}. Phase 1.6 may have failed."
# Verify input file exists
IF NOT exists(tokens_json_path): REPORT: " ✗ ERROR: Input file not found"; CONTINUE
REPORT: " Verified {style_variants} tokens.css files exist"
# Execute conversion: cat input.json | script.sh > output.css
Bash(cat "${tokens_json_path}" | "${script_path}" > "${tokens_css_path}")
# Verify output was generated
IF exit_code == 0 AND exists(tokens_css_path):
REPORT: " ✓ Generated tokens.css for style-${style_id}"
ELSE:
REPORT: " ✗ ERROR: Failed to generate tokens.css for style-${style_id}"
# Step 2: Use ui-instantiate-prototypes.sh script for instantiation
prototypes_dir = "{base_path}/prototypes"; targets_csv = ','.join(target_list)
# Use ui-instantiate-prototypes.sh script for instantiation
prototypes_dir = "{base_path}/prototypes"
targets_csv = ','.join(target_list)
session_id = --session provided ? {session_id} : "standalone"
# Execute instantiation script with target type
@@ -475,9 +506,9 @@ Run `/workflow:ui-design:update` once all issues are resolved.
TodoWrite({todos: [
{content: "Resolve paths and load design systems", status: "completed", activeForm: "Loading design systems"},
{content: `Plan ${target_list.length}×${layout_variants} target-specific layouts`, status: "completed", activeForm: "Planning layouts"},
{content: "Extract design token variable names", status: "completed", activeForm: "Extracting token variables"},
{content: `Generate ${layout_variants}×${target_list.length} layout templates using target-specific plans`, status: "completed", activeForm: "Generating templates"},
{content: "Convert design tokens to CSS variables", status: "completed", activeForm: "Converting tokens"},
{content: `Convert ${style_variants} design token files to CSS`, status: "completed", activeForm: "Converting tokens to CSS"},
{content: "Extract CSS variable names from tokens.css", status: "completed", activeForm: "Extracting variable names"},
{content: `Generate ${layout_variants}×${target_list.length} layout templates (agent reads tokens.css)`, status: "completed", activeForm: "Generating templates"},
{content: `Instantiate ${style_variants}×${layout_variants}×${target_list.length} prototypes using script`, status: "completed", activeForm: "Running script"},
{content: "Verify preview files generation", status: "completed", activeForm: "Verifying files"}
]});