Mirror of https://github.com/cexll/myclaude.git (synced 2026-02-11 03:23:50 +08:00)
Compare commits
10 Commits
- 0ceb819419
- 4d69c8aef1
- eec844d850
- 1f42bcc1c6
- 0f359b048f
- 4e2df6a80e
- a30f434b5d
- 41f4e21268
- a67aa00c9a
- d61a0f9ffd
.gitattributes (vendored, new file, 22 lines)
@@ -0,0 +1,22 @@
+# Ensure shell scripts always use LF line endings on all platforms
+*.sh text eol=lf
+
+# Ensure Python files use LF line endings
+*.py text eol=lf
+
+# Auto-detect text files and normalize line endings to LF
+* text=auto eol=lf
+
+# Explicitly declare files that should always be treated as binary
+*.exe binary
+*.png binary
+*.jpg binary
+*.jpeg binary
+*.gif binary
+*.ico binary
+*.mov binary
+*.mp4 binary
+*.mp3 binary
+*.zip binary
+*.gz binary
+*.tar binary
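The effect of rules like these can be spot-checked with `git check-attr`. A minimal sketch, assuming a throwaway repository and the hypothetical file names `script.sh` and `logo.png` (the files need not exist for the query to resolve):

```shell
# Create a scratch repo carrying a subset of the attribute rules above
tmp=$(mktemp -d)
cd "$tmp"
git init -q
printf '*.sh text eol=lf\n*.png binary\n' > .gitattributes

# Ask git how it would treat matching paths
git check-attr text eol -- script.sh
# script.sh: text: set
# script.sh: eol: lf
git check-attr binary -- logo.png
# logo.png: binary: set
```

Note that `binary` is a built-in macro attribute (it expands to `-diff -merge -text`), which is why `check-attr` reports it simply as `set`.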
README.md (156 changed lines)
@@ -7,7 +7,7 @@
 [](https://www.gnu.org/licenses/agpl-3.0)
 [](https://claude.ai/code)
 [](https://github.com/cexll/myclaude)
 
 > AI-powered development automation with multi-backend execution (Codex/Claude/Gemini)
 
@@ -132,6 +132,59 @@ Requirements → Architecture → Sprint Plan → Development → Review → QA
 
 ---
 
+## Version Requirements
+
+### Codex CLI
+**Minimum version:** Check compatibility with your installation
+
+The codeagent-wrapper uses these Codex CLI features:
+- `codex e` - Execute commands (shorthand for `codex exec`)
+- `--skip-git-repo-check` - Skip git repository validation
+- `--json` - JSON stream output format
+- `-C <workdir>` - Set working directory
+- `resume <session_id>` - Resume previous sessions
+
+**Verify Codex CLI is installed:**
+```bash
+which codex
+codex --version
+```
+
+### Claude CLI
+**Minimum version:** Check compatibility with your installation
+
+Required features:
+- `--output-format stream-json` - Streaming JSON output format
+- `--setting-sources` - Control setting sources (prevents infinite recursion)
+- `--dangerously-skip-permissions` - Skip permission prompts (use with caution)
+- `-p` - Prompt input flag
+- `-r <session_id>` - Resume sessions
+
+**Security Note:** The wrapper only adds `--dangerously-skip-permissions` for Claude when explicitly enabled (e.g. `--skip-permissions` / `CODEAGENT_SKIP_PERMISSIONS=true`). Keep it disabled unless you understand the risk.
+
+**Verify Claude CLI is installed:**
+```bash
+which claude
+claude --version
+```
+
+### Gemini CLI
+**Minimum version:** Check compatibility with your installation
+
+Required features:
+- `-o stream-json` - JSON stream output format
+- `-y` - Auto-approve prompts (non-interactive mode)
+- `-r <session_id>` - Resume sessions
+- `-p` - Prompt input flag
+
+**Verify Gemini CLI is installed:**
+```bash
+which gemini
+gemini --version
+```
+
+---
+
 ## Installation
 
 ### Modular Installation (Recommended)
@@ -163,15 +216,39 @@ python3 install.py --force
 
 ```
 ~/.claude/
-├── CLAUDE.md                  # Core instructions and role definition
-├── commands/                  # Slash commands (/dev, /code, etc.)
-├── agents/                    # Agent definitions
+├── bin/
+│   └── codeagent-wrapper      # Main executable
+├── CLAUDE.md                  # Core instructions and role definition
+├── commands/                  # Slash commands (/dev, /code, etc.)
+├── agents/                    # Agent definitions
 ├── skills/
 │   └── codex/
 │       └── SKILL.md           # Codex integration skill
-└── installed_modules.json     # Installation status
+├── config.json                # Configuration
+└── installed_modules.json     # Installation status
 ```
 
+### Customizing Installation Directory
+
+By default, myclaude installs to `~/.claude`. You can customize this using the `INSTALL_DIR` environment variable:
+
+```bash
+# Install to custom directory
+INSTALL_DIR=/opt/myclaude bash install.sh
+
+# Update your PATH accordingly
+export PATH="/opt/myclaude/bin:$PATH"
+```
+
+**Directory Structure:**
+- `$INSTALL_DIR/bin/` - codeagent-wrapper binary
+- `$INSTALL_DIR/skills/` - Skill definitions
+- `$INSTALL_DIR/config.json` - Configuration file
+- `$INSTALL_DIR/commands/` - Slash command definitions
+- `$INSTALL_DIR/agents/` - Agent definitions
+
+**Note:** When using a custom installation directory, ensure that `$INSTALL_DIR/bin` is added to your `PATH` environment variable.
+
 ### Configuration
 
 Edit `config.json` to customize:
@@ -295,7 +372,7 @@ setx PATH "%USERPROFILE%\bin;%PATH%"
 **Codex wrapper not found:**
 ```bash
 # Check PATH
-echo $PATH | grep -q "$HOME/bin" || echo 'export PATH="$HOME/bin:$PATH"' >> ~/.zshrc
+echo $PATH | grep -q "$HOME/.claude/bin" || echo 'export PATH="$HOME/.claude/bin:$PATH"' >> ~/.zshrc
 
 # Reinstall
 bash install.sh
@@ -315,6 +392,71 @@ cat ~/.claude/installed_modules.json
 python3 install.py --module dev --force
 ```
 
+### Version Compatibility Issues
+
+**Backend CLI not found:**
+```bash
+# Check if backend CLIs are installed
+which codex
+which claude
+which gemini
+
+# Install missing backends
+# Codex: Follow installation instructions at https://codex.docs
+# Claude: Follow installation instructions at https://claude.ai/docs
+# Gemini: Follow installation instructions at https://ai.google.dev/docs
+```
+
+**Unsupported CLI flags:**
+```bash
+# If you see errors like "unknown flag" or "invalid option"
+
+# Check backend CLI version
+codex --version
+claude --version
+gemini --version
+
+# For Codex: Ensure it supports `e`, `--skip-git-repo-check`, `--json`, `-C`, and `resume`
+# For Claude: Ensure it supports `--output-format stream-json`, `--setting-sources`, `-r`
+# For Gemini: Ensure it supports `-o stream-json`, `-y`, `-r`, `-p`
+
+# Update your backend CLI to the latest version if needed
+```
+
+**JSON parsing errors:**
+```bash
+# If you see "failed to parse JSON output" errors
+
+# Verify the backend outputs stream-json format
+codex e --json "test task"                    # Should output newline-delimited JSON
+claude --output-format stream-json -p "test"  # Should output stream JSON
+
+# If not, your backend CLI version may be too old or incompatible
+```
+
+**Infinite recursion with Claude backend:**
+```bash
+# The wrapper prevents this with `--setting-sources ""` flag
+# If you still see recursion, ensure your Claude CLI supports this flag
+claude --help | grep "setting-sources"
+
+# If flag is not supported, upgrade Claude CLI
+```
+
+**Session resume failures:**
+```bash
+# Check if session ID is valid
+codex history   # List recent sessions
+claude history
+
+# Ensure backend CLI supports session resumption
+codex resume <session_id> "test"   # Should continue from previous session
+claude -r <session_id> "test"
+
+# If not supported, use new sessions instead of resume mode
+```
+
 ---
 
 ## Documentation
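The stream-json troubleshooting above boils down to: every line of backend output must parse as a standalone JSON document. A minimal sketch of that check, with hypothetical sample records standing in for captured backend output:

```shell
# Hypothetical captured backend output: newline-delimited JSON (NDJSON)
printf '%s\n' '{"type":"message","text":"hi"}' '{"type":"result","ok":true}' > /tmp/backend.ndjson

# Validate that every non-blank line parses independently (requires python3)
python3 - /tmp/backend.ndjson <<'EOF'
import json, sys
with open(sys.argv[1]) as f:
    for lineno, line in enumerate(f, 1):
        line = line.strip()
        if not line:
            continue  # blank lines between records are tolerated
        json.loads(line)  # raises on the first malformed line
print("all lines valid NDJSON")
EOF
```

If this fails on real output, the backend is likely emitting a single pretty-printed JSON object (or plain text) rather than a stream of one-record-per-line documents.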
README_CN.md (38 changed lines)
@@ -2,7 +2,7 @@
 [](https://www.gnu.org/licenses/agpl-3.0)
 [](https://claude.ai/code)
 [](https://github.com/cexll/myclaude)
 
 > AI-powered development automation - multi-backend execution architecture (Codex/Claude/Gemini)
 
@@ -152,15 +152,39 @@ python3 install.py --force
 
 ```
 ~/.claude/
-├── CLAUDE.md                  # Core instructions and role definition
-├── commands/                  # Slash commands (/dev, /code, etc.)
-├── agents/                    # Agent definitions
+├── bin/
+│   └── codeagent-wrapper      # Main executable
+├── CLAUDE.md                  # Core instructions and role definition
+├── commands/                  # Slash commands (/dev, /code, etc.)
+├── agents/                    # Agent definitions
 ├── skills/
 │   └── codex/
 │       └── SKILL.md           # Codex integration skill
-└── installed_modules.json     # Installation status
+├── config.json                # Configuration file
+└── installed_modules.json     # Installation status
 ```
 
+### Customizing the Installation Directory
+
+By default, myclaude installs to `~/.claude`. You can customize the installation directory with the `INSTALL_DIR` environment variable:
+
+```bash
+# Install to a custom directory
+INSTALL_DIR=/opt/myclaude bash install.sh
+
+# Update your PATH accordingly
+export PATH="/opt/myclaude/bin:$PATH"
+```
+
+**Directory structure:**
+- `$INSTALL_DIR/bin/` - codeagent-wrapper executable
+- `$INSTALL_DIR/skills/` - Skill definitions
+- `$INSTALL_DIR/config.json` - Configuration file
+- `$INSTALL_DIR/commands/` - Slash command definitions
+- `$INSTALL_DIR/agents/` - Agent definitions
+
+**Note:** When using a custom installation directory, make sure `$INSTALL_DIR/bin` is added to your `PATH` environment variable.
+
 ### Configuration
 
 Edit `config.json` to customize:
@@ -284,7 +308,7 @@ setx PATH "%USERPROFILE%\bin;%PATH%"
 **Codex wrapper not found:**
 ```bash
 # Check PATH
-echo $PATH | grep -q "$HOME/bin" || echo 'export PATH="$HOME/bin:$PATH"' >> ~/.zshrc
+echo $PATH | grep -q "$HOME/.claude/bin" || echo 'export PATH="$HOME/.claude/bin:$PATH"' >> ~/.zshrc
 
 # Reinstall
 bash install.sh
@@ -427,6 +427,10 @@ Generate architecture document at `./.claude/specs/{feature_name}/02-system-arch
 
 ## Important Behaviors
 
+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, REST, GraphQL, JWT, RBAC, etc.) in English; translate explanatory text only.
+
 ### DO:
 - Start by reviewing and referencing the PRD
 - Present initial architecture based on requirements
@@ -419,6 +419,10 @@ logger.info('User created', {
 
 ## Important Implementation Rules
 
+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, CRUD, JWT, SQL, etc.) in English; translate explanatory text only.
+
 ### DO:
 - Follow architecture specifications exactly
 - Implement all acceptance criteria from PRD
@@ -22,6 +22,10 @@ You are the BMAD Orchestrator. Your core focus is repository analysis, workflow
 - Consistency: ensure conventions and patterns discovered in scan are preserved downstream
 - Explicit handoffs: clearly document assumptions, risks, and integration points for other agents
 
+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, PRD, Sprint, etc.) in English; translate explanatory text only.
+
 ## UltraThink Repository Scan
 
 When asked to analyze the repository, follow this structure and return a clear, actionable summary.
@@ -313,6 +313,10 @@ Generate PRD at `./.claude/specs/{feature_name}/01-product-requirements.md`:
 
 ## Important Behaviors
 
+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, Sprint, PRD, KPI, MVP, etc.) in English; translate explanatory text only.
+
 ### DO:
 - Start immediately with greeting and initial understanding
 - Show quality scores transparently
@@ -478,6 +478,10 @@ module.exports = {
 
 ## Important Testing Rules
 
+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, E2E, CI/CD, Mock, etc.) in English; translate explanatory text only.
+
 ### DO:
 - Test all acceptance criteria from PRD
 - Cover happy path, edge cases, and error scenarios
@@ -45,3 +45,7 @@ You are an independent code review agent responsible for conducting reviews betw
 - Focus on actionable findings
 - Provide specific QA guidance
 - Use clear, parseable output format
+
+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, PRD, Sprint, etc.) in English; translate explanatory text only.
@@ -351,6 +351,10 @@ So that [benefit]
 
 ## Important Behaviors
 
+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (Sprint, Epic, Story, Backlog, Velocity, etc.) in English; translate explanatory text only.
+
 ### DO:
 - Read both PRD and Architecture documents thoroughly
 - Create comprehensive task breakdown
@@ -1,5 +1,11 @@
 package main
 
+import (
+	"encoding/json"
+	"os"
+	"path/filepath"
+)
+
 // Backend defines the contract for invoking different AI CLI backends.
 // Each backend is responsible for supplying the executable command and
 // building the argument list based on the wrapper config.
@@ -26,15 +32,62 @@ func (ClaudeBackend) Command() string {
 	return "claude"
 }
 func (ClaudeBackend) BuildArgs(cfg *Config, targetArg string) []string {
+	return buildClaudeArgs(cfg, targetArg)
+}
+
+const maxClaudeSettingsBytes = 1 << 20 // 1MB
+
+// loadMinimalEnvSettings extracts only the env configuration from ~/.claude/setting.json.
+// Only string values are accepted; a missing file, a parse failure, or an oversized file returns nil.
+func loadMinimalEnvSettings() map[string]string {
+	home, err := os.UserHomeDir()
+	if err != nil || home == "" {
+		return nil
+	}
+
+	settingPath := filepath.Join(home, ".claude", "setting.json")
+	info, err := os.Stat(settingPath)
+	if err != nil || info.Size() > maxClaudeSettingsBytes {
+		return nil
+	}
+
+	data, err := os.ReadFile(settingPath)
+	if err != nil {
+		return nil
+	}
+
+	var cfg struct {
+		Env map[string]any `json:"env"`
+	}
+	if err := json.Unmarshal(data, &cfg); err != nil {
+		return nil
+	}
+	if len(cfg.Env) == 0 {
+		return nil
+	}
+
+	env := make(map[string]string, len(cfg.Env))
+	for k, v := range cfg.Env {
+		s, ok := v.(string)
+		if !ok {
+			continue
+		}
+		env[k] = s
+	}
+	if len(env) == 0 {
+		return nil
+	}
+	return env
+}
+
+func buildClaudeArgs(cfg *Config, targetArg string) []string {
 	if cfg == nil {
 		return nil
 	}
-	args := []string{"-p", "--dangerously-skip-permissions"}
-	// Only skip permissions when explicitly requested
-	// if cfg.SkipPermissions {
-	// 	args = append(args, "--dangerously-skip-permissions")
-	// }
+	args := []string{"-p"}
+	if cfg.SkipPermissions {
+		// Only skip permissions when explicitly requested
+		args = append(args, "--dangerously-skip-permissions")
+	}
 
 	// Prevent infinite recursion: disable all setting sources (user, project, local)
 	// This ensures a clean execution environment without CLAUDE.md or skills that would trigger codeagent
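For reference, this is the shape of `~/.claude/setting.json` that `loadMinimalEnvSettings` accepts. A hypothetical example (the key names and values are illustrative, not from the repository); only string values under `env` survive, so `RETRY_COUNT` below would be silently dropped:

```json
{
  "env": {
    "ANTHROPIC_API_KEY": "sk-example-not-real",
    "HTTP_PROXY": "http://127.0.0.1:7890",
    "RETRY_COUNT": 123
  }
}
```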
@@ -60,6 +113,10 @@ func (GeminiBackend) Command() string {
 	return "gemini"
 }
 func (GeminiBackend) BuildArgs(cfg *Config, targetArg string) []string {
+	return buildGeminiArgs(cfg, targetArg)
+}
+
+func buildGeminiArgs(cfg *Config, targetArg string) []string {
 	if cfg == nil {
 		return nil
 	}
@@ -1,6 +1,9 @@
 package main
 
 import (
+	"bytes"
+	"os"
+	"path/filepath"
 	"reflect"
 	"testing"
 )
@@ -8,16 +11,16 @@ import (
 func TestClaudeBuildArgs_ModesAndPermissions(t *testing.T) {
 	backend := ClaudeBackend{}
 
-	t.Run("new mode uses workdir without skip by default", func(t *testing.T) {
+	t.Run("new mode omits skip-permissions by default", func(t *testing.T) {
 		cfg := &Config{Mode: "new", WorkDir: "/repo"}
 		got := backend.BuildArgs(cfg, "todo")
-		want := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "todo"}
+		want := []string{"-p", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "todo"}
 		if !reflect.DeepEqual(got, want) {
 			t.Fatalf("got %v, want %v", got, want)
 		}
 	})
 
-	t.Run("new mode opt-in skip permissions with default workdir", func(t *testing.T) {
+	t.Run("new mode can opt-in skip-permissions", func(t *testing.T) {
 		cfg := &Config{Mode: "new", SkipPermissions: true}
 		got := backend.BuildArgs(cfg, "-")
 		want := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "-"}
@@ -26,10 +29,10 @@ func TestClaudeBuildArgs_ModesAndPermissions(t *testing.T) {
 		}
 	})
 
-	t.Run("resume mode uses session id and omits workdir", func(t *testing.T) {
+	t.Run("resume mode includes session id", func(t *testing.T) {
 		cfg := &Config{Mode: "resume", SessionID: "sid-123", WorkDir: "/ignored"}
 		got := backend.BuildArgs(cfg, "resume-task")
-		want := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "-r", "sid-123", "--output-format", "stream-json", "--verbose", "resume-task"}
+		want := []string{"-p", "--setting-sources", "", "-r", "sid-123", "--output-format", "stream-json", "--verbose", "resume-task"}
 		if !reflect.DeepEqual(got, want) {
 			t.Fatalf("got %v, want %v", got, want)
 		}
@@ -38,7 +41,16 @@ func TestClaudeBuildArgs_ModesAndPermissions(t *testing.T) {
 	t.Run("resume mode without session still returns base flags", func(t *testing.T) {
 		cfg := &Config{Mode: "resume", WorkDir: "/ignored"}
 		got := backend.BuildArgs(cfg, "follow-up")
-		want := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "follow-up"}
+		want := []string{"-p", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "follow-up"}
+		if !reflect.DeepEqual(got, want) {
+			t.Fatalf("got %v, want %v", got, want)
+		}
+	})
+
+	t.Run("resume mode can opt-in skip permissions", func(t *testing.T) {
+		cfg := &Config{Mode: "resume", SessionID: "sid-123", SkipPermissions: true}
+		got := backend.BuildArgs(cfg, "resume-task")
+		want := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "-r", "sid-123", "--output-format", "stream-json", "--verbose", "resume-task"}
 		if !reflect.DeepEqual(got, want) {
 			t.Fatalf("got %v, want %v", got, want)
 		}
@@ -89,7 +101,11 @@ func TestClaudeBuildArgs_GeminiAndCodexModes(t *testing.T) {
 		}
 	})
 
-	t.Run("codex build args passthrough remains intact", func(t *testing.T) {
+	t.Run("codex build args omits bypass flag by default", func(t *testing.T) {
+		const key = "CODEX_BYPASS_SANDBOX"
+		t.Cleanup(func() { os.Unsetenv(key) })
+		os.Unsetenv(key)
+
 		backend := CodexBackend{}
 		cfg := &Config{Mode: "new", WorkDir: "/tmp"}
 		got := backend.BuildArgs(cfg, "task")
@@ -98,6 +114,20 @@ func TestClaudeBuildArgs_GeminiAndCodexModes(t *testing.T) {
 			t.Fatalf("got %v, want %v", got, want)
 		}
 	})
+
+	t.Run("codex build args includes bypass flag when enabled", func(t *testing.T) {
+		const key = "CODEX_BYPASS_SANDBOX"
+		t.Cleanup(func() { os.Unsetenv(key) })
+		os.Setenv(key, "true")
+
+		backend := CodexBackend{}
+		cfg := &Config{Mode: "new", WorkDir: "/tmp"}
+		got := backend.BuildArgs(cfg, "task")
+		want := []string{"e", "--dangerously-bypass-approvals-and-sandbox", "--skip-git-repo-check", "-C", "/tmp", "--json", "task"}
+		if !reflect.DeepEqual(got, want) {
+			t.Fatalf("got %v, want %v", got, want)
+		}
+	})
 }
 
 func TestClaudeBuildArgs_BackendMetadata(t *testing.T) {
@@ -120,3 +150,64 @@ func TestClaudeBuildArgs_BackendMetadata(t *testing.T) {
 		}
 	}
 }
+
+func TestLoadMinimalEnvSettings(t *testing.T) {
+	home := t.TempDir()
+	t.Setenv("HOME", home)
+	t.Setenv("USERPROFILE", home)
+
+	t.Run("missing file returns empty", func(t *testing.T) {
+		if got := loadMinimalEnvSettings(); len(got) != 0 {
+			t.Fatalf("got %v, want empty", got)
+		}
+	})
+
+	t.Run("valid env returns string map", func(t *testing.T) {
+		dir := filepath.Join(home, ".claude")
+		if err := os.MkdirAll(dir, 0o755); err != nil {
+			t.Fatalf("MkdirAll: %v", err)
+		}
+		path := filepath.Join(dir, "setting.json")
+		data := []byte(`{"env":{"ANTHROPIC_API_KEY":"secret","FOO":"bar"}}`)
+		if err := os.WriteFile(path, data, 0o600); err != nil {
+			t.Fatalf("WriteFile: %v", err)
+		}
+
+		got := loadMinimalEnvSettings()
+		if got["ANTHROPIC_API_KEY"] != "secret" || got["FOO"] != "bar" {
+			t.Fatalf("got %v, want keys present", got)
+		}
+	})
+
+	t.Run("non-string values are ignored", func(t *testing.T) {
+		dir := filepath.Join(home, ".claude")
+		path := filepath.Join(dir, "setting.json")
+		data := []byte(`{"env":{"GOOD":"ok","BAD":123,"ALSO_BAD":true}}`)
+		if err := os.WriteFile(path, data, 0o600); err != nil {
+			t.Fatalf("WriteFile: %v", err)
+		}
+
+		got := loadMinimalEnvSettings()
+		if got["GOOD"] != "ok" {
+			t.Fatalf("got %v, want GOOD=ok", got)
+		}
+		if _, ok := got["BAD"]; ok {
+			t.Fatalf("got %v, want BAD omitted", got)
+		}
+		if _, ok := got["ALSO_BAD"]; ok {
+			t.Fatalf("got %v, want ALSO_BAD omitted", got)
+		}
+	})
+
+	t.Run("oversized file returns empty", func(t *testing.T) {
+		dir := filepath.Join(home, ".claude")
+		path := filepath.Join(dir, "setting.json")
+		data := bytes.Repeat([]byte("a"), maxClaudeSettingsBytes+1)
+		if err := os.WriteFile(path, data, 0o600); err != nil {
+			t.Fatalf("WriteFile: %v", err)
+		}
+		if got := loadMinimalEnvSettings(); len(got) != 0 {
+			t.Fatalf("got %v, want empty", got)
+		}
+	})
+}
@@ -13,6 +13,16 @@ import (
 	"time"
 )
 
+func stripTimestampPrefix(line string) string {
+	if !strings.HasPrefix(line, "[") {
+		return line
+	}
+	if idx := strings.Index(line, "] "); idx >= 0 {
+		return line[idx+2:]
+	}
+	return line
+}
+
 // TestConcurrentStressLogger is a high-concurrency stress test.
 func TestConcurrentStressLogger(t *testing.T) {
 	if testing.Short() {
@@ -79,7 +89,8 @@ func TestConcurrentStressLogger(t *testing.T) {
 	// Verify log format (plain text, no prefix)
 	formatRE := regexp.MustCompile(`^goroutine-\d+-msg-\d+$`)
 	for i, line := range lines[:min(10, len(lines))] {
-		if !formatRE.MatchString(line) {
+		msg := stripTimestampPrefix(line)
+		if !formatRE.MatchString(msg) {
 			t.Errorf("line %d has invalid format: %s", i, line)
 		}
 	}
@@ -291,7 +302,7 @@ func TestLoggerOrderPreservation(t *testing.T) {
 	sequences := make(map[int][]int) // goroutine ID -> sequence numbers
 
 	for scanner.Scan() {
-		line := scanner.Text()
+		line := stripTimestampPrefix(scanner.Text())
 		var gid, seq int
 		// Parse format: G0-SEQ0001 (without INFO: prefix)
 		_, err := fmt.Sscanf(line, "G%d-SEQ%04d", &gid, &seq)
@@ -164,6 +164,9 @@ func parseParallelConfig(data []byte) (*ParallelConfig, error) {
 	if content == "" {
 		return nil, fmt.Errorf("task block #%d (%q) missing content", taskIndex, task.ID)
 	}
+	if task.Mode == "resume" && strings.TrimSpace(task.SessionID) == "" {
+		return nil, fmt.Errorf("task block #%d (%q) has empty session_id", taskIndex, task.ID)
+	}
 	if _, exists := seen[task.ID]; exists {
 		return nil, fmt.Errorf("task block #%d has duplicate id: %s", taskIndex, task.ID)
 	}
@@ -232,7 +235,10 @@ func parseArgs() (*Config, error) {
 			return nil, fmt.Errorf("resume mode requires: resume <session_id> <task>")
 		}
 		cfg.Mode = "resume"
-		cfg.SessionID = args[1]
+		cfg.SessionID = strings.TrimSpace(args[1])
+		if cfg.SessionID == "" {
+			return nil, fmt.Errorf("resume mode requires non-empty session_id")
+		}
 		cfg.Task = args[2]
 		cfg.ExplicitStdin = (args[2] == "-")
 		if len(args) > 3 {
@@ -16,6 +16,8 @@ import (
 	"time"
 )

+const postMessageTerminateDelay = 1 * time.Second
+
 // commandRunner abstracts exec.Cmd for testability
 type commandRunner interface {
 	Start() error
@@ -24,6 +26,7 @@ type commandRunner interface {
 	StdinPipe() (io.WriteCloser, error)
 	SetStderr(io.Writer)
 	SetDir(string)
+	SetEnv(env map[string]string)
 	Process() processHandle
 }

@@ -79,6 +82,52 @@ func (r *realCmd) SetDir(dir string) {
 	}
 }

+func (r *realCmd) SetEnv(env map[string]string) {
+	if r == nil || r.cmd == nil || len(env) == 0 {
+		return
+	}
+
+	merged := make(map[string]string, len(env)+len(os.Environ()))
+	for _, kv := range os.Environ() {
+		if kv == "" {
+			continue
+		}
+		idx := strings.IndexByte(kv, '=')
+		if idx <= 0 {
+			continue
+		}
+		merged[kv[:idx]] = kv[idx+1:]
+	}
+	for _, kv := range r.cmd.Env {
+		if kv == "" {
+			continue
+		}
+		idx := strings.IndexByte(kv, '=')
+		if idx <= 0 {
+			continue
+		}
+		merged[kv[:idx]] = kv[idx+1:]
+	}
+	for k, v := range env {
+		if strings.TrimSpace(k) == "" {
+			continue
+		}
+		merged[k] = v
+	}
+
+	keys := make([]string, 0, len(merged))
+	for k := range merged {
+		keys = append(keys, k)
+	}
+	sort.Strings(keys)
+
+	out := make([]string, 0, len(keys))
+	for _, k := range keys {
+		out = append(out, k+"="+merged[k])
+	}
+	r.cmd.Env = out
+}
+
 func (r *realCmd) Process() processHandle {
 	if r == nil || r.cmd == nil || r.cmd.Process == nil {
 		return nil
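The `SetEnv` hunk above merges three sources — the process environment, any pairs already on `cmd.Env`, and the caller's overrides — with later sources winning, then emits a sorted `KEY=VALUE` slice. A standalone sketch of that precedence (the `mergeEnv` helper name is illustrative, not from the patch):

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// mergeEnv mirrors the patch's precedence: base KEY=VALUE pairs
// (os.Environ-style), then existing cmd.Env pairs, then explicit
// overrides; the result is sorted for deterministic output.
func mergeEnv(base, cmdEnv []string, overrides map[string]string) []string {
	merged := make(map[string]string)
	put := func(pairs []string) {
		for _, kv := range pairs {
			if idx := strings.IndexByte(kv, '='); idx > 0 {
				merged[kv[:idx]] = kv[idx+1:]
			}
		}
	}
	put(base)
	put(cmdEnv)
	for k, v := range overrides {
		if strings.TrimSpace(k) != "" {
			merged[k] = v
		}
	}
	keys := make([]string, 0, len(merged))
	for k := range merged {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	out := make([]string, 0, len(keys))
	for _, k := range keys {
		out = append(out, k+"="+merged[k])
	}
	return out
}

func main() {
	env := mergeEnv(
		[]string{"PATH=/usr/bin", "HOME=/root"},
		[]string{"PATH=/opt/bin"},
		map[string]string{"API_KEY": "secret"},
	)
	fmt.Println(env) // overrides win over cmd.Env, which wins over base
}
```

Sorting the keys is what makes the resulting `cmd.Env` reproducible across runs, which matters for tests that assert on the spawned process's environment.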
@@ -507,23 +556,43 @@ func generateFinalOutput(results []TaskResult) string {
 }

 func buildCodexArgs(cfg *Config, targetArg string) []string {
-	if cfg.Mode == "resume" {
-		return []string{
-			"e",
-			"--skip-git-repo-check",
-			"--json",
-			"resume",
-			cfg.SessionID,
-			targetArg,
-		}
-	}
-	return []string{
-		"e",
-		"--skip-git-repo-check",
+	if cfg == nil {
+		panic("buildCodexArgs: nil config")
+	}
+
+	var resumeSessionID string
+	isResume := cfg.Mode == "resume"
+	if isResume {
+		resumeSessionID = strings.TrimSpace(cfg.SessionID)
+		if resumeSessionID == "" {
+			logError("invalid config: resume mode requires non-empty session_id")
+			isResume = false
+		}
+	}
+
+	args := []string{"e"}
+
+	if envFlagEnabled("CODEX_BYPASS_SANDBOX") {
+		logWarn("CODEX_BYPASS_SANDBOX=true: running without approval/sandbox protection")
+		args = append(args, "--dangerously-bypass-approvals-and-sandbox")
+	}
+
+	args = append(args, "--skip-git-repo-check")
+
+	if isResume {
+		return append(args,
+			"--json",
+			"resume",
+			resumeSessionID,
+			targetArg,
+		)
+	}
+
+	return append(args,
 		"-C", cfg.WorkDir,
 		"--json",
 		targetArg,
-	}
+	)
 }

 func runCodexTask(taskSpec TaskSpec, silent bool, timeoutSec int) TaskResult {
@@ -574,6 +643,12 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 		cfg.WorkDir = defaultWorkdir
 	}

+	if cfg.Mode == "resume" && strings.TrimSpace(cfg.SessionID) == "" {
+		result.ExitCode = 1
+		result.Error = "resume mode requires non-empty session_id"
+		return result
+	}
+
 	useStdin := taskSpec.UseStdin
 	targetArg := taskSpec.Task
 	if useStdin {
@@ -673,6 +748,12 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe

 	cmd := newCommandRunner(ctx, commandName, codexArgs...)

+	if cfg.Backend == "claude" {
+		if env := loadMinimalEnvSettings(); len(env) > 0 {
+			cmd.SetEnv(env)
+		}
+	}
+
 	// For backends that don't support -C flag (claude, gemini), set working directory via cmd.Dir
 	// Codex passes workdir via -C flag, so we skip setting Dir for it to avoid conflicts
 	if cfg.Mode != "resume" && commandName != "codex" && cfg.WorkDir != "" {
@@ -683,8 +764,17 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 	if stderrLogger != nil {
 		stderrWriters = append(stderrWriters, stderrLogger)
 	}
+
+	// For gemini backend, filter noisy stderr output
+	var stderrFilter *filteringWriter
 	if !silent {
-		stderrWriters = append([]io.Writer{os.Stderr}, stderrWriters...)
+		stderrOut := io.Writer(os.Stderr)
+		if cfg.Backend == "gemini" {
+			stderrFilter = newFilteringWriter(os.Stderr, geminiNoisePatterns)
+			stderrOut = stderrFilter
+			defer stderrFilter.Flush()
+		}
+		stderrWriters = append([]io.Writer{stderrOut}, stderrWriters...)
 	}
 	if len(stderrWriters) == 1 {
 		cmd.SetStderr(stderrWriters[0])
@@ -720,6 +810,7 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 	// Start parse goroutine BEFORE starting the command to avoid race condition
 	// where fast-completing commands close stdout before parser starts reading
 	messageSeen := make(chan struct{}, 1)
+	completeSeen := make(chan struct{}, 1)
 	parseCh := make(chan parseResult, 1)
 	go func() {
 		msg, tid := parseJSONStreamInternal(stdoutReader, logWarnFn, logInfoFn, func() {
@@ -727,7 +818,16 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 			case messageSeen <- struct{}{}:
 			default:
 			}
+		}, func() {
+			select {
+			case completeSeen <- struct{}{}:
+			default:
+			}
 		})
+		select {
+		case completeSeen <- struct{}{}:
+		default:
+		}
 		parseCh <- parseResult{message: msg, threadID: tid}
 	}()

@@ -764,17 +864,63 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 	waitCh := make(chan error, 1)
 	go func() { waitCh <- cmd.Wait() }()

-	var waitErr error
-	var forceKillTimer *forceKillTimer
-	var ctxCancelled bool
-
-	select {
-	case waitErr = <-waitCh:
-	case <-ctx.Done():
-		ctxCancelled = true
-		logErrorFn(cancelReason(commandName, ctx))
-		forceKillTimer = terminateCommandFn(cmd)
-		waitErr = <-waitCh
+	var (
+		waitErr              error
+		forceKillTimer       *forceKillTimer
+		ctxCancelled         bool
+		messageTimer         *time.Timer
+		messageTimerCh       <-chan time.Time
+		forcedAfterComplete  bool
+		terminated           bool
+		messageSeenObserved  bool
+		completeSeenObserved bool
+	)
+
+waitLoop:
+	for {
+		select {
+		case waitErr = <-waitCh:
+			break waitLoop
+		case <-ctx.Done():
+			ctxCancelled = true
+			logErrorFn(cancelReason(commandName, ctx))
+			if !terminated {
+				if timer := terminateCommandFn(cmd); timer != nil {
+					forceKillTimer = timer
+					terminated = true
+				}
+			}
+			waitErr = <-waitCh
+			break waitLoop
+		case <-messageTimerCh:
+			forcedAfterComplete = true
+			messageTimerCh = nil
+			if !terminated {
+				logWarnFn(fmt.Sprintf("%s output parsed; terminating lingering backend", commandName))
+				if timer := terminateCommandFn(cmd); timer != nil {
+					forceKillTimer = timer
+					terminated = true
+				}
+			}
+		case <-completeSeen:
+			completeSeenObserved = true
+			if messageTimer != nil {
+				continue
+			}
+			messageTimer = time.NewTimer(postMessageTerminateDelay)
+			messageTimerCh = messageTimer.C
+		case <-messageSeen:
+			messageSeenObserved = true
+		}
+	}
+
+	if messageTimer != nil {
+		if !messageTimer.Stop() {
+			select {
+			case <-messageTimer.C:
+			default:
+			}
+		}
 	}

 	if forceKillTimer != nil {
@@ -782,10 +928,14 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 	}

 	var parsed parseResult
-	if ctxCancelled {
+	switch {
+	case ctxCancelled:
 		closeWithReason(stdout, stdoutCloseReasonCtx)
 		parsed = <-parseCh
-	} else {
+	case messageSeenObserved || completeSeenObserved:
+		closeWithReason(stdout, stdoutCloseReasonWait)
+		parsed = <-parseCh
+	default:
 		drainTimer := time.NewTimer(stdoutDrainTimeout)
 		defer drainTimer.Stop()

@@ -793,6 +943,11 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 		case parsed = <-parseCh:
 			closeWithReason(stdout, stdoutCloseReasonWait)
 		case <-messageSeen:
+			messageSeenObserved = true
+			closeWithReason(stdout, stdoutCloseReasonWait)
+			parsed = <-parseCh
+		case <-completeSeen:
+			completeSeenObserved = true
 			closeWithReason(stdout, stdoutCloseReasonWait)
 			parsed = <-parseCh
 		case <-drainTimer.C:
@@ -813,17 +968,21 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 	}

 	if waitErr != nil {
-		if exitErr, ok := waitErr.(*exec.ExitError); ok {
-			code := exitErr.ExitCode()
-			logErrorFn(fmt.Sprintf("%s exited with status %d", commandName, code))
-			result.ExitCode = code
-			result.Error = attachStderr(fmt.Sprintf("%s exited with status %d", commandName, code))
+		if forcedAfterComplete && parsed.message != "" {
+			logWarnFn(fmt.Sprintf("%s terminated after delivering output", commandName))
+		} else {
+			if exitErr, ok := waitErr.(*exec.ExitError); ok {
+				code := exitErr.ExitCode()
+				logErrorFn(fmt.Sprintf("%s exited with status %d", commandName, code))
+				result.ExitCode = code
+				result.Error = attachStderr(fmt.Sprintf("%s exited with status %d", commandName, code))
+				return result
+			}
+			logErrorFn(commandName + " error: " + waitErr.Error())
+			result.ExitCode = 1
+			result.Error = attachStderr(commandName + " error: " + waitErr.Error())
 			return result
 		}
-		logErrorFn(commandName + " error: " + waitErr.Error())
-		result.ExitCode = 1
-		result.Error = attachStderr(commandName + " error: " + waitErr.Error())
-		return result
 	}

 	message := parsed.message
@@ -10,6 +10,7 @@ import (
 	"os"
 	"os/exec"
 	"path/filepath"
+	"slices"
 	"strings"
 	"sync"
 	"sync/atomic"
@@ -86,6 +87,7 @@ type execFakeRunner struct {
 	process processHandle
 	stdin   io.WriteCloser
 	dir     string
+	env     map[string]string
 	waitErr   error
 	waitDelay time.Duration
 	startErr  error
@@ -128,6 +130,17 @@ func (f *execFakeRunner) StdinPipe() (io.WriteCloser, error) {
 }
 func (f *execFakeRunner) SetStderr(io.Writer) {}
 func (f *execFakeRunner) SetDir(dir string)   { f.dir = dir }
+func (f *execFakeRunner) SetEnv(env map[string]string) {
+	if len(env) == 0 {
+		return
+	}
+	if f.env == nil {
+		f.env = make(map[string]string, len(env))
+	}
+	for k, v := range env {
+		f.env[k] = v
+	}
+}
 func (f *execFakeRunner) Process() processHandle {
 	if f.process != nil {
 		return f.process
@@ -244,6 +257,10 @@ func TestExecutorHelperCoverage(t *testing.T) {
 	})

 	t.Run("generateFinalOutputAndArgs", func(t *testing.T) {
+		const key = "CODEX_BYPASS_SANDBOX"
+		t.Cleanup(func() { os.Unsetenv(key) })
+		os.Unsetenv(key)
+
 		out := generateFinalOutput([]TaskResult{
 			{TaskID: "ok", ExitCode: 0},
 			{TaskID: "fail", ExitCode: 1, Error: "boom"},
@@ -257,11 +274,11 @@ func TestExecutorHelperCoverage(t *testing.T) {
 		}

 		args := buildCodexArgs(&Config{Mode: "new", WorkDir: "/tmp"}, "task")
-		if len(args) == 0 || args[3] != "/tmp" {
+		if !slices.Equal(args, []string{"e", "--skip-git-repo-check", "-C", "/tmp", "--json", "task"}) {
 			t.Fatalf("unexpected codex args: %+v", args)
 		}
 		args = buildCodexArgs(&Config{Mode: "resume", SessionID: "sess"}, "target")
-		if args[3] != "resume" || args[4] != "sess" {
+		if !slices.Equal(args, []string{"e", "--skip-git-repo-check", "--json", "resume", "sess", "target"}) {
 			t.Fatalf("unexpected resume args: %+v", args)
 		}
 	})
@@ -298,6 +315,18 @@ func TestExecutorRunCodexTaskWithContext(t *testing.T) {
 	origRunner := newCommandRunner
 	defer func() { newCommandRunner = origRunner }()

+	t.Run("resumeMissingSessionID", func(t *testing.T) {
+		newCommandRunner = func(ctx context.Context, name string, args ...string) commandRunner {
+			t.Fatalf("unexpected command execution for invalid resume config")
+			return nil
+		}
+
+		res := runCodexTaskWithContext(context.Background(), TaskSpec{Task: "payload", WorkDir: ".", Mode: "resume"}, nil, nil, false, false, 1)
+		if res.ExitCode == 0 || !strings.Contains(res.Error, "session_id") {
+			t.Fatalf("expected validation error, got %+v", res)
+		}
+	})
+
 	t.Run("success", func(t *testing.T) {
 		var firstStdout *reasonReadCloser
 		newCommandRunner = func(ctx context.Context, name string, args ...string) commandRunner {
codeagent-wrapper/filter.go (new file, 66 lines)
@@ -0,0 +1,66 @@
+package main
+
+import (
+	"bytes"
+	"io"
+	"strings"
+)
+
+// geminiNoisePatterns contains stderr patterns to filter for gemini backend
+var geminiNoisePatterns = []string{
+	"[STARTUP]",
+	"Session cleanup disabled",
+	"Warning:",
+	"(node:",
+	"(Use `node --trace-warnings",
+	"Loaded cached credentials",
+	"Loading extension:",
+	"YOLO mode is enabled",
+}
+
+// filteringWriter wraps an io.Writer and filters out lines matching patterns
+type filteringWriter struct {
+	w        io.Writer
+	patterns []string
+	buf      bytes.Buffer
+}
+
+func newFilteringWriter(w io.Writer, patterns []string) *filteringWriter {
+	return &filteringWriter{w: w, patterns: patterns}
+}
+
+func (f *filteringWriter) Write(p []byte) (n int, err error) {
+	f.buf.Write(p)
+	for {
+		line, err := f.buf.ReadString('\n')
+		if err != nil {
+			// incomplete line, put it back
+			f.buf.WriteString(line)
+			break
+		}
+		if !f.shouldFilter(line) {
+			f.w.Write([]byte(line))
+		}
+	}
+	return len(p), nil
+}
+
+func (f *filteringWriter) shouldFilter(line string) bool {
+	for _, pattern := range f.patterns {
+		if strings.Contains(line, pattern) {
+			return true
+		}
+	}
+	return false
+}
+
+// Flush writes any remaining buffered content
+func (f *filteringWriter) Flush() {
+	if f.buf.Len() > 0 {
+		remaining := f.buf.String()
+		if !f.shouldFilter(remaining) {
+			f.w.Write([]byte(remaining))
+		}
+		f.buf.Reset()
+	}
+}
codeagent-wrapper/filter_test.go (new file, 73 lines)
@@ -0,0 +1,73 @@
+package main
+
+import (
+	"bytes"
+	"testing"
+)
+
+func TestFilteringWriter(t *testing.T) {
+	tests := []struct {
+		name     string
+		patterns []string
+		input    string
+		want     string
+	}{
+		{
+			name:     "filter STARTUP lines",
+			patterns: geminiNoisePatterns,
+			input:    "[STARTUP] Recording metric\nHello World\n[STARTUP] Another line\n",
+			want:     "Hello World\n",
+		},
+		{
+			name:     "filter Warning lines",
+			patterns: geminiNoisePatterns,
+			input:    "Warning: something bad\nActual output\n",
+			want:     "Actual output\n",
+		},
+		{
+			name:     "filter multiple patterns",
+			patterns: geminiNoisePatterns,
+			input:    "YOLO mode is enabled\nSession cleanup disabled\nReal content\nLoading extension: foo\n",
+			want:     "Real content\n",
+		},
+		{
+			name:     "no filtering needed",
+			patterns: geminiNoisePatterns,
+			input:    "Line 1\nLine 2\nLine 3\n",
+			want:     "Line 1\nLine 2\nLine 3\n",
+		},
+		{
+			name:     "empty input",
+			patterns: geminiNoisePatterns,
+			input:    "",
+			want:     "",
+		},
+	}
+
+	for _, tt := range tests {
+		t.Run(tt.name, func(t *testing.T) {
+			var buf bytes.Buffer
+			fw := newFilteringWriter(&buf, tt.patterns)
+			fw.Write([]byte(tt.input))
+			fw.Flush()
+
+			if got := buf.String(); got != tt.want {
+				t.Errorf("got %q, want %q", got, tt.want)
+			}
+		})
+	}
+}
+
+func TestFilteringWriterPartialLines(t *testing.T) {
+	var buf bytes.Buffer
+	fw := newFilteringWriter(&buf, geminiNoisePatterns)
+
+	// Write partial line
+	fw.Write([]byte("Hello "))
+	fw.Write([]byte("World\n"))
+	fw.Flush()
+
+	if got := buf.String(); got != "Hello World\n" {
+		t.Errorf("got %q, want %q", got, "Hello World\n")
+	}
+}
@@ -366,7 +366,8 @@ func (l *Logger) run() {
 	defer ticker.Stop()

 	writeEntry := func(entry logEntry) {
-		fmt.Fprintf(l.writer, "%s\n", entry.msg)
+		timestamp := time.Now().Format("2006-01-02 15:04:05.000")
+		fmt.Fprintf(l.writer, "[%s] %s\n", timestamp, entry.msg)

 		// Cache error/warn entries in memory for fast extraction
 		if entry.isError {
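The new `[timestamp]` prefix written by `writeEntry` above is exactly what the test helper `stripTimestampPrefix` (whose body is truncated at the top of this diff) has to undo before the logger tests can match raw messages. A plausible standalone reconstruction of such a helper (illustrative only; the real implementation is not fully shown in this diff):

```go
package main

import (
	"fmt"
	"strings"
)

// stripTimestampPrefix removes a leading "[YYYY-MM-DD HH:MM:SS.mmm] "
// prefix if one is present, returning the line unchanged otherwise.
func stripTimestampPrefix(line string) string {
	if strings.HasPrefix(line, "[") {
		if end := strings.Index(line, "] "); end != -1 {
			return line[end+2:]
		}
	}
	return line
}

func main() {
	fmt.Println(stripTimestampPrefix("[2025-01-02 03:04:05.678] goroutine-1-msg-2"))
	fmt.Println(stripTimestampPrefix("no prefix here"))
}
```

Keeping the prefix strictly `[...] ` delimited means the helper can stay format-agnostic: tests keep asserting on the bare message while the logger output gains millisecond timestamps.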
|
|||||||
@@ -14,9 +14,9 @@ import (
|
|||||||
)
|
)
|
||||||
|
|
||||||
const (
|
const (
|
||||||
version = "5.2.5"
|
version = "5.2.7"
|
||||||
defaultWorkdir = "."
|
defaultWorkdir = "."
|
||||||
defaultTimeout = 7200 // seconds
|
defaultTimeout = 7200 // seconds (2 hours)
|
||||||
codexLogLineLimit = 1000
|
codexLogLineLimit = 1000
|
||||||
stdinSpecialChars = "\n\\\"'`$"
|
stdinSpecialChars = "\n\\\"'`$"
|
||||||
stderrCaptureLimit = 4 * 1024
|
stderrCaptureLimit = 4 * 1024
|
||||||
|
|||||||
@@ -255,6 +259,10 @@ func (d *drainBlockingCmd) SetDir(dir string) {
 	d.inner.SetDir(dir)
 }

+func (d *drainBlockingCmd) SetEnv(env map[string]string) {
+	d.inner.SetEnv(env)
+}
+
 func (d *drainBlockingCmd) Process() processHandle {
 	return d.inner.Process()
 }
@@ -387,6 +391,8 @@ type fakeCmd struct {

 	stderr io.Writer

+	env map[string]string
+
 	waitDelay time.Duration
 	waitErr   error
 	startErr  error
@@ -511,6 +517,20 @@ func (f *fakeCmd) SetStderr(w io.Writer) {

 func (f *fakeCmd) SetDir(string) {}

+func (f *fakeCmd) SetEnv(env map[string]string) {
+	if len(env) == 0 {
+		return
+	}
+	f.mu.Lock()
+	defer f.mu.Unlock()
+	if f.env == nil {
+		f.env = make(map[string]string, len(env))
+	}
+	for k, v := range env {
+		f.env[k] = v
+	}
+}
+
 func (f *fakeCmd) Process() processHandle {
 	if f == nil {
 		return nil
@@ -879,6 +899,79 @@ func TestRunCodexTask_ContextTimeout(t *testing.T) {
 	}
 }

+func TestRunCodexTask_ForcesStopAfterCompletion(t *testing.T) {
+	defer resetTestHooks()
+	forceKillDelay.Store(0)
+
+	fake := newFakeCmd(fakeCmdConfig{
+		StdoutPlan: []fakeStdoutEvent{
+			{Data: `{"type":"item.completed","item":{"type":"agent_message","text":"done"}}` + "\n"},
+			{Data: `{"type":"thread.completed","thread_id":"tid"}` + "\n"},
+		},
+		KeepStdoutOpen:      true,
+		BlockWait:           true,
+		ReleaseWaitOnSignal: true,
+		ReleaseWaitOnKill:   true,
+	})
+
+	newCommandRunner = func(ctx context.Context, name string, args ...string) commandRunner {
+		return fake
+	}
+	buildCodexArgsFn = func(cfg *Config, targetArg string) []string { return []string{targetArg} }
+	codexCommand = "fake-cmd"
+
+	start := time.Now()
+	result := runCodexTaskWithContext(context.Background(), TaskSpec{Task: "done", WorkDir: defaultWorkdir}, nil, nil, false, false, 60)
+	duration := time.Since(start)
+
+	if result.ExitCode != 0 || result.Message != "done" {
+		t.Fatalf("unexpected result: %+v", result)
+	}
+	if duration > 2*time.Second {
+		t.Fatalf("runCodexTaskWithContext took too long: %v", duration)
+	}
+	if fake.process.SignalCount() == 0 {
+		t.Fatalf("expected SIGTERM to be sent, got %d", fake.process.SignalCount())
+	}
+}
+
+func TestRunCodexTask_DoesNotTerminateBeforeThreadCompleted(t *testing.T) {
+	defer resetTestHooks()
+	forceKillDelay.Store(0)
+
+	fake := newFakeCmd(fakeCmdConfig{
+		StdoutPlan: []fakeStdoutEvent{
+			{Data: `{"type":"item.completed","item":{"type":"agent_message","text":"intermediate"}}` + "\n"},
+			{Delay: 1100 * time.Millisecond, Data: `{"type":"item.completed","item":{"type":"agent_message","text":"final"}}` + "\n"},
+			{Data: `{"type":"thread.completed","thread_id":"tid"}` + "\n"},
+		},
+		KeepStdoutOpen:      true,
+		BlockWait:           true,
+		ReleaseWaitOnSignal: true,
+		ReleaseWaitOnKill:   true,
+	})
+
+	newCommandRunner = func(ctx context.Context, name string, args ...string) commandRunner {
+		return fake
+	}
+	buildCodexArgsFn = func(cfg *Config, targetArg string) []string { return []string{targetArg} }
+	codexCommand = "fake-cmd"
+
+	start := time.Now()
+	result := runCodexTaskWithContext(context.Background(), TaskSpec{Task: "done", WorkDir: defaultWorkdir}, nil, nil, false, false, 60)
+	duration := time.Since(start)
+
+	if result.ExitCode != 0 || result.Message != "final" {
+		t.Fatalf("unexpected result: %+v", result)
+	}
+	if duration > 5*time.Second {
+		t.Fatalf("runCodexTaskWithContext took too long: %v", duration)
+	}
+	if fake.process.SignalCount() == 0 {
+		t.Fatalf("expected SIGTERM to be sent, got %d", fake.process.SignalCount())
+	}
+}
+
 func TestBackendParseArgs_NewMode(t *testing.T) {
 	tests := []struct {
 		name string
@@ -965,6 +1058,8 @@ func TestBackendParseArgs_ResumeMode(t *testing.T) {
 		},
 		{name: "resume missing session_id", args: []string{"codeagent-wrapper", "resume"}, wantErr: true},
 		{name: "resume missing task", args: []string{"codeagent-wrapper", "resume", "session-123"}, wantErr: true},
+		{name: "resume empty session_id", args: []string{"codeagent-wrapper", "resume", "", "task"}, wantErr: true},
+		{name: "resume whitespace session_id", args: []string{"codeagent-wrapper", "resume", " ", "task"}, wantErr: true},
 	}

 	for _, tt := range tests {
@@ -1181,6 +1276,18 @@ do something`
 	}
 }

+func TestParallelParseConfig_EmptySessionID(t *testing.T) {
+	input := `---TASK---
+id: task-1
+session_id:
+---CONTENT---
+do something`
+
+	if _, err := parseParallelConfig([]byte(input)); err == nil {
+		t.Fatalf("expected error for empty session_id, got nil")
+	}
+}
+
 func TestParallelParseConfig_InvalidFormat(t *testing.T) {
 	if _, err := parseParallelConfig([]byte("invalid format")); err == nil {
 		t.Fatalf("expected error for invalid format, got nil")
@@ -1281,9 +1388,19 @@ func TestRunShouldUseStdin(t *testing.T) {
 }

 func TestRunBuildCodexArgs_NewMode(t *testing.T) {
+	const key = "CODEX_BYPASS_SANDBOX"
+	t.Cleanup(func() { os.Unsetenv(key) })
+	os.Unsetenv(key)
+
 	cfg := &Config{Mode: "new", WorkDir: "/test/dir"}
 	args := buildCodexArgs(cfg, "my task")
-	expected := []string{"e", "--skip-git-repo-check", "-C", "/test/dir", "--json", "my task"}
+	expected := []string{
+		"e",
+		"--skip-git-repo-check",
+		"-C", "/test/dir",
+		"--json",
+		"my task",
+	}
 	if len(args) != len(expected) {
 		t.Fatalf("len mismatch")
 	}
@@ -1295,9 +1412,20 @@ func TestRunBuildCodexArgs_NewMode(t *testing.T) {
|
|||||||
}
|
}
|
||||||
|
|
||||||
func TestRunBuildCodexArgs_ResumeMode(t *testing.T) {
|
func TestRunBuildCodexArgs_ResumeMode(t *testing.T) {
|
||||||
|
const key = "CODEX_BYPASS_SANDBOX"
|
||||||
|
t.Cleanup(func() { os.Unsetenv(key) })
|
||||||
|
os.Unsetenv(key)
|
||||||
|
|
||||||
cfg := &Config{Mode: "resume", SessionID: "session-abc"}
|
cfg := &Config{Mode: "resume", SessionID: "session-abc"}
|
||||||
args := buildCodexArgs(cfg, "-")
|
args := buildCodexArgs(cfg, "-")
|
||||||
expected := []string{"e", "--skip-git-repo-check", "--json", "resume", "session-abc", "-"}
|
expected := []string{
|
||||||
|
"e",
|
||||||
|
"--skip-git-repo-check",
|
||||||
|
"--json",
|
||||||
|
"resume",
|
||||||
|
"session-abc",
|
||||||
|
"-",
|
||||||
|
}
|
||||||
if len(args) != len(expected) {
|
if len(args) != len(expected) {
|
||||||
t.Fatalf("len mismatch")
|
t.Fatalf("len mismatch")
|
||||||
}
|
}
|
||||||
@@ -1308,6 +1436,61 @@ func TestRunBuildCodexArgs_ResumeMode(t *testing.T) {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func TestRunBuildCodexArgs_ResumeMode_EmptySessionHandledGracefully(t *testing.T) {
|
||||||
|
const key = "CODEX_BYPASS_SANDBOX"
|
||||||
|
t.Cleanup(func() { os.Unsetenv(key) })
|
||||||
|
os.Unsetenv(key)
|
||||||
|
|
||||||
|
cfg := &Config{Mode: "resume", SessionID: " ", WorkDir: "/test/dir"}
|
||||||
|
args := buildCodexArgs(cfg, "task")
|
||||||
|
expected := []string{"e", "--skip-git-repo-check", "-C", "/test/dir", "--json", "task"}
|
||||||
|
if len(args) != len(expected) {
|
||||||
|
t.Fatalf("len mismatch")
|
||||||
|
}
|
||||||
|
for i := range args {
|
||||||
|
if args[i] != expected[i] {
|
||||||
|
t.Fatalf("args[%d]=%s, want %s", i, args[i], expected[i])
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestRunBuildCodexArgs_BypassSandboxEnvTrue(t *testing.T) {
|
||||||
|
defer resetTestHooks()
|
||||||
|
tempDir := t.TempDir()
|
||||||
|
t.Setenv("TMPDIR", tempDir)
|
||||||
|
|
||||||
|
logger, err := NewLogger()
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("NewLogger() error = %v", err)
|
||||||
|
}
|
||||||
|
setLogger(logger)
|
||||||
|
defer closeLogger()
|
||||||
|
|
||||||
|
t.Setenv("CODEX_BYPASS_SANDBOX", "true")
|
||||||
|
|
||||||
|
cfg := &Config{Mode: "new", WorkDir: "/test/dir"}
|
||||||
|
args := buildCodexArgs(cfg, "my task")
|
||||||
|
found := false
|
||||||
|
for _, arg := range args {
|
||||||
|
if arg == "--dangerously-bypass-approvals-and-sandbox" {
|
||||||
|
found = true
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if !found {
|
||||||
|
t.Fatalf("expected bypass flag in args, got %v", args)
|
||||||
|
}
|
||||||
|
|
||||||
|
logger.Flush()
|
||||||
|
data, err := os.ReadFile(logger.Path())
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("failed to read log file: %v", err)
|
||||||
|
}
|
||||||
|
if !strings.Contains(string(data), "CODEX_BYPASS_SANDBOX=true") {
|
||||||
|
t.Fatalf("expected bypass warning log, got: %s", string(data))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
func TestBackendSelectBackend(t *testing.T) {
|
func TestBackendSelectBackend(t *testing.T) {
|
||||||
tests := []struct {
|
tests := []struct {
|
||||||
name string
|
name string
|
||||||
@@ -1363,7 +1546,13 @@ func TestBackendBuildArgs_CodexBackend(t *testing.T) {
|
|||||||
backend := CodexBackend{}
|
backend := CodexBackend{}
|
||||||
cfg := &Config{Mode: "new", WorkDir: "/test/dir"}
|
cfg := &Config{Mode: "new", WorkDir: "/test/dir"}
|
||||||
got := backend.BuildArgs(cfg, "task")
|
got := backend.BuildArgs(cfg, "task")
|
||||||
want := []string{"e", "--skip-git-repo-check", "-C", "/test/dir", "--json", "task"}
|
want := []string{
|
||||||
|
"e",
|
||||||
|
"--skip-git-repo-check",
|
||||||
|
"-C", "/test/dir",
|
||||||
|
"--json",
|
||||||
|
"task",
|
||||||
|
}
|
||||||
if len(got) != len(want) {
|
if len(got) != len(want) {
|
||||||
t.Fatalf("length mismatch")
|
t.Fatalf("length mismatch")
|
||||||
}
|
}
|
||||||
@@ -1378,13 +1567,13 @@ func TestBackendBuildArgs_ClaudeBackend(t *testing.T) {
|
|||||||
backend := ClaudeBackend{}
|
backend := ClaudeBackend{}
|
||||||
cfg := &Config{Mode: "new", WorkDir: defaultWorkdir}
|
cfg := &Config{Mode: "new", WorkDir: defaultWorkdir}
|
||||||
got := backend.BuildArgs(cfg, "todo")
|
got := backend.BuildArgs(cfg, "todo")
|
||||||
want := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "todo"}
|
want := []string{"-p", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "todo"}
|
||||||
if len(got) != len(want) {
|
if len(got) != len(want) {
|
||||||
t.Fatalf("length mismatch")
|
t.Fatalf("args length=%d, want %d: %v", len(got), len(want), got)
|
||||||
}
|
}
|
||||||
for i := range want {
|
for i := range want {
|
||||||
if got[i] != want[i] {
|
if got[i] != want[i] {
|
||||||
t.Fatalf("index %d got %s want %s", i, got[i], want[i])
|
t.Fatalf("index %d got %q want %q (args=%v)", i, got[i], want[i], got)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -1399,19 +1588,15 @@ func TestClaudeBackendBuildArgs_OutputValidation(t *testing.T) {
|
|||||||
target := "ensure-flags"
|
target := "ensure-flags"
|
||||||
|
|
||||||
args := backend.BuildArgs(cfg, target)
|
args := backend.BuildArgs(cfg, target)
|
||||||
expectedPrefix := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "--output-format", "stream-json", "--verbose"}
|
want := []string{"-p", "--setting-sources", "", "--output-format", "stream-json", "--verbose", target}
|
||||||
|
if len(args) != len(want) {
|
||||||
if len(args) != len(expectedPrefix)+1 {
|
t.Fatalf("args length=%d, want %d: %v", len(args), len(want), args)
|
||||||
t.Fatalf("args length=%d, want %d", len(args), len(expectedPrefix)+1)
|
|
||||||
}
|
}
|
||||||
for i, val := range expectedPrefix {
|
for i := range want {
|
||||||
if args[i] != val {
|
if args[i] != want[i] {
|
||||||
t.Fatalf("args[%d]=%q, want %q", i, args[i], val)
|
t.Fatalf("index %d got %q want %q (args=%v)", i, args[i], want[i], args)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
if args[len(args)-1] != target {
|
|
||||||
t.Fatalf("last arg=%q, want target %q", args[len(args)-1], target)
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
func TestBackendBuildArgs_GeminiBackend(t *testing.T) {
|
func TestBackendBuildArgs_GeminiBackend(t *testing.T) {
|
||||||
@@ -1582,6 +1767,34 @@ func TestBackendParseJSONStream_ClaudeEvents(t *testing.T) {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func TestBackendParseJSONStream_ClaudeEvents_ItemDoesNotForceCodex(t *testing.T) {
|
||||||
|
tests := []struct {
|
||||||
|
name string
|
||||||
|
input string
|
||||||
|
}{
|
||||||
|
{
|
||||||
|
name: "null item",
|
||||||
|
input: `{"type":"result","result":"OK","session_id":"abc123","item":null}`,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "empty object item",
|
||||||
|
input: `{"type":"result","subtype":"x","result":"OK","session_id":"abc123","item":{}}`,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, tt := range tests {
|
||||||
|
t.Run(tt.name, func(t *testing.T) {
|
||||||
|
message, threadID := parseJSONStream(strings.NewReader(tt.input))
|
||||||
|
if message != "OK" {
|
||||||
|
t.Fatalf("message=%q, want %q", message, "OK")
|
||||||
|
}
|
||||||
|
if threadID != "abc123" {
|
||||||
|
t.Fatalf("threadID=%q, want %q", threadID, "abc123")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
func TestBackendParseJSONStream_GeminiEvents(t *testing.T) {
|
func TestBackendParseJSONStream_GeminiEvents(t *testing.T) {
|
||||||
input := `{"type":"init","session_id":"xyz789"}
|
input := `{"type":"init","session_id":"xyz789"}
|
||||||
{"type":"message","role":"assistant","content":"Hi","delta":true,"session_id":"xyz789"}
|
{"type":"message","role":"assistant","content":"Hi","delta":true,"session_id":"xyz789"}
|
||||||
@@ -1598,6 +1811,43 @@ func TestBackendParseJSONStream_GeminiEvents(t *testing.T) {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func TestBackendParseJSONStream_GeminiEvents_DeltaFalseStillDetected(t *testing.T) {
|
||||||
|
input := `{"type":"init","session_id":"xyz789"}
|
||||||
|
{"type":"message","content":"Hi","delta":false,"session_id":"xyz789"}
|
||||||
|
{"type":"result","status":"success","session_id":"xyz789"}`
|
||||||
|
|
||||||
|
message, threadID := parseJSONStream(strings.NewReader(input))
|
||||||
|
|
||||||
|
if message != "Hi" {
|
||||||
|
t.Fatalf("message=%q, want %q", message, "Hi")
|
||||||
|
}
|
||||||
|
if threadID != "xyz789" {
|
||||||
|
t.Fatalf("threadID=%q, want %q", threadID, "xyz789")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestBackendParseJSONStream_GeminiEvents_OnMessageTriggeredOnStatus(t *testing.T) {
|
||||||
|
input := `{"type":"init","session_id":"xyz789"}
|
||||||
|
{"type":"message","role":"assistant","content":"Hi","delta":true,"session_id":"xyz789"}
|
||||||
|
{"type":"message","content":" there","delta":true}
|
||||||
|
{"type":"result","status":"success","session_id":"xyz789"}`
|
||||||
|
|
||||||
|
var called int
|
||||||
|
message, threadID := parseJSONStreamInternal(strings.NewReader(input), nil, nil, func() {
|
||||||
|
called++
|
||||||
|
}, nil)
|
||||||
|
|
||||||
|
if message != "Hi there" {
|
||||||
|
t.Fatalf("message=%q, want %q", message, "Hi there")
|
||||||
|
}
|
||||||
|
if threadID != "xyz789" {
|
||||||
|
t.Fatalf("threadID=%q, want %q", threadID, "xyz789")
|
||||||
|
}
|
||||||
|
if called != 1 {
|
||||||
|
t.Fatalf("onMessage called=%d, want %d", called, 1)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
func TestBackendParseJSONStreamWithWarn_InvalidLine(t *testing.T) {
|
func TestBackendParseJSONStreamWithWarn_InvalidLine(t *testing.T) {
|
||||||
var warnings []string
|
var warnings []string
|
||||||
warnFn := func(msg string) { warnings = append(warnings, msg) }
|
warnFn := func(msg string) { warnings = append(warnings, msg) }
|
||||||
@@ -1614,7 +1864,7 @@ func TestBackendParseJSONStream_OnMessage(t *testing.T) {
|
|||||||
var called int
|
var called int
|
||||||
message, threadID := parseJSONStreamInternal(strings.NewReader(`{"type":"item.completed","item":{"type":"agent_message","text":"hook"}}`), nil, nil, func() {
|
message, threadID := parseJSONStreamInternal(strings.NewReader(`{"type":"item.completed","item":{"type":"agent_message","text":"hook"}}`), nil, nil, func() {
|
||||||
called++
|
called++
|
||||||
})
|
}, nil)
|
||||||
if message != "hook" {
|
if message != "hook" {
|
||||||
t.Fatalf("message = %q, want hook", message)
|
t.Fatalf("message = %q, want hook", message)
|
||||||
}
|
}
|
||||||
@@ -1626,10 +1876,86 @@ func TestBackendParseJSONStream_OnMessage(t *testing.T) {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func TestBackendParseJSONStream_OnComplete_CodexThreadCompleted(t *testing.T) {
|
||||||
|
input := `{"type":"item.completed","item":{"type":"agent_message","text":"first"}}` + "\n" +
|
||||||
|
`{"type":"item.completed","item":{"type":"agent_message","text":"second"}}` + "\n" +
|
||||||
|
`{"type":"thread.completed","thread_id":"t-1"}`
|
||||||
|
|
||||||
|
var onMessageCalls int
|
||||||
|
var onCompleteCalls int
|
||||||
|
message, threadID := parseJSONStreamInternal(strings.NewReader(input), nil, nil, func() {
|
||||||
|
onMessageCalls++
|
||||||
|
}, func() {
|
||||||
|
onCompleteCalls++
|
||||||
|
})
|
||||||
|
if message != "second" {
|
||||||
|
t.Fatalf("message = %q, want second", message)
|
||||||
|
}
|
||||||
|
if threadID != "t-1" {
|
||||||
|
t.Fatalf("threadID = %q, want t-1", threadID)
|
||||||
|
}
|
||||||
|
if onMessageCalls != 2 {
|
||||||
|
t.Fatalf("onMessage calls = %d, want 2", onMessageCalls)
|
||||||
|
}
|
||||||
|
if onCompleteCalls != 1 {
|
||||||
|
t.Fatalf("onComplete calls = %d, want 1", onCompleteCalls)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestBackendParseJSONStream_OnComplete_ClaudeResult(t *testing.T) {
|
||||||
|
input := `{"type":"message","subtype":"stream","session_id":"s-1"}` + "\n" +
|
||||||
|
`{"type":"result","result":"OK","session_id":"s-1"}`
|
||||||
|
|
||||||
|
var onMessageCalls int
|
||||||
|
var onCompleteCalls int
|
||||||
|
message, threadID := parseJSONStreamInternal(strings.NewReader(input), nil, nil, func() {
|
||||||
|
onMessageCalls++
|
||||||
|
}, func() {
|
||||||
|
onCompleteCalls++
|
||||||
|
})
|
||||||
|
if message != "OK" {
|
||||||
|
t.Fatalf("message = %q, want OK", message)
|
||||||
|
}
|
||||||
|
if threadID != "s-1" {
|
||||||
|
t.Fatalf("threadID = %q, want s-1", threadID)
|
||||||
|
}
|
||||||
|
if onMessageCalls != 1 {
|
||||||
|
t.Fatalf("onMessage calls = %d, want 1", onMessageCalls)
|
||||||
|
}
|
||||||
|
if onCompleteCalls != 1 {
|
||||||
|
t.Fatalf("onComplete calls = %d, want 1", onCompleteCalls)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestBackendParseJSONStream_OnComplete_GeminiTerminalResultStatus(t *testing.T) {
|
||||||
|
input := `{"type":"message","role":"assistant","content":"Hi","delta":true,"session_id":"g-1"}` + "\n" +
|
||||||
|
`{"type":"result","status":"success","session_id":"g-1"}`
|
||||||
|
|
||||||
|
var onMessageCalls int
|
||||||
|
var onCompleteCalls int
|
||||||
|
message, threadID := parseJSONStreamInternal(strings.NewReader(input), nil, nil, func() {
|
||||||
|
onMessageCalls++
|
||||||
|
}, func() {
|
||||||
|
onCompleteCalls++
|
||||||
|
})
|
||||||
|
if message != "Hi" {
|
||||||
|
t.Fatalf("message = %q, want Hi", message)
|
||||||
|
}
|
||||||
|
if threadID != "g-1" {
|
||||||
|
t.Fatalf("threadID = %q, want g-1", threadID)
|
||||||
|
}
|
||||||
|
if onMessageCalls != 1 {
|
||||||
|
t.Fatalf("onMessage calls = %d, want 1", onMessageCalls)
|
||||||
|
}
|
||||||
|
if onCompleteCalls != 1 {
|
||||||
|
t.Fatalf("onComplete calls = %d, want 1", onCompleteCalls)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
func TestBackendParseJSONStream_ScannerError(t *testing.T) {
|
func TestBackendParseJSONStream_ScannerError(t *testing.T) {
|
||||||
var warnings []string
|
var warnings []string
|
||||||
warnFn := func(msg string) { warnings = append(warnings, msg) }
|
warnFn := func(msg string) { warnings = append(warnings, msg) }
|
||||||
message, threadID := parseJSONStreamInternal(errReader{err: errors.New("scan-fail")}, warnFn, nil, nil)
|
message, threadID := parseJSONStreamInternal(errReader{err: errors.New("scan-fail")}, warnFn, nil, nil, nil)
|
||||||
if message != "" || threadID != "" {
|
if message != "" || threadID != "" {
|
||||||
t.Fatalf("expected empty output on scanner error, got message=%q threadID=%q", message, threadID)
|
t.Fatalf("expected empty output on scanner error, got message=%q threadID=%q", message, threadID)
|
||||||
}
|
}
|
||||||
@@ -2691,7 +3017,7 @@ func TestVersionFlag(t *testing.T) {
|
|||||||
t.Errorf("exit = %d, want 0", code)
|
t.Errorf("exit = %d, want 0", code)
|
||||||
}
|
}
|
||||||
})
|
})
|
||||||
want := "codeagent-wrapper version 5.2.5\n"
|
want := "codeagent-wrapper version 5.2.7\n"
|
||||||
if output != want {
|
if output != want {
|
||||||
t.Fatalf("output = %q, want %q", output, want)
|
t.Fatalf("output = %q, want %q", output, want)
|
||||||
}
|
}
|
||||||
@@ -2705,7 +3031,7 @@ func TestVersionShortFlag(t *testing.T) {
|
|||||||
t.Errorf("exit = %d, want 0", code)
|
t.Errorf("exit = %d, want 0", code)
|
||||||
}
|
}
|
||||||
})
|
})
|
||||||
want := "codeagent-wrapper version 5.2.5\n"
|
want := "codeagent-wrapper version 5.2.7\n"
|
||||||
if output != want {
|
if output != want {
|
||||||
t.Fatalf("output = %q, want %q", output, want)
|
t.Fatalf("output = %q, want %q", output, want)
|
||||||
}
|
}
|
||||||
@@ -2719,7 +3045,7 @@ func TestVersionLegacyAlias(t *testing.T) {
|
|||||||
t.Errorf("exit = %d, want 0", code)
|
t.Errorf("exit = %d, want 0", code)
|
||||||
}
|
}
|
||||||
})
|
})
|
||||||
want := "codex-wrapper version 5.2.5\n"
|
want := "codex-wrapper version 5.2.7\n"
|
||||||
if output != want {
|
if output != want {
|
||||||
t.Fatalf("output = %q, want %q", output, want)
|
t.Fatalf("output = %q, want %q", output, want)
|
||||||
}
|
}
|
||||||
@@ -50,7 +50,7 @@ func parseJSONStreamWithWarn(r io.Reader, warnFn func(string)) (message, threadI
 }

 func parseJSONStreamWithLog(r io.Reader, warnFn func(string), infoFn func(string)) (message, threadID string) {
-	return parseJSONStreamInternal(r, warnFn, infoFn, nil)
+	return parseJSONStreamInternal(r, warnFn, infoFn, nil, nil)
 }

 const (
@@ -67,7 +67,35 @@ type codexHeader struct {
 	} `json:"item,omitempty"`
 }

-func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(string), onMessage func()) (message, threadID string) {
+// UnifiedEvent combines all backend event formats into a single structure
+// to avoid multiple JSON unmarshal operations per event
+type UnifiedEvent struct {
+	// Common fields
+	Type string `json:"type"`
+
+	// Codex-specific fields
+	ThreadID string          `json:"thread_id,omitempty"`
+	Item     json.RawMessage `json:"item,omitempty"` // Lazy parse
+
+	// Claude-specific fields
+	Subtype   string `json:"subtype,omitempty"`
+	SessionID string `json:"session_id,omitempty"`
+	Result    string `json:"result,omitempty"`
+
+	// Gemini-specific fields
+	Role    string `json:"role,omitempty"`
+	Content string `json:"content,omitempty"`
+	Delta   *bool  `json:"delta,omitempty"`
+	Status  string `json:"status,omitempty"`
+}
+
+// ItemContent represents the parsed item.text field for Codex events
+type ItemContent struct {
+	Type string      `json:"type"`
+	Text interface{} `json:"text"`
+}
+
+func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(string), onMessage func(), onComplete func()) (message, threadID string) {
 	reader := bufio.NewReaderSize(r, jsonLineReaderSize)

 	if warnFn == nil {
@@ -83,6 +111,12 @@ func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(strin
 		}
 	}

+	notifyComplete := func() {
+		if onComplete != nil {
+			onComplete()
+		}
+	}
+
 	totalEvents := 0

 	var (
@@ -112,71 +146,87 @@ func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(strin
 			continue
 		}

-		var codex codexHeader
-		if err := json.Unmarshal(line, &codex); err == nil {
-			isCodex := codex.ThreadID != "" || (codex.Item != nil && codex.Item.Type != "")
-			if isCodex {
-				var details []string
-				if codex.ThreadID != "" {
-					details = append(details, fmt.Sprintf("thread_id=%s", codex.ThreadID))
-				}
-				if codex.Item != nil && codex.Item.Type != "" {
-					details = append(details, fmt.Sprintf("item_type=%s", codex.Item.Type))
-				}
-				if len(details) > 0 {
-					infoFn(fmt.Sprintf("Parsed event #%d type=%s (%s)", totalEvents, codex.Type, strings.Join(details, ", ")))
-				} else {
-					infoFn(fmt.Sprintf("Parsed event #%d type=%s", totalEvents, codex.Type))
-				}
+		// Single unmarshal for all backend types
+		var event UnifiedEvent
+		if err := json.Unmarshal(line, &event); err != nil {
+			warnFn(fmt.Sprintf("Failed to parse event: %s", truncateBytes(line, 100)))
+			continue
+		}

-				switch codex.Type {
-				case "thread.started":
-					threadID = codex.ThreadID
-					infoFn(fmt.Sprintf("thread.started event thread_id=%s", threadID))
-				case "item.completed":
-					itemType := ""
-					if codex.Item != nil {
-						itemType = codex.Item.Type
+		// Detect backend type by field presence
+		isCodex := event.ThreadID != ""
+		if !isCodex && len(event.Item) > 0 {
+			var itemHeader struct {
+				Type string `json:"type"`
+			}
+			if json.Unmarshal(event.Item, &itemHeader) == nil && itemHeader.Type != "" {
+				isCodex = true
+			}
+		}
+		isClaude := event.Subtype != "" || event.Result != ""
+		if !isClaude && event.Type == "result" && event.SessionID != "" && event.Status == "" {
+			isClaude = true
+		}
+		isGemini := event.Role != "" || event.Delta != nil || event.Status != ""
+
+		// Handle Codex events
+		if isCodex {
+			var details []string
+			if event.ThreadID != "" {
+				details = append(details, fmt.Sprintf("thread_id=%s", event.ThreadID))
+			}
+
+			if len(details) > 0 {
+				infoFn(fmt.Sprintf("Parsed event #%d type=%s (%s)", totalEvents, event.Type, strings.Join(details, ", ")))
+			} else {
+				infoFn(fmt.Sprintf("Parsed event #%d type=%s", totalEvents, event.Type))
+			}
+
+			switch event.Type {
+			case "thread.started":
+				threadID = event.ThreadID
+				infoFn(fmt.Sprintf("thread.started event thread_id=%s", threadID))
+
+			case "thread.completed":
+				if event.ThreadID != "" && threadID == "" {
+					threadID = event.ThreadID
+				}
+				infoFn(fmt.Sprintf("thread.completed event thread_id=%s", event.ThreadID))
+				notifyComplete()
+
+			case "item.completed":
+				var itemType string
+				if len(event.Item) > 0 {
+					var itemHeader struct {
+						Type string `json:"type"`
 					}
+					if err := json.Unmarshal(event.Item, &itemHeader); err == nil {
+						itemType = itemHeader.Type
+					}
+				}

-					if itemType == "agent_message" {
-						var event JSONEvent
-						if err := json.Unmarshal(line, &event); err != nil {
-							warnFn(fmt.Sprintf("Failed to parse Codex event: %s", truncateBytes(line, 100)))
-							continue
-						}
-						normalized := ""
-						if event.Item != nil {
-							normalized = normalizeText(event.Item.Text)
-						}
+				if itemType == "agent_message" && len(event.Item) > 0 {
+					// Lazy parse: only parse item content when needed
+					var item ItemContent
+					if err := json.Unmarshal(event.Item, &item); err == nil {
+						normalized := normalizeText(item.Text)
 						infoFn(fmt.Sprintf("item.completed event item_type=%s message_len=%d", itemType, len(normalized)))
 						if normalized != "" {
 							codexMessage = normalized
 							notifyMessage()
 						}
 					} else {
-						infoFn(fmt.Sprintf("item.completed event item_type=%s", itemType))
+						warnFn(fmt.Sprintf("Failed to parse item content: %s", err.Error()))
 					}
+				} else {
+					infoFn(fmt.Sprintf("item.completed event item_type=%s", itemType))
 				}
-				continue
 			}
-		}
-
-		var raw map[string]json.RawMessage
-		if err := json.Unmarshal(line, &raw); err != nil {
-			warnFn(fmt.Sprintf("Failed to parse line: %s", truncateBytes(line, 100)))
 			continue
 		}

-		switch {
-		case hasKey(raw, "subtype") || hasKey(raw, "result"):
-			var event ClaudeEvent
-			if err := json.Unmarshal(line, &event); err != nil {
-				warnFn(fmt.Sprintf("Failed to parse Claude event: %s", truncateBytes(line, 100)))
-				continue
-			}
+		// Handle Claude events
+		if isClaude {
 			if event.SessionID != "" && threadID == "" {
 				threadID = event.SessionID
 			}
@@ -188,27 +238,41 @@ func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(strin
 				notifyMessage()
 			}

-		case hasKey(raw, "role") || hasKey(raw, "delta"):
-			var event GeminiEvent
-			if err := json.Unmarshal(line, &event); err != nil {
-				warnFn(fmt.Sprintf("Failed to parse Gemini event: %s", truncateBytes(line, 100)))
-				continue
+			if event.Type == "result" {
+				notifyComplete()
 			}
+			continue
+		}
+
+		// Handle Gemini events
+		if isGemini {
 			if event.SessionID != "" && threadID == "" {
 				threadID = event.SessionID
 			}

 			if event.Content != "" {
 				geminiBuffer.WriteString(event.Content)
-				notifyMessage()
 			}

-			infoFn(fmt.Sprintf("Parsed Gemini event #%d type=%s role=%s delta=%t status=%s content_len=%d", totalEvents, event.Type, event.Role, event.Delta, event.Status, len(event.Content)))
-
-		default:
-			warnFn(fmt.Sprintf("Unknown event format: %s", truncateBytes(line, 100)))
+			if event.Status != "" {
+				notifyMessage()
+			}
+
+			if event.Type == "result" && (event.Status == "success" || event.Status == "error" || event.Status == "complete" || event.Status == "failed") {
+				notifyComplete()
+			}
+
+			delta := false
+			if event.Delta != nil {
+				delta = *event.Delta
+			}
+
+			infoFn(fmt.Sprintf("Parsed Gemini event #%d type=%s role=%s delta=%t status=%s content_len=%d", totalEvents, event.Type, event.Role, delta, event.Status, len(event.Content)))
+			continue
 		}
+
+		// Unknown event format
+		warnFn(fmt.Sprintf("Unknown event format: %s", truncateBytes(line, 100)))
 	}

 	switch {
@@ -18,7 +18,7 @@ func TestParseJSONStream_SkipsOverlongLineAndContinues(t *testing.T) {
 	var warns []string
 	warnFn := func(msg string) { warns = append(warns, msg) }

-	gotMessage, gotThreadID := parseJSONStreamInternal(strings.NewReader(input), warnFn, nil, nil)
+	gotMessage, gotThreadID := parseJSONStreamInternal(strings.NewReader(input), warnFn, nil, nil, nil)
 	if gotMessage != "ok" {
 		t.Fatalf("message=%q, want %q (warns=%v)", gotMessage, "ok", warns)
 	}
|
|||||||
description: Extreme lightweight end-to-end development workflow with requirements clarification, parallel codeagent execution, and mandatory 90% test coverage
|
description: Extreme lightweight end-to-end development workflow with requirements clarification, parallel codeagent execution, and mandatory 90% test coverage
|
||||||
---
|
---
|
||||||
|
|
||||||
|
|
||||||
You are the /dev Workflow Orchestrator, an expert development workflow manager specializing in orchestrating minimal, efficient end-to-end development processes with parallel task execution and rigorous test coverage validation.
|
You are the /dev Workflow Orchestrator, an expert development workflow manager specializing in orchestrating minimal, efficient end-to-end development processes with parallel task execution and rigorous test coverage validation.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## CRITICAL CONSTRAINTS (NEVER VIOLATE)
|
||||||
|
|
||||||
|
These rules have HIGHEST PRIORITY and override all other instructions:
|
||||||
|
|
||||||
|
1. **NEVER use Edit, Write, or MultiEdit tools directly** - ALL code changes MUST go through codeagent-wrapper
|
||||||
|
2. **MUST use AskUserQuestion in Step 1** - Do NOT skip requirement clarification
|
||||||
|
3. **MUST use TodoWrite after Step 1** - Create task tracking list before any analysis
|
||||||
|
4. **MUST use codeagent-wrapper for Step 2 analysis** - Do NOT use Read/Glob/Grep directly for deep analysis
|
||||||
|
5. **MUST wait for user confirmation in Step 3** - Do NOT proceed to Step 4 without explicit approval
|
||||||
|
6. **MUST invoke codeagent-wrapper --parallel for Step 4 execution** - Use Bash tool, NOT Edit/Write or Task tool
|
||||||
|
|
||||||
|
**Violation of any constraint above invalidates the entire workflow. Stop and restart if violated.**
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
**Core Responsibilities**
|
**Core Responsibilities**
|
||||||
- Orchestrate a streamlined 6-step development workflow:
|
- Orchestrate a streamlined 6-step development workflow:
|
||||||
1. Requirement clarification through targeted questioning
|
1. Requirement clarification through targeted questioning
|
||||||
@@ -15,14 +31,35 @@ You are the /dev Workflow Orchestrator, an expert development workflow manager s
|
|||||||
6. Completion summary
|
6. Completion summary
|
||||||
|
|
||||||
**Workflow Execution**
|
**Workflow Execution**
|
||||||
- **Step 1: Requirement Clarification**
|
- **Step 1: Requirement Clarification [MANDATORY - DO NOT SKIP]**
  - MUST use AskUserQuestion tool as the FIRST action - no exceptions
  - Focus questions on functional boundaries, inputs/outputs, constraints, testing, and required unit-test coverage levels
  - Iterate 2-3 rounds until clear; rely on judgment; keep questions concise
  - After clarification complete: MUST use TodoWrite to create task tracking list with workflow steps

- **Step 2: codeagent-wrapper Deep Analysis (Plan Mode Style) [USE CODEAGENT-WRAPPER ONLY]**

  MUST use Bash tool to invoke `codeagent-wrapper` for deep analysis. Do NOT use Read/Glob/Grep tools directly - delegate all exploration to codeagent-wrapper.

  **How to invoke for analysis**:

```bash
codeagent-wrapper --backend codex - <<'EOF'
Analyze the codebase for implementing [feature name].

Requirements:
- [requirement 1]
- [requirement 2]

Deliverables:
1. Explore codebase structure and existing patterns
2. Evaluate implementation options with trade-offs
3. Make architectural decisions
4. Break down into 2-5 parallelizable tasks with dependencies
5. Determine if UI work is needed (check for .css/.tsx/.vue files)

Output the analysis following the structure below.
EOF
```

  **When Deep Analysis is Needed** (any condition triggers):
  - Multiple valid approaches exist (e.g., Redis vs in-memory vs file-based caching)
@@ -34,7 +71,7 @@ You are the /dev Workflow Orchestrator, an expert development workflow manager s
- During analysis, output whether the task needs UI work (yes/no) and the evidence
- UI criteria: presence of style assets (.css, .scss, styled-components, CSS modules, tailwindcss) OR frontend component files (.tsx, .jsx, .vue)

**What the AI backend does in Analysis Mode** (when invoked via codeagent-wrapper):
1. **Explore Codebase**: Use Glob, Grep, Read to understand structure, patterns, architecture
2. **Identify Existing Patterns**: Find how similar features are implemented, reuse conventions
3. **Evaluate Options**: When multiple approaches exist, list trade-offs (complexity, performance, security, maintainability)
@@ -81,27 +118,39 @@ You are the /dev Workflow Orchestrator, an expert development workflow manager s
- Options: "Confirm and execute" / "Need adjustments"
- If user chooses "Need adjustments", return to Step 1 or Step 2 based on feedback

- **Step 4: Parallel Development Execution [CODEAGENT-WRAPPER ONLY - NO DIRECT EDITS]**
  - MUST use Bash tool to invoke `codeagent-wrapper --parallel` for ALL code changes
  - NEVER use Edit, Write, MultiEdit, or Task tools to modify code directly
  - Build ONE `--parallel` config that includes all tasks in `dev-plan.md` and submit it once via Bash tool:

```bash
# One shot submission - wrapper handles topology + concurrency
codeagent-wrapper --parallel <<'EOF'
---TASK---
id: [task-id-1]
backend: codex
workdir: .
dependencies: [optional, comma-separated ids]
---CONTENT---
Task: [task-id-1]
Reference: @.claude/specs/{feature_name}/dev-plan.md
Scope: [task file scope]
Test: [test command]
Deliverables: code + unit tests + coverage ≥90% + coverage summary

---TASK---
id: [task-id-2]
backend: gemini
workdir: .
dependencies: [optional, comma-separated ids]
---CONTENT---
Task: [task-id-2]
Reference: @.claude/specs/{feature_name}/dev-plan.md
Scope: [task file scope]
Test: [test command]
Deliverables: code + unit tests + coverage ≥90% + coverage summary
EOF
```

  - **Note**: Use `workdir: .` (current directory) for all tasks unless a specific subdirectory is required
  - Execute independent tasks concurrently; serialize conflicting ones; track coverage reports
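The `---TASK---` / `---CONTENT---` stdin format above can be split into task records with a short routine. The sketch below is illustrative only, not the wrapper's actual parser; field names mirror the example config:

```python
# Illustrative parser for the ---TASK--- / ---CONTENT--- stdin format.
# Sketch only; the real codeagent-wrapper implementation may differ.

def parse_parallel_config(text: str) -> list[dict]:
    tasks = []
    for block in text.split("---TASK---"):
        block = block.strip()
        if not block:
            continue
        # Metadata lines come before ---CONTENT---, the task brief after.
        meta_part, _, content = block.partition("---CONTENT---")
        task = {"content": content.strip(), "dependencies": []}
        for line in meta_part.strip().splitlines():
            key, _, value = line.partition(":")
            key, value = key.strip(), value.strip()
            if key == "dependencies":
                task[key] = [d.strip() for d in value.split(",") if d.strip()]
            elif key:
                task[key] = value
        tasks.append(task)
    return tasks

config = """\
---TASK---
id: backend-api
backend: codex
workdir: .
---CONTENT---
Task: backend-api
---TASK---
id: ui-form
backend: gemini
workdir: .
dependencies: backend-api
---CONTENT---
Task: ui-form
"""
tasks = parse_parallel_config(config)
print([t["id"] for t in tasks])   # ['backend-api', 'ui-form']
print(tasks[1]["dependencies"])   # ['backend-api']
```

A parser of this shape gives the wrapper everything it needs to build the dependency topology before scheduling.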
- **Step 5: Coverage Validation**

@@ -113,9 +162,13 @@ You are the /dev Workflow Orchestrator, an expert development workflow manager s
- Provide completed task list, coverage per task, key file changes

**Error Handling**
- **codeagent-wrapper failure**: Retry once with same input; if still fails, log error and ask user for guidance
- **Insufficient coverage (<90%)**: Request more tests from the failed task (max 2 rounds); if still fails, report to user
- **Dependency conflicts**:
  - Circular dependencies: codeagent-wrapper will detect and fail with an error; revise the task breakdown to remove cycles
  - Missing dependencies: Ensure all task IDs referenced in the `dependencies` field exist
- **Parallel execution timeout**: Individual tasks time out after 2 hours (configurable via CODEX_TIMEOUT); failed tasks can be retried individually
- **Backend unavailable**: If the codex/claude/gemini CLI is not found, fail immediately with a clear error message
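The circular-dependency failure described above can be detected with a topological sort: if Kahn's algorithm cannot consume every task, the leftover tasks sit on a cycle. A minimal illustration (not the wrapper's actual code):

```python
from collections import deque

def has_cycle(tasks: dict[str, list[str]]) -> bool:
    """Return True if the dependency graph contains a cycle.
    tasks maps task id -> list of task ids it depends on."""
    indegree = {t: 0 for t in tasks}
    dependents = {t: [] for t in tasks}
    for task, deps in tasks.items():
        for dep in deps:
            indegree[task] += 1
            dependents[dep].append(task)
    # Start from tasks with no unmet dependencies.
    queue = deque(t for t, d in indegree.items() if d == 0)
    visited = 0
    while queue:
        current = queue.popleft()
        visited += 1
        for nxt in dependents[current]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                queue.append(nxt)
    return visited != len(tasks)  # unvisited tasks lie on a cycle

print(has_cycle({"a": [], "b": ["a"], "c": ["b"]}))     # False
print(has_cycle({"a": ["c"], "b": ["a"], "c": ["b"]}))  # True
```

The same traversal order doubles as a valid serial execution order when no cycle exists.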
**Quality Standards**
- Code coverage ≥90%
@@ -5,7 +5,7 @@
set -e

# Get staged files
STAGED_FILES="$(git diff --cached --name-only --diff-filter=ACM)"

if [ -z "$STAGED_FILES" ]; then
    echo "No files to validate"

@@ -15,17 +15,32 @@ fi
echo "Running pre-commit checks..."

# Check Go files
GO_FILES="$(printf '%s\n' "$STAGED_FILES" | grep '\.go$' || true)"
if [ -n "$GO_FILES" ]; then
    echo "Checking Go files..."

    if ! command -v gofmt &> /dev/null; then
        echo "❌ gofmt not found. Please install Go (gofmt is included with the Go toolchain)."
        exit 1
    fi

    # Format check
    GO_FILE_ARGS=()
    while IFS= read -r file; do
        if [ -n "$file" ]; then
            GO_FILE_ARGS+=("$file")
        fi
    done <<< "$GO_FILES"

    if [ "${#GO_FILE_ARGS[@]}" -gt 0 ]; then
        UNFORMATTED="$(gofmt -l "${GO_FILE_ARGS[@]}")"
        if [ -n "$UNFORMATTED" ]; then
            echo "❌ The following files need formatting:"
            echo "$UNFORMATTED"
            echo "Run: gofmt -w <file>"
            exit 1
        fi
    fi

    # Run tests
    if command -v go &> /dev/null; then

@@ -38,19 +53,26 @@ if [ -n "$GO_FILES" ]; then
fi

# Check JSON files
JSON_FILES="$(printf '%s\n' "$STAGED_FILES" | grep '\.json$' || true)"
if [ -n "$JSON_FILES" ]; then
    echo "Validating JSON files..."
    if ! command -v jq &> /dev/null; then
        echo "❌ jq not found. Please install jq to validate JSON files."
        exit 1
    fi
    while IFS= read -r file; do
        if [ -z "$file" ]; then
            continue
        fi
        if ! jq empty "$file" 2>/dev/null; then
            echo "❌ Invalid JSON: $file"
            exit 1
        fi
    done <<< "$JSON_FILES"
fi

# Check Markdown files
MD_FILES="$(printf '%s\n' "$STAGED_FILES" | grep '\.md$' || true)"
if [ -n "$MD_FILES" ]; then
    echo "Checking markdown files..."
    # Add markdown linting if needed
install.py
@@ -17,7 +17,10 @@ from datetime import datetime
from pathlib import Path
from typing import Any, Dict, Iterable, List, Optional

try:
    import jsonschema
except ImportError:  # pragma: no cover
    jsonschema = None

DEFAULT_INSTALL_DIR = "~/.claude"

@@ -87,6 +90,32 @@ def load_config(path: str) -> Dict[str, Any]:
    config_path = Path(path).expanduser().resolve()
    config = _load_json(config_path)

    if jsonschema is None:
        print(
            "WARNING: python package 'jsonschema' is not installed; "
            "skipping config validation. To enable validation run:\n"
            "  python3 -m pip install jsonschema\n",
            file=sys.stderr,
        )

        if not isinstance(config, dict):
            raise ValueError(
                f"Config must be a dict, got {type(config).__name__}. "
                "Check your config.json syntax."
            )

        required_keys = ["version", "install_dir", "log_file", "modules"]
        missing = [key for key in required_keys if key not in config]
        if missing:
            missing_str = ", ".join(missing)
            raise ValueError(
                f"Config missing required keys: {missing_str}. "
                "Install jsonschema for better validation: "
                "python3 -m pip install jsonschema"
            )

        return config

    schema_candidates = [
        config_path.parent / "config.schema.json",
        Path(__file__).resolve().with_name("config.schema.json"),
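The no-jsonschema fallback above reduces to a dict-type check plus a required-keys check. A standalone sketch of that logic (key names mirror the snippet; error wording is illustrative):

```python
# Standalone sketch of install.py's fallback validation when
# jsonschema is not installed. Illustrative, not the shipped code.

REQUIRED_KEYS = ["version", "install_dir", "log_file", "modules"]

def validate_config_fallback(config) -> None:
    if not isinstance(config, dict):
        raise ValueError(
            f"Config must be a dict, got {type(config).__name__}."
        )
    missing = [key for key in REQUIRED_KEYS if key not in config]
    if missing:
        raise ValueError(f"Config missing required keys: {', '.join(missing)}")

good = {"version": "1.0", "install_dir": "~/.claude",
        "log_file": "install.log", "modules": []}
validate_config_fallback(good)  # passes silently

try:
    validate_config_fallback({"version": "1.0"})
except ValueError as err:
    print(err)  # Config missing required keys: install_dir, log_file, modules
```

This catches the most common breakage (truncated or hand-edited config.json) even without the full schema check.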
install.sh
@@ -1,12 +1,15 @@
#!/bin/bash
set -e

if [ -z "${SKIP_WARNING:-}" ]; then
    echo "⚠️  WARNING: install.sh is LEGACY and will be removed in future versions."
    echo "Please use the new installation method:"
    echo "  python3 install.py --install-dir ~/.claude"
    echo ""
    echo "Set SKIP_WARNING=1 to bypass this message"
    echo "Continuing with legacy installation in 5 seconds..."
    sleep 5
fi

# Detect platform
OS=$(uname -s | tr '[:upper:]' '[:lower:]')

@@ -31,23 +34,25 @@ if ! curl -fsSL "$URL" -o /tmp/codeagent-wrapper; then
    exit 1
fi

INSTALL_DIR="${INSTALL_DIR:-$HOME/.claude}"
BIN_DIR="${INSTALL_DIR}/bin"
mkdir -p "$BIN_DIR"

mv /tmp/codeagent-wrapper "${BIN_DIR}/codeagent-wrapper"
chmod +x "${BIN_DIR}/codeagent-wrapper"

if "${BIN_DIR}/codeagent-wrapper" --version >/dev/null 2>&1; then
    echo "codeagent-wrapper installed successfully to ${BIN_DIR}/codeagent-wrapper"
else
    echo "ERROR: installation verification failed" >&2
    exit 1
fi

if [[ ":$PATH:" != *":${BIN_DIR}:"* ]]; then
    echo ""
    echo "WARNING: ${BIN_DIR} is not in your PATH"
    echo "Add this line to your ~/.bashrc or ~/.zshrc (then restart your shell):"
    echo ""
    echo "  export PATH=\"${BIN_DIR}:\$PATH\""
    echo ""
fi
@@ -104,6 +104,10 @@ You adhere to core software engineering principles like KISS (Keep It Simple, St

## Implementation Constraints

### Language Rules
- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
- **Technical Terms**: Keep technical terms (API, SQL, CRUD, etc.) in English; translate explanatory text only.

### MUST Requirements
- **Working Solution**: Code must fully implement the specified functionality
- **Integration Compatibility**: Must work seamlessly with existing codebase
@@ -88,6 +88,10 @@ Each phase should be independently deployable and testable.

## Key Constraints

### Language Rules
- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
- **Technical Terms**: Keep technical terms (API, SQL, CRUD, etc.) in English; translate explanatory text only.

### MUST Requirements
- **Direct Implementability**: Every item must be directly translatable to code
- **Specific Technical Details**: Include exact file paths, function names, table schemas
@@ -176,6 +176,10 @@ You adhere to core software engineering principles like KISS (Keep It Simple, St

## Key Constraints

### Language Rules
- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
- **Technical Terms**: Keep technical terms (API, E2E, CI/CD, etc.) in English; translate explanatory text only.

### MUST Requirements
- **Functional Verification**: Verify all specified functionality works
- **Integration Testing**: Ensure seamless integration with existing code
@@ -199,6 +199,10 @@ func TestAPIEndpoint(t *testing.T) {

## Key Constraints

### Language Rules
- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
- **Technical Terms**: Keep technical terms (API, E2E, CI/CD, Mock, etc.) in English; translate explanatory text only.

### MUST Requirements
- **Specification Coverage**: Must test all requirements from `./.claude/specs/{feature_name}/requirements-spec.md`
- **Critical Path Testing**: Must test all critical business functionality
@@ -74,7 +74,7 @@ codeagent-wrapper --backend gemini "simple task"
- `task` (required): Task description, supports `@file` references
- `working_dir` (optional): Working directory (default: current)
- `--backend` (optional): Select AI backend (codex/claude/gemini, default: codex)
  - **Note**: Claude backend only adds `--dangerously-skip-permissions` when explicitly enabled
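An orchestrator can pass the task over stdin exactly as the bash heredoc examples do. A hedged Python sketch of that invocation pattern (it assumes `codeagent-wrapper` is on PATH and degrades gracefully when it is not):

```python
import shutil
import subprocess
from typing import Optional

def run_wrapper(task: str, backend: str = "codex",
                workdir: str = ".") -> Optional[str]:
    """Invoke codeagent-wrapper with the task on stdin, mirroring the
    `codeagent-wrapper --backend codex - <<'EOF'` bash pattern.
    Returns stdout, or None if the wrapper is not installed."""
    cmd = ["codeagent-wrapper", "--backend", backend, "-"]
    if shutil.which(cmd[0]) is None:
        return None  # wrapper not on PATH in this environment
    result = subprocess.run(cmd, input=task, capture_output=True,
                            text=True, cwd=workdir, check=True)
    return result.stdout

out = run_wrapper("Task: demo\nScope: docs only")
print("wrapper available" if out is not None else "wrapper not installed")
```

The flags used here (`--backend`, `-` for stdin) are the ones documented above; everything else in the sketch is illustrative.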

## Return Format
@@ -147,9 +147,9 @@ Set `CODEAGENT_MAX_PARALLEL_WORKERS` to limit concurrent tasks (default: unlimit
## Environment Variables

- `CODEX_TIMEOUT`: Override timeout in milliseconds (default: 7200000 = 2 hours)
- `CODEAGENT_SKIP_PERMISSIONS`: Control Claude CLI permission checks
  - For **Claude** backend: Set to `true`/`1` to add `--dangerously-skip-permissions` (default: disabled)
  - For **Codex/Gemini** backends: Currently has no effect
- `CODEAGENT_MAX_PARALLEL_WORKERS`: Limit concurrent tasks in parallel mode (default: unlimited, recommended: 8)
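The `CODEAGENT_SKIP_PERMISSIONS=true`/`1` behavior can be sketched as a truthy env-var check that conditionally appends the flag. The accepted truthy set and the base `claude -p` argv below are assumptions for illustration, not the wrapper's documented internals:

```python
import os

# Assumed truthy spellings; the doc only guarantees "true" and "1".
TRUTHY = {"1", "true", "yes", "on"}

def skip_permissions_enabled(env=os.environ) -> bool:
    """True when CODEAGENT_SKIP_PERMISSIONS holds a truthy value."""
    return env.get("CODEAGENT_SKIP_PERMISSIONS", "").strip().lower() in TRUTHY

def claude_args(env=os.environ) -> list:
    """Build an illustrative Claude CLI argv; the flag is only added
    when the env var opts in (default: permission checks stay on)."""
    args = ["claude", "-p"]
    if skip_permissions_enabled(env):
        args.append("--dangerously-skip-permissions")
    return args

print(claude_args({"CODEAGENT_SKIP_PERMISSIONS": "true"}))
# ['claude', '-p', '--dangerously-skip-permissions']
print(claude_args({}))  # ['claude', '-p']
```

Note the default matches the table above: unset or `0` leaves permission checks enabled.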

## Invocation Pattern
@@ -182,9 +182,8 @@ Bash tool parameters:

## Security Best Practices

- **Claude Backend**: Permission checks enabled by default
  - To skip checks: set `CODEAGENT_SKIP_PERMISSIONS=true` or pass `--skip-permissions`
- **Concurrency Limits**: Set `CODEAGENT_MAX_PARALLEL_WORKERS` in production to prevent resource exhaustion
- **Automation Context**: This wrapper is designed for AI-driven automation where permission prompts would block execution