Compare commits

..

2 Commits

Author: cexll
Commit: 4dd735034e
fix: parser critical bugs and add PR #86 compatibility
Fixes 3 critical bugs in parser.go:

1. Gemini detection: Change Delta from bool to *bool for proper field
   presence detection. Fixes issue where delta:false events were lost.

2. Codex detection: Tighten logic to only classify as Codex if
   thread_id exists OR item.type is non-empty. Prevents Claude events
   with item:null/item:{} from being misclassified and dropped.

3. Performance: Move itemHeader parsing inside item.completed case,
   only parse when actually needed.

Additional changes:
- Implement PR #86's Gemini notifyMessage trigger on Status field
  instead of Content, resolving parser.go conflict between PRs
- Add regression tests for all fixed scenarios
- All tests pass: go test ./...

Co-authored-by: Codeagent (Codex)

Generated with SWE-Agent.ai

Co-Authored-By: SWE-Agent.ai <noreply@swe-agent.ai>
Date: 2025-12-21 13:55:48 +08:00

Author: 徐文彬
Commit: a08dd62b59
Optimize Parser duplicate-parsing issue
Date: 2025-12-21 00:29:07 +08:00
19 changed files with 146 additions and 1009 deletions

View File

@@ -97,6 +97,11 @@ jobs:
         with:
           path: artifacts
+      - name: Setup Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: '20'
       - name: Prepare release files
         run: |
           mkdir -p release
@@ -104,10 +109,26 @@ jobs:
           cp install.sh install.bat release/
           ls -la release/
+      - name: Generate release notes with git-cliff
+        run: |
+          # Install git-cliff via npx
+          npx git-cliff@latest --current --strip all -o release_notes.md
+          # Fallback if generation failed
+          if [ ! -s release_notes.md ]; then
+            echo "⚠️ Failed to generate release notes with git-cliff" > release_notes.md
+            echo "" >> release_notes.md
+            echo "## What's Changed" >> release_notes.md
+            echo "See commits in this release for details." >> release_notes.md
+          fi
+          echo "--- Generated Release Notes ---"
+          cat release_notes.md
       - name: Create Release
         uses: softprops/action-gh-release@v2
         with:
           files: release/*
-          generate_release_notes: true
+          body_path: release_notes.md
           draft: false
           prerelease: false

README.md
View File

@@ -132,59 +132,6 @@ Requirements → Architecture → Sprint Plan → Development → Review → QA
 ---
-## Version Requirements
-
-### Codex CLI
-
-**Minimum version:** Check compatibility with your installation
-
-The codeagent-wrapper uses these Codex CLI features:
-- `codex e` - Execute commands (shorthand for `codex exec`)
-- `--skip-git-repo-check` - Skip git repository validation
-- `--json` - JSON stream output format
-- `-C <workdir>` - Set working directory
-- `resume <session_id>` - Resume previous sessions
-
-**Verify Codex CLI is installed:**
-```bash
-which codex
-codex --version
-```
-
-### Claude CLI
-
-**Minimum version:** Check compatibility with your installation
-
-Required features:
-- `--output-format stream-json` - Streaming JSON output format
-- `--setting-sources` - Control setting sources (prevents infinite recursion)
-- `--dangerously-skip-permissions` - Skip permission prompts (use with caution)
-- `-p` - Prompt input flag
-- `-r <session_id>` - Resume sessions
-
-**Security Note:** The wrapper only adds `--dangerously-skip-permissions` for Claude when explicitly enabled (e.g. `--skip-permissions` / `CODEAGENT_SKIP_PERMISSIONS=true`). Keep it disabled unless you understand the risk.
-
-**Verify Claude CLI is installed:**
-```bash
-which claude
-claude --version
-```
-
-### Gemini CLI
-
-**Minimum version:** Check compatibility with your installation
-
-Required features:
-- `-o stream-json` - JSON stream output format
-- `-y` - Auto-approve prompts (non-interactive mode)
-- `-r <session_id>` - Resume sessions
-- `-p` - Prompt input flag
-
-**Verify Gemini CLI is installed:**
-```bash
-which gemini
-gemini --version
-```
-
----
 ## Installation
 
 ### Modular Installation (Recommended)
@@ -216,39 +163,15 @@ python3 install.py --force
 ```
 ~/.claude/
-├── bin/
-│   └── codeagent-wrapper       # Main executable
-├── CLAUDE.md                   # Core instructions and role definition
-├── commands/                   # Slash commands (/dev, /code, etc.)
-├── agents/                     # Agent definitions
+├── CLAUDE.md                   # Core instructions and role definition
+├── commands/                   # Slash commands (/dev, /code, etc.)
+├── agents/                     # Agent definitions
 ├── skills/
 │   └── codex/
 │       └── SKILL.md            # Codex integration skill
-├── config.json                 # Configuration
-└── installed_modules.json      # Installation status
+└── installed_modules.json      # Installation status
 ```
-
-### Customizing Installation Directory
-
-By default, myclaude installs to `~/.claude`. You can customize this using the `INSTALL_DIR` environment variable:
-
-```bash
-# Install to custom directory
-INSTALL_DIR=/opt/myclaude bash install.sh
-
-# Update your PATH accordingly
-export PATH="/opt/myclaude/bin:$PATH"
-```
-
-**Directory Structure:**
-- `$INSTALL_DIR/bin/` - codeagent-wrapper binary
-- `$INSTALL_DIR/skills/` - Skill definitions
-- `$INSTALL_DIR/config.json` - Configuration file
-- `$INSTALL_DIR/commands/` - Slash command definitions
-- `$INSTALL_DIR/agents/` - Agent definitions
-
-**Note:** When using a custom installation directory, ensure that `$INSTALL_DIR/bin` is added to your `PATH` environment variable.
-
 ### Configuration
 
 Edit `config.json` to customize:
@@ -372,7 +295,7 @@ setx PATH "%USERPROFILE%\bin;%PATH%"
 **Codex wrapper not found:**
 ```bash
 # Check PATH
-echo $PATH | grep -q "$HOME/.claude/bin" || echo 'export PATH="$HOME/.claude/bin:$PATH"' >> ~/.zshrc
+echo $PATH | grep -q "$HOME/bin" || echo 'export PATH="$HOME/bin:$PATH"' >> ~/.zshrc
 
 # Reinstall
 bash install.sh
@@ -392,71 +315,6 @@ cat ~/.claude/installed_modules.json
 python3 install.py --module dev --force
 ```
-
-### Version Compatibility Issues
-
-**Backend CLI not found:**
-```bash
-# Check if backend CLIs are installed
-which codex
-which claude
-which gemini
-
-# Install missing backends
-# Codex: Follow installation instructions at https://codex.docs
-# Claude: Follow installation instructions at https://claude.ai/docs
-# Gemini: Follow installation instructions at https://ai.google.dev/docs
-```
-
-**Unsupported CLI flags:**
-```bash
-# If you see errors like "unknown flag" or "invalid option"
-# Check backend CLI version
-codex --version
-claude --version
-gemini --version
-
-# For Codex: Ensure it supports `e`, `--skip-git-repo-check`, `--json`, `-C`, and `resume`
-# For Claude: Ensure it supports `--output-format stream-json`, `--setting-sources`, `-r`
-# For Gemini: Ensure it supports `-o stream-json`, `-y`, `-r`, `-p`
-# Update your backend CLI to the latest version if needed
-```
-
-**JSON parsing errors:**
-```bash
-# If you see "failed to parse JSON output" errors
-# Verify the backend outputs stream-json format
-codex e --json "test task" # Should output newline-delimited JSON
-claude --output-format stream-json -p "test" # Should output stream JSON
-
-# If not, your backend CLI version may be too old or incompatible
-```
-
-**Infinite recursion with Claude backend:**
-```bash
-# The wrapper prevents this with `--setting-sources ""` flag
-# If you still see recursion, ensure your Claude CLI supports this flag
-claude --help | grep "setting-sources"
-
-# If flag is not supported, upgrade Claude CLI
-```
-
-**Session resume failures:**
-```bash
-# Check if session ID is valid
-codex history # List recent sessions
-claude history
-
-# Ensure backend CLI supports session resumption
-codex resume <session_id> "test" # Should continue from previous session
-claude -r <session_id> "test"
-
-# If not supported, use new sessions instead of resume mode
-```
-
----
 ## Documentation

View File

@@ -152,39 +152,15 @@ python3 install.py --force
 ```
 ~/.claude/
-├── bin/
-│   └── codeagent-wrapper       # Main executable
-├── CLAUDE.md                   # Core instructions and role definition
-├── commands/                   # Slash commands (/dev, /code, etc.)
-├── agents/                     # Agent definitions
+├── CLAUDE.md                   # Core instructions and role definition
+├── commands/                   # Slash commands (/dev, /code, etc.)
+├── agents/                     # Agent definitions
 ├── skills/
 │   └── codex/
 │       └── SKILL.md            # Codex integration skill
-├── config.json                 # Configuration file
-└── installed_modules.json      # Installation status
+└── installed_modules.json      # Installation status
 ```
-
-### Customizing the Installation Directory
-
-By default, myclaude installs to `~/.claude`. You can customize the installation directory with the `INSTALL_DIR` environment variable:
-
-```bash
-# Install to a custom directory
-INSTALL_DIR=/opt/myclaude bash install.sh
-
-# Update your PATH accordingly
-export PATH="/opt/myclaude/bin:$PATH"
-```
-
-**Directory structure:**
-- `$INSTALL_DIR/bin/` - codeagent-wrapper executable
-- `$INSTALL_DIR/skills/` - Skill definitions
-- `$INSTALL_DIR/config.json` - Configuration file
-- `$INSTALL_DIR/commands/` - Slash command definitions
-- `$INSTALL_DIR/agents/` - Agent definitions
-
-**Note:** When using a custom installation directory, make sure `$INSTALL_DIR/bin` is added to your `PATH` environment variable.
-
 ### Configuration
 
 Edit `config.json` to customize:
@@ -308,7 +284,7 @@ setx PATH "%USERPROFILE%\bin;%PATH%"
 **Codex wrapper not found:**
 ```bash
 # Check PATH
-echo $PATH | grep -q "$HOME/.claude/bin" || echo 'export PATH="$HOME/.claude/bin:$PATH"' >> ~/.zshrc
+echo $PATH | grep -q "$HOME/bin" || echo 'export PATH="$HOME/bin:$PATH"' >> ~/.zshrc
 
 # Reinstall
 bash install.sh

View File

@@ -1,11 +1,5 @@
 package main
-
-import (
-	"encoding/json"
-	"os"
-	"path/filepath"
-)
 
 // Backend defines the contract for invoking different AI CLI backends.
 // Each backend is responsible for supplying the executable command and
 // building the argument list based on the wrapper config.
@@ -32,62 +26,15 @@ func (ClaudeBackend) Command() string {
 	return "claude"
 }
 
 func (ClaudeBackend) BuildArgs(cfg *Config, targetArg string) []string {
-	return buildClaudeArgs(cfg, targetArg)
-}
-
-const maxClaudeSettingsBytes = 1 << 20 // 1MB
-
-// loadMinimalEnvSettings extracts only the env configuration from ~/.claude/settings.json.
-// Only string values are accepted; a missing file, parse failure, or oversized file returns nil.
-func loadMinimalEnvSettings() map[string]string {
-	home, err := os.UserHomeDir()
-	if err != nil || home == "" {
-		return nil
-	}
-	settingPath := filepath.Join(home, ".claude", "settings.json")
-	info, err := os.Stat(settingPath)
-	if err != nil || info.Size() > maxClaudeSettingsBytes {
-		return nil
-	}
-	data, err := os.ReadFile(settingPath)
-	if err != nil {
-		return nil
-	}
-	var cfg struct {
-		Env map[string]any `json:"env"`
-	}
-	if err := json.Unmarshal(data, &cfg); err != nil {
-		return nil
-	}
-	if len(cfg.Env) == 0 {
-		return nil
-	}
-	env := make(map[string]string, len(cfg.Env))
-	for k, v := range cfg.Env {
-		s, ok := v.(string)
-		if !ok {
-			continue
-		}
-		env[k] = s
-	}
-	if len(env) == 0 {
-		return nil
-	}
-	return env
-}
-
-func buildClaudeArgs(cfg *Config, targetArg string) []string {
 	if cfg == nil {
 		return nil
 	}
-	args := []string{"-p"}
-	if cfg.SkipPermissions {
-		args = append(args, "--dangerously-skip-permissions")
-	}
+	args := []string{"-p", "--dangerously-skip-permissions"}
+	// Only skip permissions when explicitly requested
+	// if cfg.SkipPermissions {
+	// 	args = append(args, "--dangerously-skip-permissions")
+	// }
 
 	// Prevent infinite recursion: disable all setting sources (user, project, local)
 	// This ensures a clean execution environment without CLAUDE.md or skills that would trigger codeagent
@@ -113,10 +60,6 @@ func (GeminiBackend) Command() string {
 	return "gemini"
 }
 
 func (GeminiBackend) BuildArgs(cfg *Config, targetArg string) []string {
-	return buildGeminiArgs(cfg, targetArg)
-}
-
-func buildGeminiArgs(cfg *Config, targetArg string) []string {
 	if cfg == nil {
 		return nil
 	}

View File

@@ -1,9 +1,6 @@
 package main
 
 import (
-	"bytes"
-	"os"
-	"path/filepath"
 	"reflect"
 	"testing"
 )
@@ -11,16 +8,16 @@ import (
 
 func TestClaudeBuildArgs_ModesAndPermissions(t *testing.T) {
 	backend := ClaudeBackend{}
 
-	t.Run("new mode omits skip-permissions by default", func(t *testing.T) {
+	t.Run("new mode uses workdir without skip by default", func(t *testing.T) {
 		cfg := &Config{Mode: "new", WorkDir: "/repo"}
 		got := backend.BuildArgs(cfg, "todo")
-		want := []string{"-p", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "todo"}
+		want := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "todo"}
 		if !reflect.DeepEqual(got, want) {
 			t.Fatalf("got %v, want %v", got, want)
 		}
 	})
 
-	t.Run("new mode can opt-in skip-permissions", func(t *testing.T) {
+	t.Run("new mode opt-in skip permissions with default workdir", func(t *testing.T) {
 		cfg := &Config{Mode: "new", SkipPermissions: true}
 		got := backend.BuildArgs(cfg, "-")
 		want := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "-"}
@@ -29,10 +26,10 @@ func TestClaudeBuildArgs_ModesAndPermissions(t *testing.T) {
 		}
 	})
 
-	t.Run("resume mode includes session id", func(t *testing.T) {
+	t.Run("resume mode uses session id and omits workdir", func(t *testing.T) {
 		cfg := &Config{Mode: "resume", SessionID: "sid-123", WorkDir: "/ignored"}
 		got := backend.BuildArgs(cfg, "resume-task")
-		want := []string{"-p", "--setting-sources", "", "-r", "sid-123", "--output-format", "stream-json", "--verbose", "resume-task"}
+		want := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "-r", "sid-123", "--output-format", "stream-json", "--verbose", "resume-task"}
 		if !reflect.DeepEqual(got, want) {
 			t.Fatalf("got %v, want %v", got, want)
 		}
@@ -41,16 +38,7 @@ func TestClaudeBuildArgs_ModesAndPermissions(t *testing.T) {
 	t.Run("resume mode without session still returns base flags", func(t *testing.T) {
 		cfg := &Config{Mode: "resume", WorkDir: "/ignored"}
 		got := backend.BuildArgs(cfg, "follow-up")
-		want := []string{"-p", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "follow-up"}
-		if !reflect.DeepEqual(got, want) {
-			t.Fatalf("got %v, want %v", got, want)
-		}
-	})
-
-	t.Run("resume mode can opt-in skip permissions", func(t *testing.T) {
-		cfg := &Config{Mode: "resume", SessionID: "sid-123", SkipPermissions: true}
-		got := backend.BuildArgs(cfg, "resume-task")
-		want := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "-r", "sid-123", "--output-format", "stream-json", "--verbose", "resume-task"}
+		want := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "follow-up"}
 		if !reflect.DeepEqual(got, want) {
 			t.Fatalf("got %v, want %v", got, want)
 		}
@@ -101,11 +89,7 @@ func TestClaudeBuildArgs_GeminiAndCodexModes(t *testing.T) {
 		}
 	})
 
-	t.Run("codex build args omits bypass flag by default", func(t *testing.T) {
-		const key = "CODEX_BYPASS_SANDBOX"
-		t.Cleanup(func() { os.Unsetenv(key) })
-		os.Unsetenv(key)
+	t.Run("codex build args passthrough remains intact", func(t *testing.T) {
 		backend := CodexBackend{}
 		cfg := &Config{Mode: "new", WorkDir: "/tmp"}
 		got := backend.BuildArgs(cfg, "task")
@@ -114,20 +98,6 @@ func TestClaudeBuildArgs_GeminiAndCodexModes(t *testing.T) {
 			t.Fatalf("got %v, want %v", got, want)
 		}
 	})
-
-	t.Run("codex build args includes bypass flag when enabled", func(t *testing.T) {
-		const key = "CODEX_BYPASS_SANDBOX"
-		t.Cleanup(func() { os.Unsetenv(key) })
-		os.Setenv(key, "true")
-		backend := CodexBackend{}
-		cfg := &Config{Mode: "new", WorkDir: "/tmp"}
-		got := backend.BuildArgs(cfg, "task")
-		want := []string{"e", "--dangerously-bypass-approvals-and-sandbox", "--skip-git-repo-check", "-C", "/tmp", "--json", "task"}
-		if !reflect.DeepEqual(got, want) {
-			t.Fatalf("got %v, want %v", got, want)
-		}
-	})
 }
 
 func TestClaudeBuildArgs_BackendMetadata(t *testing.T) {
@@ -150,64 +120,3 @@ func TestClaudeBuildArgs_BackendMetadata(t *testing.T) {
 		}
 	}
 }
-
-func TestLoadMinimalEnvSettings(t *testing.T) {
-	home := t.TempDir()
-	t.Setenv("HOME", home)
-	t.Setenv("USERPROFILE", home)
-
-	t.Run("missing file returns empty", func(t *testing.T) {
-		if got := loadMinimalEnvSettings(); len(got) != 0 {
-			t.Fatalf("got %v, want empty", got)
-		}
-	})
-
-	t.Run("valid env returns string map", func(t *testing.T) {
-		dir := filepath.Join(home, ".claude")
-		if err := os.MkdirAll(dir, 0o755); err != nil {
-			t.Fatalf("MkdirAll: %v", err)
-		}
-		path := filepath.Join(dir, "settings.json")
-		data := []byte(`{"env":{"ANTHROPIC_API_KEY":"secret","FOO":"bar"}}`)
-		if err := os.WriteFile(path, data, 0o600); err != nil {
-			t.Fatalf("WriteFile: %v", err)
-		}
-		got := loadMinimalEnvSettings()
-		if got["ANTHROPIC_API_KEY"] != "secret" || got["FOO"] != "bar" {
-			t.Fatalf("got %v, want keys present", got)
-		}
-	})
-
-	t.Run("non-string values are ignored", func(t *testing.T) {
-		dir := filepath.Join(home, ".claude")
-		path := filepath.Join(dir, "settings.json")
-		data := []byte(`{"env":{"GOOD":"ok","BAD":123,"ALSO_BAD":true}}`)
-		if err := os.WriteFile(path, data, 0o600); err != nil {
-			t.Fatalf("WriteFile: %v", err)
-		}
-		got := loadMinimalEnvSettings()
-		if got["GOOD"] != "ok" {
-			t.Fatalf("got %v, want GOOD=ok", got)
-		}
-		if _, ok := got["BAD"]; ok {
-			t.Fatalf("got %v, want BAD omitted", got)
-		}
-		if _, ok := got["ALSO_BAD"]; ok {
-			t.Fatalf("got %v, want ALSO_BAD omitted", got)
-		}
-	})
-
-	t.Run("oversized file returns empty", func(t *testing.T) {
-		dir := filepath.Join(home, ".claude")
-		path := filepath.Join(dir, "settings.json")
-		data := bytes.Repeat([]byte("a"), maxClaudeSettingsBytes+1)
-		if err := os.WriteFile(path, data, 0o600); err != nil {
-			t.Fatalf("WriteFile: %v", err)
-		}
-		if got := loadMinimalEnvSettings(); len(got) != 0 {
-			t.Fatalf("got %v, want empty", got)
-		}
-	})
-}

View File

@@ -13,16 +13,6 @@ import (
 	"time"
 )
 
-func stripTimestampPrefix(line string) string {
-	if !strings.HasPrefix(line, "[") {
-		return line
-	}
-	if idx := strings.Index(line, "] "); idx >= 0 {
-		return line[idx+2:]
-	}
-	return line
-}
-
 // TestConcurrentStressLogger is a high-concurrency stress test.
 func TestConcurrentStressLogger(t *testing.T) {
 	if testing.Short() {
@@ -89,8 +79,7 @@ func TestConcurrentStressLogger(t *testing.T) {
 	// Verify log format (plain text, no prefix)
 	formatRE := regexp.MustCompile(`^goroutine-\d+-msg-\d+$`)
 	for i, line := range lines[:min(10, len(lines))] {
-		msg := stripTimestampPrefix(line)
-		if !formatRE.MatchString(msg) {
+		if !formatRE.MatchString(line) {
 			t.Errorf("line %d has invalid format: %s", i, line)
 		}
 	}
@@ -302,7 +291,7 @@ func TestLoggerOrderPreservation(t *testing.T) {
 	sequences := make(map[int][]int) // goroutine ID -> sequence numbers
 	for scanner.Scan() {
-		line := stripTimestampPrefix(scanner.Text())
+		line := scanner.Text()
 		var gid, seq int
 		// Parse format: G0-SEQ0001 (without INFO: prefix)
 		_, err := fmt.Sscanf(line, "G%d-SEQ%04d", &gid, &seq)

View File

@@ -164,9 +164,6 @@ func parseParallelConfig(data []byte) (*ParallelConfig, error) {
 		if content == "" {
 			return nil, fmt.Errorf("task block #%d (%q) missing content", taskIndex, task.ID)
 		}
-		if task.Mode == "resume" && strings.TrimSpace(task.SessionID) == "" {
-			return nil, fmt.Errorf("task block #%d (%q) has empty session_id", taskIndex, task.ID)
-		}
 		if _, exists := seen[task.ID]; exists {
 			return nil, fmt.Errorf("task block #%d has duplicate id: %s", taskIndex, task.ID)
 		}
@@ -235,10 +232,7 @@ func parseArgs() (*Config, error) {
 			return nil, fmt.Errorf("resume mode requires: resume <session_id> <task>")
 		}
 		cfg.Mode = "resume"
-		cfg.SessionID = strings.TrimSpace(args[1])
-		if cfg.SessionID == "" {
-			return nil, fmt.Errorf("resume mode requires non-empty session_id")
-		}
+		cfg.SessionID = args[1]
 		cfg.Task = args[2]
 		cfg.ExplicitStdin = (args[2] == "-")
 		if len(args) > 3 {

View File

@@ -16,8 +16,6 @@ import (
 	"time"
 )
 
-const postMessageTerminateDelay = 1 * time.Second
-
 // commandRunner abstracts exec.Cmd for testability
 type commandRunner interface {
 	Start() error
@@ -26,7 +24,6 @@ type commandRunner interface {
 	StdinPipe() (io.WriteCloser, error)
 	SetStderr(io.Writer)
 	SetDir(string)
-	SetEnv(env map[string]string)
 	Process() processHandle
 }
@@ -82,52 +79,6 @@ func (r *realCmd) SetDir(dir string) {
 	}
 }
 
-func (r *realCmd) SetEnv(env map[string]string) {
-	if r == nil || r.cmd == nil || len(env) == 0 {
-		return
-	}
-	merged := make(map[string]string, len(env)+len(os.Environ()))
-	for _, kv := range os.Environ() {
-		if kv == "" {
-			continue
-		}
-		idx := strings.IndexByte(kv, '=')
-		if idx <= 0 {
-			continue
-		}
-		merged[kv[:idx]] = kv[idx+1:]
-	}
-	for _, kv := range r.cmd.Env {
-		if kv == "" {
-			continue
-		}
-		idx := strings.IndexByte(kv, '=')
-		if idx <= 0 {
-			continue
-		}
-		merged[kv[:idx]] = kv[idx+1:]
-	}
-	for k, v := range env {
-		if strings.TrimSpace(k) == "" {
-			continue
-		}
-		merged[k] = v
-	}
-	keys := make([]string, 0, len(merged))
-	for k := range merged {
-		keys = append(keys, k)
-	}
-	sort.Strings(keys)
-	out := make([]string, 0, len(keys))
-	for _, k := range keys {
-		out = append(out, k+"="+merged[k])
-	}
-	r.cmd.Env = out
-}
-
 func (r *realCmd) Process() processHandle {
 	if r == nil || r.cmd == nil || r.cmd.Process == nil {
 		return nil
@@ -556,43 +507,23 @@ func generateFinalOutput(results []TaskResult) string {
 }
 
 func buildCodexArgs(cfg *Config, targetArg string) []string {
-	if cfg == nil {
-		panic("buildCodexArgs: nil config")
-	}
-
-	var resumeSessionID string
-	isResume := cfg.Mode == "resume"
-	if isResume {
-		resumeSessionID = strings.TrimSpace(cfg.SessionID)
-		if resumeSessionID == "" {
-			logError("invalid config: resume mode requires non-empty session_id")
-			isResume = false
-		}
-	}
-
-	args := []string{"e"}
-	if envFlagEnabled("CODEX_BYPASS_SANDBOX") {
-		logWarn("CODEX_BYPASS_SANDBOX=true: running without approval/sandbox protection")
-		args = append(args, "--dangerously-bypass-approvals-and-sandbox")
-	}
-	args = append(args, "--skip-git-repo-check")
-
-	if isResume {
-		return append(args,
-			"--json",
-			"resume",
-			resumeSessionID,
-			targetArg,
-		)
-	}
-
-	return append(args,
-		"-C", cfg.WorkDir,
-		"--json",
-		targetArg,
-	)
+	if cfg.Mode == "resume" {
+		return []string{
+			"e",
+			"--skip-git-repo-check",
+			"--json",
+			"resume",
+			cfg.SessionID,
+			targetArg,
+		}
+	}
+	return []string{
+		"e",
+		"--skip-git-repo-check",
+		"-C", cfg.WorkDir,
+		"--json",
+		targetArg,
+	}
 }
 
 func runCodexTask(taskSpec TaskSpec, silent bool, timeoutSec int) TaskResult {
@@ -643,12 +574,6 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 		cfg.WorkDir = defaultWorkdir
 	}
 
-	if cfg.Mode == "resume" && strings.TrimSpace(cfg.SessionID) == "" {
-		result.ExitCode = 1
-		result.Error = "resume mode requires non-empty session_id"
-		return result
-	}
-
 	useStdin := taskSpec.UseStdin
 	targetArg := taskSpec.Task
 	if useStdin {
@@ -748,12 +673,6 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 	cmd := newCommandRunner(ctx, commandName, codexArgs...)
 
-	if cfg.Backend == "claude" {
-		if env := loadMinimalEnvSettings(); len(env) > 0 {
-			cmd.SetEnv(env)
-		}
-	}
-
 	// For backends that don't support -C flag (claude, gemini), set working directory via cmd.Dir
 	// Codex passes workdir via -C flag, so we skip setting Dir for it to avoid conflicts
 	if cfg.Mode != "resume" && commandName != "codex" && cfg.WorkDir != "" {
@@ -810,7 +729,6 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 	// Start parse goroutine BEFORE starting the command to avoid race condition
 	// where fast-completing commands close stdout before parser starts reading
 	messageSeen := make(chan struct{}, 1)
-	completeSeen := make(chan struct{}, 1)
 	parseCh := make(chan parseResult, 1)
 	go func() {
 		msg, tid := parseJSONStreamInternal(stdoutReader, logWarnFn, logInfoFn, func() {
@@ -818,16 +736,7 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 			select {
 			case messageSeen <- struct{}{}:
 			default:
 			}
-		}, func() {
-			select {
-			case completeSeen <- struct{}{}:
-			default:
-			}
-		})
-		select {
-		case completeSeen <- struct{}{}:
-		default:
-		}
+		})
 		parseCh <- parseResult{message: msg, threadID: tid}
 	}()
@@ -864,63 +773,17 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 	waitCh := make(chan error, 1)
 	go func() { waitCh <- cmd.Wait() }()
 
-	var (
-		waitErr              error
-		forceKillTimer       *forceKillTimer
-		ctxCancelled         bool
-		messageTimer         *time.Timer
-		messageTimerCh       <-chan time.Time
-		forcedAfterComplete  bool
-		terminated           bool
-		messageSeenObserved  bool
-		completeSeenObserved bool
-	)
-
-waitLoop:
-	for {
-		select {
-		case waitErr = <-waitCh:
-			break waitLoop
-		case <-ctx.Done():
-			ctxCancelled = true
-			logErrorFn(cancelReason(commandName, ctx))
-			if !terminated {
-				if timer := terminateCommandFn(cmd); timer != nil {
-					forceKillTimer = timer
-					terminated = true
-				}
-			}
-			waitErr = <-waitCh
-			break waitLoop
-		case <-messageTimerCh:
-			forcedAfterComplete = true
-			messageTimerCh = nil
-			if !terminated {
-				logWarnFn(fmt.Sprintf("%s output parsed; terminating lingering backend", commandName))
-				if timer := terminateCommandFn(cmd); timer != nil {
-					forceKillTimer = timer
-					terminated = true
-				}
-			}
-		case <-completeSeen:
-			completeSeenObserved = true
-			if messageTimer != nil {
-				continue
-			}
-			messageTimer = time.NewTimer(postMessageTerminateDelay)
-			messageTimerCh = messageTimer.C
-		case <-messageSeen:
-			messageSeenObserved = true
-		}
-	}
-
-	if messageTimer != nil {
-		if !messageTimer.Stop() {
-			select {
-			case <-messageTimer.C:
-			default:
-			}
-		}
-	}
+	var waitErr error
+	var forceKillTimer *forceKillTimer
+	var ctxCancelled bool
+
+	select {
+	case waitErr = <-waitCh:
+	case <-ctx.Done():
+		ctxCancelled = true
+		logErrorFn(cancelReason(commandName, ctx))
+		forceKillTimer = terminateCommandFn(cmd)
+		waitErr = <-waitCh
+	}
 
 	if forceKillTimer != nil {
@@ -928,14 +791,10 @@ waitLoop:
 	}
 
 	var parsed parseResult
-	switch {
-	case ctxCancelled:
+	if ctxCancelled {
 		closeWithReason(stdout, stdoutCloseReasonCtx)
 		parsed = <-parseCh
-	case messageSeenObserved || completeSeenObserved:
-		closeWithReason(stdout, stdoutCloseReasonWait)
-		parsed = <-parseCh
-	default:
+	} else {
 		drainTimer := time.NewTimer(stdoutDrainTimeout)
 		defer drainTimer.Stop()
@@ -943,11 +802,6 @@ waitLoop:
 		select {
 		case parsed = <-parseCh:
 			closeWithReason(stdout, stdoutCloseReasonWait)
 		case <-messageSeen:
-			messageSeenObserved = true
-			closeWithReason(stdout, stdoutCloseReasonWait)
-			parsed = <-parseCh
-		case <-completeSeen:
-			completeSeenObserved = true
 			closeWithReason(stdout, stdoutCloseReasonWait)
 			parsed = <-parseCh
 		case <-drainTimer.C:
@@ -968,21 +822,17 @@ waitLoop:
 	if waitErr != nil {
-		if forcedAfterComplete && parsed.message != "" {
-			logWarnFn(fmt.Sprintf("%s terminated after delivering output", commandName))
-		} else {
-			if exitErr, ok := waitErr.(*exec.ExitError); ok {
-				code := exitErr.ExitCode()
-				logErrorFn(fmt.Sprintf("%s exited with status %d", commandName, code))
-				result.ExitCode = code
-				result.Error = attachStderr(fmt.Sprintf("%s exited with status %d", commandName, code))
-				return result
-			}
-			logErrorFn(commandName + " error: " + waitErr.Error())
-			result.ExitCode = 1
-			result.Error = attachStderr(commandName + " error: " + waitErr.Error())
+		if exitErr, ok := waitErr.(*exec.ExitError); ok {
+			code := exitErr.ExitCode()
+			logErrorFn(fmt.Sprintf("%s exited with status %d", commandName, code))
+			result.ExitCode = code
+			result.Error = attachStderr(fmt.Sprintf("%s exited with status %d", commandName, code))
 			return result
 		}
+		logErrorFn(commandName + " error: " + waitErr.Error())
+		result.ExitCode = 1
+		result.Error = attachStderr(commandName + " error: " + waitErr.Error())
+		return result
 	}
 
 	message := parsed.message

View File

@@ -10,7 +10,6 @@ import (
 	"os"
 	"os/exec"
 	"path/filepath"
-	"slices"
 	"strings"
 	"sync"
 	"sync/atomic"
@@ -87,7 +86,6 @@ type execFakeRunner struct {
 	process   processHandle
 	stdin     io.WriteCloser
 	dir       string
-	env       map[string]string
 	waitErr   error
 	waitDelay time.Duration
 	startErr  error
@@ -130,17 +128,6 @@ func (f *execFakeRunner) StdinPipe() (io.WriteCloser, error) {
 }
 
 func (f *execFakeRunner) SetStderr(io.Writer) {}
 func (f *execFakeRunner) SetDir(dir string)   { f.dir = dir }
-func (f *execFakeRunner) SetEnv(env map[string]string) {
-	if len(env) == 0 {
-		return
-	}
-	if f.env == nil {
-		f.env = make(map[string]string, len(env))
-	}
-	for k, v := range env {
-		f.env[k] = v
-	}
-}
 
 func (f *execFakeRunner) Process() processHandle {
 	if f.process != nil {
 		return f.process
@@ -257,10 +244,6 @@ func TestExecutorHelperCoverage(t *testing.T) {
}) })
t.Run("generateFinalOutputAndArgs", func(t *testing.T) { t.Run("generateFinalOutputAndArgs", func(t *testing.T) {
const key = "CODEX_BYPASS_SANDBOX"
t.Cleanup(func() { os.Unsetenv(key) })
os.Unsetenv(key)
out := generateFinalOutput([]TaskResult{ out := generateFinalOutput([]TaskResult{
{TaskID: "ok", ExitCode: 0}, {TaskID: "ok", ExitCode: 0},
{TaskID: "fail", ExitCode: 1, Error: "boom"}, {TaskID: "fail", ExitCode: 1, Error: "boom"},
@@ -274,11 +257,11 @@ func TestExecutorHelperCoverage(t *testing.T) {
} }
args := buildCodexArgs(&Config{Mode: "new", WorkDir: "/tmp"}, "task") args := buildCodexArgs(&Config{Mode: "new", WorkDir: "/tmp"}, "task")
if !slices.Equal(args, []string{"e", "--skip-git-repo-check", "-C", "/tmp", "--json", "task"}) { if len(args) == 0 || args[3] != "/tmp" {
t.Fatalf("unexpected codex args: %+v", args) t.Fatalf("unexpected codex args: %+v", args)
} }
args = buildCodexArgs(&Config{Mode: "resume", SessionID: "sess"}, "target") args = buildCodexArgs(&Config{Mode: "resume", SessionID: "sess"}, "target")
if !slices.Equal(args, []string{"e", "--skip-git-repo-check", "--json", "resume", "sess", "target"}) { if args[3] != "resume" || args[4] != "sess" {
t.Fatalf("unexpected resume args: %+v", args) t.Fatalf("unexpected resume args: %+v", args)
} }
}) })
@@ -315,18 +298,6 @@ func TestExecutorRunCodexTaskWithContext(t *testing.T) {
 	origRunner := newCommandRunner
 	defer func() { newCommandRunner = origRunner }()
-	t.Run("resumeMissingSessionID", func(t *testing.T) {
-		newCommandRunner = func(ctx context.Context, name string, args ...string) commandRunner {
-			t.Fatalf("unexpected command execution for invalid resume config")
-			return nil
-		}
-		res := runCodexTaskWithContext(context.Background(), TaskSpec{Task: "payload", WorkDir: ".", Mode: "resume"}, nil, nil, false, false, 1)
-		if res.ExitCode == 0 || !strings.Contains(res.Error, "session_id") {
-			t.Fatalf("expected validation error, got %+v", res)
-		}
-	})
 	t.Run("success", func(t *testing.T) {
 		var firstStdout *reasonReadCloser
 		newCommandRunner = func(ctx context.Context, name string, args ...string) commandRunner {

View File

@@ -366,8 +366,7 @@ func (l *Logger) run() {
 	defer ticker.Stop()
 	writeEntry := func(entry logEntry) {
-		timestamp := time.Now().Format("2006-01-02 15:04:05.000")
-		fmt.Fprintf(l.writer, "[%s] %s\n", timestamp, entry.msg)
+		fmt.Fprintf(l.writer, "%s\n", entry.msg)
 		// Cache error/warn entries in memory for fast extraction
 		if entry.isError {

View File

@@ -14,9 +14,9 @@ import (
 )
 const (
-	version            = "5.2.8"
+	version            = "5.2.5"
 	defaultWorkdir     = "."
-	defaultTimeout     = 7200 // seconds (2 hours)
+	defaultTimeout     = 7200 // seconds
 	codexLogLineLimit  = 1000
 	stdinSpecialChars  = "\n\\\"'`$"
 	stderrCaptureLimit = 4 * 1024

View File

@@ -255,10 +255,6 @@ func (d *drainBlockingCmd) SetDir(dir string) {
 	d.inner.SetDir(dir)
 }
-func (d *drainBlockingCmd) SetEnv(env map[string]string) {
-	d.inner.SetEnv(env)
-}
 func (d *drainBlockingCmd) Process() processHandle {
 	return d.inner.Process()
 }
@@ -391,8 +387,6 @@ type fakeCmd struct {
 	stderr    io.Writer
-	env       map[string]string
 	waitDelay time.Duration
 	waitErr   error
 	startErr  error
@@ -517,20 +511,6 @@ func (f *fakeCmd) SetStderr(w io.Writer) {
 func (f *fakeCmd) SetDir(string) {}
-func (f *fakeCmd) SetEnv(env map[string]string) {
-	if len(env) == 0 {
-		return
-	}
-	f.mu.Lock()
-	defer f.mu.Unlock()
-	if f.env == nil {
-		f.env = make(map[string]string, len(env))
-	}
-	for k, v := range env {
-		f.env[k] = v
-	}
-}
 func (f *fakeCmd) Process() processHandle {
 	if f == nil {
 		return nil
@@ -899,79 +879,6 @@ func TestRunCodexTask_ContextTimeout(t *testing.T) {
 	}
 }
-func TestRunCodexTask_ForcesStopAfterCompletion(t *testing.T) {
-	defer resetTestHooks()
-	forceKillDelay.Store(0)
-	fake := newFakeCmd(fakeCmdConfig{
-		StdoutPlan: []fakeStdoutEvent{
-			{Data: `{"type":"item.completed","item":{"type":"agent_message","text":"done"}}` + "\n"},
-			{Data: `{"type":"thread.completed","thread_id":"tid"}` + "\n"},
-		},
-		KeepStdoutOpen:      true,
-		BlockWait:           true,
-		ReleaseWaitOnSignal: true,
-		ReleaseWaitOnKill:   true,
-	})
-	newCommandRunner = func(ctx context.Context, name string, args ...string) commandRunner {
-		return fake
-	}
-	buildCodexArgsFn = func(cfg *Config, targetArg string) []string { return []string{targetArg} }
-	codexCommand = "fake-cmd"
-	start := time.Now()
-	result := runCodexTaskWithContext(context.Background(), TaskSpec{Task: "done", WorkDir: defaultWorkdir}, nil, nil, false, false, 60)
-	duration := time.Since(start)
-	if result.ExitCode != 0 || result.Message != "done" {
-		t.Fatalf("unexpected result: %+v", result)
-	}
-	if duration > 2*time.Second {
-		t.Fatalf("runCodexTaskWithContext took too long: %v", duration)
-	}
-	if fake.process.SignalCount() == 0 {
-		t.Fatalf("expected SIGTERM to be sent, got %d", fake.process.SignalCount())
-	}
-}
-func TestRunCodexTask_DoesNotTerminateBeforeThreadCompleted(t *testing.T) {
-	defer resetTestHooks()
-	forceKillDelay.Store(0)
-	fake := newFakeCmd(fakeCmdConfig{
-		StdoutPlan: []fakeStdoutEvent{
-			{Data: `{"type":"item.completed","item":{"type":"agent_message","text":"intermediate"}}` + "\n"},
-			{Delay: 1100 * time.Millisecond, Data: `{"type":"item.completed","item":{"type":"agent_message","text":"final"}}` + "\n"},
-			{Data: `{"type":"thread.completed","thread_id":"tid"}` + "\n"},
-		},
-		KeepStdoutOpen:      true,
-		BlockWait:           true,
-		ReleaseWaitOnSignal: true,
-		ReleaseWaitOnKill:   true,
-	})
-	newCommandRunner = func(ctx context.Context, name string, args ...string) commandRunner {
-		return fake
-	}
-	buildCodexArgsFn = func(cfg *Config, targetArg string) []string { return []string{targetArg} }
-	codexCommand = "fake-cmd"
-	start := time.Now()
-	result := runCodexTaskWithContext(context.Background(), TaskSpec{Task: "done", WorkDir: defaultWorkdir}, nil, nil, false, false, 60)
-	duration := time.Since(start)
-	if result.ExitCode != 0 || result.Message != "final" {
-		t.Fatalf("unexpected result: %+v", result)
-	}
-	if duration > 5*time.Second {
-		t.Fatalf("runCodexTaskWithContext took too long: %v", duration)
-	}
-	if fake.process.SignalCount() == 0 {
-		t.Fatalf("expected SIGTERM to be sent, got %d", fake.process.SignalCount())
-	}
-}
 func TestBackendParseArgs_NewMode(t *testing.T) {
 	tests := []struct {
 		name string
@@ -1058,8 +965,6 @@ func TestBackendParseArgs_ResumeMode(t *testing.T) {
 		},
 		{name: "resume missing session_id", args: []string{"codeagent-wrapper", "resume"}, wantErr: true},
 		{name: "resume missing task", args: []string{"codeagent-wrapper", "resume", "session-123"}, wantErr: true},
-		{name: "resume empty session_id", args: []string{"codeagent-wrapper", "resume", "", "task"}, wantErr: true},
-		{name: "resume whitespace session_id", args: []string{"codeagent-wrapper", "resume", " ", "task"}, wantErr: true},
 	}
 	for _, tt := range tests {
@@ -1276,18 +1181,6 @@ do something`
 	}
 }
-func TestParallelParseConfig_EmptySessionID(t *testing.T) {
-	input := `---TASK---
-id: task-1
-session_id:
----CONTENT---
-do something`
-	if _, err := parseParallelConfig([]byte(input)); err == nil {
-		t.Fatalf("expected error for empty session_id, got nil")
-	}
-}
 func TestParallelParseConfig_InvalidFormat(t *testing.T) {
 	if _, err := parseParallelConfig([]byte("invalid format")); err == nil {
 		t.Fatalf("expected error for invalid format, got nil")
@@ -1388,19 +1281,9 @@ func TestRunShouldUseStdin(t *testing.T) {
 }
 func TestRunBuildCodexArgs_NewMode(t *testing.T) {
-	const key = "CODEX_BYPASS_SANDBOX"
-	t.Cleanup(func() { os.Unsetenv(key) })
-	os.Unsetenv(key)
 	cfg := &Config{Mode: "new", WorkDir: "/test/dir"}
 	args := buildCodexArgs(cfg, "my task")
-	expected := []string{
-		"e",
-		"--skip-git-repo-check",
-		"-C", "/test/dir",
-		"--json",
-		"my task",
-	}
+	expected := []string{"e", "--skip-git-repo-check", "-C", "/test/dir", "--json", "my task"}
 	if len(args) != len(expected) {
 		t.Fatalf("len mismatch")
 	}
@@ -1412,20 +1295,9 @@ func TestRunBuildCodexArgs_NewMode(t *testing.T) {
 }
 func TestRunBuildCodexArgs_ResumeMode(t *testing.T) {
-	const key = "CODEX_BYPASS_SANDBOX"
-	t.Cleanup(func() { os.Unsetenv(key) })
-	os.Unsetenv(key)
 	cfg := &Config{Mode: "resume", SessionID: "session-abc"}
 	args := buildCodexArgs(cfg, "-")
-	expected := []string{
-		"e",
-		"--skip-git-repo-check",
-		"--json",
-		"resume",
-		"session-abc",
-		"-",
-	}
+	expected := []string{"e", "--skip-git-repo-check", "--json", "resume", "session-abc", "-"}
 	if len(args) != len(expected) {
 		t.Fatalf("len mismatch")
 	}
@@ -1436,61 +1308,6 @@ func TestRunBuildCodexArgs_ResumeMode(t *testing.T) {
 	}
 }
-func TestRunBuildCodexArgs_ResumeMode_EmptySessionHandledGracefully(t *testing.T) {
-	const key = "CODEX_BYPASS_SANDBOX"
-	t.Cleanup(func() { os.Unsetenv(key) })
-	os.Unsetenv(key)
-	cfg := &Config{Mode: "resume", SessionID: " ", WorkDir: "/test/dir"}
-	args := buildCodexArgs(cfg, "task")
-	expected := []string{"e", "--skip-git-repo-check", "-C", "/test/dir", "--json", "task"}
-	if len(args) != len(expected) {
-		t.Fatalf("len mismatch")
-	}
-	for i := range args {
-		if args[i] != expected[i] {
-			t.Fatalf("args[%d]=%s, want %s", i, args[i], expected[i])
-		}
-	}
-}
-func TestRunBuildCodexArgs_BypassSandboxEnvTrue(t *testing.T) {
-	defer resetTestHooks()
-	tempDir := t.TempDir()
-	t.Setenv("TMPDIR", tempDir)
-	logger, err := NewLogger()
-	if err != nil {
-		t.Fatalf("NewLogger() error = %v", err)
-	}
-	setLogger(logger)
-	defer closeLogger()
-	t.Setenv("CODEX_BYPASS_SANDBOX", "true")
-	cfg := &Config{Mode: "new", WorkDir: "/test/dir"}
-	args := buildCodexArgs(cfg, "my task")
-	found := false
-	for _, arg := range args {
-		if arg == "--dangerously-bypass-approvals-and-sandbox" {
-			found = true
-			break
-		}
-	}
-	if !found {
-		t.Fatalf("expected bypass flag in args, got %v", args)
-	}
-	logger.Flush()
-	data, err := os.ReadFile(logger.Path())
-	if err != nil {
-		t.Fatalf("failed to read log file: %v", err)
-	}
-	if !strings.Contains(string(data), "CODEX_BYPASS_SANDBOX=true") {
-		t.Fatalf("expected bypass warning log, got: %s", string(data))
-	}
-}
 func TestBackendSelectBackend(t *testing.T) {
 	tests := []struct {
 		name string
@@ -1546,13 +1363,7 @@ func TestBackendBuildArgs_CodexBackend(t *testing.T) {
 	backend := CodexBackend{}
 	cfg := &Config{Mode: "new", WorkDir: "/test/dir"}
 	got := backend.BuildArgs(cfg, "task")
-	want := []string{
-		"e",
-		"--skip-git-repo-check",
-		"-C", "/test/dir",
-		"--json",
-		"task",
-	}
+	want := []string{"e", "--skip-git-repo-check", "-C", "/test/dir", "--json", "task"}
 	if len(got) != len(want) {
 		t.Fatalf("length mismatch")
 	}
@@ -1567,13 +1378,13 @@ func TestBackendBuildArgs_ClaudeBackend(t *testing.T) {
 	backend := ClaudeBackend{}
 	cfg := &Config{Mode: "new", WorkDir: defaultWorkdir}
 	got := backend.BuildArgs(cfg, "todo")
-	want := []string{"-p", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "todo"}
+	want := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "--output-format", "stream-json", "--verbose", "todo"}
 	if len(got) != len(want) {
-		t.Fatalf("args length=%d, want %d: %v", len(got), len(want), got)
+		t.Fatalf("length mismatch")
 	}
 	for i := range want {
 		if got[i] != want[i] {
-			t.Fatalf("index %d got %q want %q (args=%v)", i, got[i], want[i], got)
+			t.Fatalf("index %d got %s want %s", i, got[i], want[i])
 		}
 	}
@@ -1588,15 +1399,19 @@ func TestClaudeBackendBuildArgs_OutputValidation(t *testing.T) {
 	target := "ensure-flags"
 	args := backend.BuildArgs(cfg, target)
-	want := []string{"-p", "--setting-sources", "", "--output-format", "stream-json", "--verbose", target}
-	if len(args) != len(want) {
-		t.Fatalf("args length=%d, want %d: %v", len(args), len(want), args)
+	expectedPrefix := []string{"-p", "--dangerously-skip-permissions", "--setting-sources", "", "--output-format", "stream-json", "--verbose"}
+	if len(args) != len(expectedPrefix)+1 {
+		t.Fatalf("args length=%d, want %d", len(args), len(expectedPrefix)+1)
 	}
-	for i := range want {
-		if args[i] != want[i] {
-			t.Fatalf("index %d got %q want %q (args=%v)", i, args[i], want[i], args)
+	for i, val := range expectedPrefix {
+		if args[i] != val {
+			t.Fatalf("args[%d]=%q, want %q", i, args[i], val)
 		}
 	}
+	if args[len(args)-1] != target {
+		t.Fatalf("last arg=%q, want target %q", args[len(args)-1], target)
+	}
 }
 func TestBackendBuildArgs_GeminiBackend(t *testing.T) {
@@ -1835,7 +1650,7 @@ func TestBackendParseJSONStream_GeminiEvents_OnMessageTriggeredOnStatus(t *testing.T) {
 	var called int
 	message, threadID := parseJSONStreamInternal(strings.NewReader(input), nil, nil, func() {
 		called++
-	}, nil)
+	})
 	if message != "Hi there" {
 		t.Fatalf("message=%q, want %q", message, "Hi there")
@@ -1864,7 +1679,7 @@ func TestBackendParseJSONStream_OnMessage(t *testing.T) {
 	var called int
 	message, threadID := parseJSONStreamInternal(strings.NewReader(`{"type":"item.completed","item":{"type":"agent_message","text":"hook"}}`), nil, nil, func() {
 		called++
-	}, nil)
+	})
 	if message != "hook" {
 		t.Fatalf("message = %q, want hook", message)
 	}
@@ -1876,86 +1691,10 @@ func TestBackendParseJSONStream_OnMessage(t *testing.T) {
 	}
 }
-func TestBackendParseJSONStream_OnComplete_CodexThreadCompleted(t *testing.T) {
-	input := `{"type":"item.completed","item":{"type":"agent_message","text":"first"}}` + "\n" +
-		`{"type":"item.completed","item":{"type":"agent_message","text":"second"}}` + "\n" +
-		`{"type":"thread.completed","thread_id":"t-1"}`
-	var onMessageCalls int
-	var onCompleteCalls int
-	message, threadID := parseJSONStreamInternal(strings.NewReader(input), nil, nil, func() {
-		onMessageCalls++
-	}, func() {
-		onCompleteCalls++
-	})
-	if message != "second" {
-		t.Fatalf("message = %q, want second", message)
-	}
-	if threadID != "t-1" {
-		t.Fatalf("threadID = %q, want t-1", threadID)
-	}
-	if onMessageCalls != 2 {
-		t.Fatalf("onMessage calls = %d, want 2", onMessageCalls)
-	}
-	if onCompleteCalls != 1 {
-		t.Fatalf("onComplete calls = %d, want 1", onCompleteCalls)
-	}
-}
-func TestBackendParseJSONStream_OnComplete_ClaudeResult(t *testing.T) {
-	input := `{"type":"message","subtype":"stream","session_id":"s-1"}` + "\n" +
-		`{"type":"result","result":"OK","session_id":"s-1"}`
-	var onMessageCalls int
-	var onCompleteCalls int
-	message, threadID := parseJSONStreamInternal(strings.NewReader(input), nil, nil, func() {
-		onMessageCalls++
-	}, func() {
-		onCompleteCalls++
-	})
-	if message != "OK" {
-		t.Fatalf("message = %q, want OK", message)
-	}
-	if threadID != "s-1" {
-		t.Fatalf("threadID = %q, want s-1", threadID)
-	}
-	if onMessageCalls != 1 {
-		t.Fatalf("onMessage calls = %d, want 1", onMessageCalls)
-	}
-	if onCompleteCalls != 1 {
-		t.Fatalf("onComplete calls = %d, want 1", onCompleteCalls)
-	}
-}
-func TestBackendParseJSONStream_OnComplete_GeminiTerminalResultStatus(t *testing.T) {
-	input := `{"type":"message","role":"assistant","content":"Hi","delta":true,"session_id":"g-1"}` + "\n" +
-		`{"type":"result","status":"success","session_id":"g-1"}`
-	var onMessageCalls int
-	var onCompleteCalls int
-	message, threadID := parseJSONStreamInternal(strings.NewReader(input), nil, nil, func() {
-		onMessageCalls++
-	}, func() {
-		onCompleteCalls++
-	})
-	if message != "Hi" {
-		t.Fatalf("message = %q, want Hi", message)
-	}
-	if threadID != "g-1" {
-		t.Fatalf("threadID = %q, want g-1", threadID)
-	}
-	if onMessageCalls != 1 {
-		t.Fatalf("onMessage calls = %d, want 1", onMessageCalls)
-	}
-	if onCompleteCalls != 1 {
-		t.Fatalf("onComplete calls = %d, want 1", onCompleteCalls)
-	}
-}
 func TestBackendParseJSONStream_ScannerError(t *testing.T) {
 	var warnings []string
 	warnFn := func(msg string) { warnings = append(warnings, msg) }
-	message, threadID := parseJSONStreamInternal(errReader{err: errors.New("scan-fail")}, warnFn, nil, nil, nil)
+	message, threadID := parseJSONStreamInternal(errReader{err: errors.New("scan-fail")}, warnFn, nil, nil)
 	if message != "" || threadID != "" {
 		t.Fatalf("expected empty output on scanner error, got message=%q threadID=%q", message, threadID)
 	}
@@ -3017,7 +2756,7 @@ func TestVersionFlag(t *testing.T) {
 			t.Errorf("exit = %d, want 0", code)
 		}
 	})
-	want := "codeagent-wrapper version 5.2.8\n"
+	want := "codeagent-wrapper version 5.2.5\n"
 	if output != want {
 		t.Fatalf("output = %q, want %q", output, want)
 	}
@@ -3031,7 +2770,7 @@ func TestVersionShortFlag(t *testing.T) {
 			t.Errorf("exit = %d, want 0", code)
 		}
 	})
-	want := "codeagent-wrapper version 5.2.8\n"
+	want := "codeagent-wrapper version 5.2.5\n"
 	if output != want {
 		t.Fatalf("output = %q, want %q", output, want)
 	}
@@ -3045,7 +2784,7 @@ func TestVersionLegacyAlias(t *testing.T) {
 			t.Errorf("exit = %d, want 0", code)
 		}
 	})
-	want := "codex-wrapper version 5.2.8\n"
+	want := "codex-wrapper version 5.2.5\n"
 	if output != want {
 		t.Fatalf("output = %q, want %q", output, want)
 	}

View File

@@ -50,7 +50,7 @@ func parseJSONStreamWithWarn(r io.Reader, warnFn func(string)) (message, threadID string) {
 }
 func parseJSONStreamWithLog(r io.Reader, warnFn func(string), infoFn func(string)) (message, threadID string) {
-	return parseJSONStreamInternal(r, warnFn, infoFn, nil, nil)
+	return parseJSONStreamInternal(r, warnFn, infoFn, nil)
 }
 const (
@@ -95,7 +95,7 @@ type ItemContent struct {
 	Text interface{} `json:"text"`
 }
-func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(string), onMessage func(), onComplete func()) (message, threadID string) {
+func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(string), onMessage func()) (message, threadID string) {
 	reader := bufio.NewReaderSize(r, jsonLineReaderSize)
 	if warnFn == nil {
@@ -111,12 +111,6 @@ func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(string), ...
 		}
 	}
-	notifyComplete := func() {
-		if onComplete != nil {
-			onComplete()
-		}
-	}
 	totalEvents := 0
 	var (
@@ -164,9 +158,6 @@ func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(string), ...
 			}
 		}
 		isClaude := event.Subtype != "" || event.Result != ""
-		if !isClaude && event.Type == "result" && event.SessionID != "" && event.Status == "" {
-			isClaude = true
-		}
 		isGemini := event.Role != "" || event.Delta != nil || event.Status != ""
 		// Handle Codex events
@@ -187,13 +178,6 @@ func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(string), ...
 			threadID = event.ThreadID
 			infoFn(fmt.Sprintf("thread.started event thread_id=%s", threadID))
-		case "thread.completed":
-			if event.ThreadID != "" && threadID == "" {
-				threadID = event.ThreadID
-			}
-			infoFn(fmt.Sprintf("thread.completed event thread_id=%s", event.ThreadID))
-			notifyComplete()
 		case "item.completed":
 			var itemType string
 			if len(event.Item) > 0 {
@@ -237,10 +221,6 @@ func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(string), ...
 				claudeMessage = event.Result
 				notifyMessage()
 			}
-			if event.Type == "result" {
-				notifyComplete()
-			}
 			continue
 		}
@@ -256,10 +236,6 @@ func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(string), ...
 			if event.Status != "" {
 				notifyMessage()
-				if event.Type == "result" && (event.Status == "success" || event.Status == "error" || event.Status == "complete" || event.Status == "failed") {
-					notifyComplete()
-				}
 			}
 			delta := false

View File

@@ -18,7 +18,7 @@ func TestParseJSONStream_SkipsOverlongLineAndContinues(t *testing.T) {
 	var warns []string
 	warnFn := func(msg string) { warns = append(warns, msg) }
-	gotMessage, gotThreadID := parseJSONStreamInternal(strings.NewReader(input), warnFn, nil, nil, nil)
+	gotMessage, gotThreadID := parseJSONStreamInternal(strings.NewReader(input), warnFn, nil, nil)
 	if gotMessage != "ok" {
 		t.Fatalf("message=%q, want %q (warns=%v)", gotMessage, "ok", warns)
 	}

View File

@@ -2,25 +2,9 @@
 description: Extreme lightweight end-to-end development workflow with requirements clarification, parallel codeagent execution, and mandatory 90% test coverage
 ---
 You are the /dev Workflow Orchestrator, an expert development workflow manager specializing in orchestrating minimal, efficient end-to-end development processes with parallel task execution and rigorous test coverage validation.
----
-## CRITICAL CONSTRAINTS (NEVER VIOLATE)
-These rules have HIGHEST PRIORITY and override all other instructions:
-1. **NEVER use Edit, Write, or MultiEdit tools directly** - ALL code changes MUST go through codeagent-wrapper
-2. **MUST use AskUserQuestion in Step 1** - Do NOT skip requirement clarification
-3. **MUST use TodoWrite after Step 1** - Create task tracking list before any analysis
-4. **MUST use codeagent-wrapper for Step 2 analysis** - Do NOT use Read/Glob/Grep directly for deep analysis
-5. **MUST wait for user confirmation in Step 3** - Do NOT proceed to Step 4 without explicit approval
-6. **MUST invoke codeagent-wrapper --parallel for Step 4 execution** - Use Bash tool, NOT Edit/Write or Task tool
-**Violation of any constraint above invalidates the entire workflow. Stop and restart if violated.**
----
 **Core Responsibilities**
 - Orchestrate a streamlined 6-step development workflow:
   1. Requirement clarification through targeted questioning
@@ -31,35 +15,14 @@ These rules have HIGHEST PRIORITY and override all other instructions:
   6. Completion summary
 **Workflow Execution**
-- **Step 1: Requirement Clarification [MANDATORY - DO NOT SKIP]**
-  - MUST use AskUserQuestion tool as the FIRST action - no exceptions
+- **Step 1: Requirement Clarification**
+  - Use AskUserQuestion to clarify requirements directly
   - Focus questions on functional boundaries, inputs/outputs, constraints, testing, and required unit-test coverage levels
   - Iterate 2-3 rounds until clear; rely on judgment; keep questions concise
-  - After clarification complete: MUST use TodoWrite to create task tracking list with workflow steps
-- **Step 2: codeagent-wrapper Deep Analysis (Plan Mode Style) [USE CODEAGENT-WRAPPER ONLY]**
-  MUST use Bash tool to invoke `codeagent-wrapper` for deep analysis. Do NOT use Read/Glob/Grep tools directly - delegate all exploration to codeagent-wrapper.
-  **How to invoke for analysis**:
-  ```bash
-  codeagent-wrapper --backend codex - <<'EOF'
-  Analyze the codebase for implementing [feature name].
-  Requirements:
-  - [requirement 1]
-  - [requirement 2]
-  Deliverables:
-  1. Explore codebase structure and existing patterns
-  2. Evaluate implementation options with trade-offs
-  3. Make architectural decisions
-  4. Break down into 2-5 parallelizable tasks with dependencies
-  5. Determine if UI work is needed (check for .css/.tsx/.vue files)
-  Output the analysis following the structure below.
-  EOF
-  ```
+- **Step 2: codeagent Deep Analysis (Plan Mode Style)**
+  Use codeagent Skill to perform deep analysis. codeagent should operate in "plan mode" style and must include UI detection:
 **When Deep Analysis is Needed** (any condition triggers):
 - Multiple valid approaches exist (e.g., Redis vs in-memory vs file-based caching)
@@ -71,7 +34,7 @@ These rules have HIGHEST PRIORITY and override all other instructions:
 - During analysis, output whether the task needs UI work (yes/no) and the evidence
 - UI criteria: presence of style assets (.css, .scss, styled-components, CSS modules, tailwindcss) OR frontend component files (.tsx, .jsx, .vue)
-**What the AI backend does in Analysis Mode** (when invoked via codeagent-wrapper):
+**What codeagent Does in Analysis Mode**:
 1. **Explore Codebase**: Use Glob, Grep, Read to understand structure, patterns, architecture
 2. **Identify Existing Patterns**: Find how similar features are implemented, reuse conventions
 3. **Evaluate Options**: When multiple approaches exist, list trade-offs (complexity, performance, security, maintainability)
@@ -118,39 +81,27 @@ These rules have HIGHEST PRIORITY and override all other instructions:
   - Options: "Confirm and execute" / "Need adjustments"
   - If user chooses "Need adjustments", return to Step 1 or Step 2 based on feedback
-- **Step 4: Parallel Development Execution [CODEAGENT-WRAPPER ONLY - NO DIRECT EDITS]**
-  - MUST use Bash tool to invoke `codeagent-wrapper --parallel` for ALL code changes
-  - NEVER use Edit, Write, MultiEdit, or Task tools to modify code directly
-  - Build ONE `--parallel` config that includes all tasks in `dev-plan.md` and submit it once via Bash tool:
+- **Step 4: Parallel Development Execution**
+  - For each task in `dev-plan.md`, invoke codeagent skill with task brief in HEREDOC format:
  ```bash
-  # One shot submission - wrapper handles topology + concurrency
-  codeagent-wrapper --parallel <<'EOF'
-  ---TASK---
-  id: [task-id-1]
-  backend: codex
-  workdir: .
-  dependencies: [optional, comma-separated ids]
-  ---CONTENT---
-  Task: [task-id-1]
+  # Backend task (use codex backend - default)
+  codeagent-wrapper --backend codex - <<'EOF'
+  Task: [task-id]
   Reference: @.claude/specs/{feature_name}/dev-plan.md
   Scope: [task file scope]
   Test: [test command]
   Deliverables: code + unit tests + coverage ≥90% + coverage summary
+  EOF
-  ---TASK---
-  id: [task-id-2]
-  backend: gemini
-  workdir: .
-  dependencies: [optional, comma-separated ids]
-  ---CONTENT---
-  Task: [task-id-2]
+  # UI task (use gemini backend - enforced)
+  codeagent-wrapper --backend gemini - <<'EOF'
+  Task: [task-id]
   Reference: @.claude/specs/{feature_name}/dev-plan.md
   Scope: [task file scope]
   Test: [test command]
   Deliverables: code + unit tests + coverage ≥90% + coverage summary
   EOF
  ```
-  - **Note**: Use `workdir: .` (current directory) for all tasks unless specific subdirectory is required
   - Execute independent tasks concurrently; serialize conflicting ones; track coverage reports
 - **Step 5: Coverage Validation**
@@ -162,13 +113,9 @@ These rules have HIGHEST PRIORITY and override all other instructions:
- Provide completed task list, coverage per task, key file changes - Provide completed task list, coverage per task, key file changes
**Error Handling** **Error Handling**
- **codeagent-wrapper failure**: Retry once with same input; if still fails, log error and ask user for guidance - codeagent failure: retry once, then log and continue
- **Insufficient coverage (<90%)**: Request more tests from the failed task (max 2 rounds); if still fails, report to user - Insufficient coverage: request more tests (max 2 rounds)
- **Dependency conflicts**: - Dependency conflicts: serialize automatically
- Circular dependencies: codeagent-wrapper will detect and fail with error; revise task breakdown to remove cycles
- Missing dependencies: Ensure all task IDs referenced in `dependencies` field exist
- **Parallel execution timeout**: Individual tasks timeout after 2 hours (configurable via CODEX_TIMEOUT); failed tasks can be retried individually
- **Backend unavailable**: If codex/claude/gemini CLI not found, fail immediately with clear error message
**Quality Standards** **Quality Standards**
- Code coverage ≥90% - Code coverage ≥90%
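The "retry once, then log and continue" policy can be sketched as a small shell helper. This is an assumption-level sketch of the documented behavior, not the wrapper's real implementation:

```shell
#!/usr/bin/env bash
# Run a command; on failure retry exactly once, then log to stderr and
# return failure so the caller can decide to continue with other tasks.
retry_once() {
  "$@" && return 0
  echo "retry: $*" >&2
  "$@" && return 0
  echo "ERROR: failed twice: $*" >&2
  return 1
}

retry_once true                      # first attempt succeeds
retry_once false || echo "logged, continuing with remaining tasks"
```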

View File

@@ -1,5 +0,0 @@
-go 1.21
-
-use (
-	./codeagent-wrapper
-)

View File

@@ -17,10 +17,7 @@ from datetime import datetime
 from pathlib import Path
 from typing import Any, Dict, Iterable, List, Optional

-try:
-    import jsonschema
-except ImportError:  # pragma: no cover
-    jsonschema = None
+import jsonschema

 DEFAULT_INSTALL_DIR = "~/.claude"
@@ -90,32 +87,6 @@ def load_config(path: str) -> Dict[str, Any]:
     config_path = Path(path).expanduser().resolve()
     config = _load_json(config_path)

-    if jsonschema is None:
-        print(
-            "WARNING: python package 'jsonschema' is not installed; "
-            "skipping config validation. To enable validation run:\n"
-            " python3 -m pip install jsonschema\n",
-            file=sys.stderr,
-        )
-        if not isinstance(config, dict):
-            raise ValueError(
-                f"Config must be a dict, got {type(config).__name__}. "
-                "Check your config.json syntax."
-            )
-        required_keys = ["version", "install_dir", "log_file", "modules"]
-        missing = [key for key in required_keys if key not in config]
-        if missing:
-            missing_str = ", ".join(missing)
-            raise ValueError(
-                f"Config missing required keys: {missing_str}. "
-                "Install jsonschema for better validation: "
-                "python3 -m pip install jsonschema"
-            )
-        return config

     schema_candidates = [
         config_path.parent / "config.schema.json",
         Path(__file__).resolve().with_name("config.schema.json"),

View File

@@ -34,25 +34,23 @@ if ! curl -fsSL "$URL" -o /tmp/codeagent-wrapper; then
   exit 1
 fi

-INSTALL_DIR="${INSTALL_DIR:-$HOME/.claude}"
-BIN_DIR="${INSTALL_DIR}/bin"
-mkdir -p "$BIN_DIR"
+mkdir -p "$HOME/bin"

-mv /tmp/codeagent-wrapper "${BIN_DIR}/codeagent-wrapper"
-chmod +x "${BIN_DIR}/codeagent-wrapper"
+mv /tmp/codeagent-wrapper "$HOME/bin/codeagent-wrapper"
+chmod +x "$HOME/bin/codeagent-wrapper"

-if "${BIN_DIR}/codeagent-wrapper" --version >/dev/null 2>&1; then
-  echo "codeagent-wrapper installed successfully to ${BIN_DIR}/codeagent-wrapper"
+if "$HOME/bin/codeagent-wrapper" --version >/dev/null 2>&1; then
+  echo "codeagent-wrapper installed successfully to ~/bin/codeagent-wrapper"
 else
   echo "ERROR: installation verification failed" >&2
   exit 1
 fi

-if [[ ":$PATH:" != *":${BIN_DIR}:"* ]]; then
+if [[ ":$PATH:" != *":$HOME/bin:"* ]]; then
   echo ""
-  echo "WARNING: ${BIN_DIR} is not in your PATH"
-  echo "Add this line to your ~/.bashrc or ~/.zshrc (then restart your shell):"
+  echo "WARNING: ~/bin is not in your PATH"
+  echo "Add this line to your ~/.bashrc or ~/.zshrc:"
   echo ""
-  echo " export PATH=\"${BIN_DIR}:\$PATH\""
+  echo " export PATH=\"\$HOME/bin:\$PATH\""
   echo ""
 fi
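The `[[ ":$PATH:" != *":$HOME/bin:"* ]]` test used by both versions of the script relies on a common idiom: wrapping `PATH` and the candidate directory in colons so that substring matching can only hit a complete path component, never a partial one. A standalone sketch (POSIX `case` instead of `[[ ]]` so it also works in plain `sh`):

```shell
#!/usr/bin/env bash
# True (exit 0) iff $1 appears as a complete component of $PATH.
path_contains() {
  case ":$PATH:" in
    *":$1:"*) return 0 ;;
    *)        return 1 ;;
  esac
}

PATH="/usr/bin:/usr/local/bin"
path_contains /usr/local/bin && echo "present"
if path_contains /usr/local/bi; then echo "bug"; else echo "no partial match"; fi
```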

View File

@@ -74,7 +74,7 @@ codeagent-wrapper --backend gemini "simple task"
 - `task` (required): Task description, supports `@file` references
 - `working_dir` (optional): Working directory (default: current)
 - `--backend` (optional): Select AI backend (codex/claude/gemini, default: codex)
-- **Note**: Claude backend only adds `--dangerously-skip-permissions` when explicitly enabled
+- **Note**: Claude backend defaults to `--dangerously-skip-permissions` for automation compatibility

 ## Return Format

@@ -147,9 +147,9 @@ Set `CODEAGENT_MAX_PARALLEL_WORKERS` to limit concurrent tasks (default: unlimit
 ## Environment Variables

 - `CODEX_TIMEOUT`: Override timeout in milliseconds (default: 7200000 = 2 hours)
-- `CODEAGENT_SKIP_PERMISSIONS`: Control Claude CLI permission checks
-  - For **Claude** backend: Set to `true`/`1` to add `--dangerously-skip-permissions` (default: disabled)
-  - For **Codex/Gemini** backends: Currently has no effect
+- `CODEAGENT_SKIP_PERMISSIONS`: Control permission checks
+  - For **Claude** backend: Set to `true`/`1` to **disable** `--dangerously-skip-permissions` (default: enabled)
+  - For **Codex/Gemini** backends: Set to `true`/`1` to enable permission skipping (default: disabled)
 - `CODEAGENT_MAX_PARALLEL_WORKERS`: Limit concurrent tasks in parallel mode (default: unlimited, recommended: 8)
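As a sketch of how these defaults compose (variable names are taken from the list above; the parsing shown is illustrative shell, while the real handling lives in the Go wrapper):

```shell
#!/usr/bin/env bash
# Clear the variables so the documented defaults are visible.
unset CODEX_TIMEOUT CODEAGENT_MAX_PARALLEL_WORKERS

timeout_ms="${CODEX_TIMEOUT:-7200000}"            # default: 2 hours
workers="${CODEAGENT_MAX_PARALLEL_WORKERS:-0}"    # 0 here standing in for "unlimited"

echo "timeout_ms=${timeout_ms} workers=${workers}"
```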
## Invocation Pattern

@@ -182,8 +182,9 @@ Bash tool parameters:
 ## Security Best Practices

-- **Claude Backend**: Permission checks enabled by default
-  - To skip checks: set `CODEAGENT_SKIP_PERMISSIONS=true` or pass `--skip-permissions`
+- **Claude Backend**: Defaults to `--dangerously-skip-permissions` for automation workflows
+  - To enforce permission checks with Claude: Set `CODEAGENT_SKIP_PERMISSIONS=true`
+- **Codex/Gemini Backends**: Permission checks enabled by default
 - **Concurrency Limits**: Set `CODEAGENT_MAX_PARALLEL_WORKERS` in production to prevent resource exhaustion
 - **Automation Context**: This wrapper is designed for AI-driven automation where permission prompts would block execution