Mirror of https://github.com/cexll/myclaude.git (synced 2026-02-09 03:09:30 +08:00)

Compare commits: v5.2.5 ... feature_pa (6 commits)
| Author | SHA1 | Date |
|---|---|---|
|  | 4dd735034e |  |
|  | a08dd62b59 |  |
|  | a30f434b5d |  |
|  | 41f4e21268 |  |
|  | a67aa00c9a |  |
|  | d61a0f9ffd |  |
.gitattributes (vendored, new file, 22 additions)

```
@@ -0,0 +1,22 @@
# Ensure shell scripts always use LF line endings on all platforms
*.sh text eol=lf

# Ensure Python files use LF line endings
*.py text eol=lf

# Auto-detect text files and normalize line endings to LF
* text=auto eol=lf

# Explicitly declare files that should always be treated as binary
*.exe binary
*.png binary
*.jpg binary
*.jpeg binary
*.gif binary
*.ico binary
*.mov binary
*.mp4 binary
*.mp3 binary
*.zip binary
*.gz binary
*.tar binary
```
```diff
@@ -7,7 +7,7 @@
 [](https://www.gnu.org/licenses/agpl-3.0)
 [](https://claude.ai/code)
 [](https://github.com/cexll/myclaude)

 > AI-powered development automation with multi-backend execution (Codex/Claude/Gemini)

```
```diff
@@ -2,7 +2,7 @@
 [](https://www.gnu.org/licenses/agpl-3.0)
 [](https://claude.ai/code)
 [](https://github.com/cexll/myclaude)

 > AI-driven development automation - multi-backend execution architecture (Codex/Claude/Gemini)

```
```diff
@@ -427,6 +427,10 @@ Generate architecture document at `./.claude/specs/{feature_name}/02-system-arch

 ## Important Behaviors

+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, REST, GraphQL, JWT, RBAC, etc.) in English; translate explanatory text only.
+
 ### DO:
 - Start by reviewing and referencing the PRD
 - Present initial architecture based on requirements
```
```diff
@@ -419,6 +419,10 @@ logger.info('User created', {

 ## Important Implementation Rules

+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, CRUD, JWT, SQL, etc.) in English; translate explanatory text only.
+
 ### DO:
 - Follow architecture specifications exactly
 - Implement all acceptance criteria from PRD
```
```diff
@@ -22,6 +22,10 @@ You are the BMAD Orchestrator. Your core focus is repository analysis, workflow
 - Consistency: ensure conventions and patterns discovered in scan are preserved downstream
 - Explicit handoffs: clearly document assumptions, risks, and integration points for other agents

+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, PRD, Sprint, etc.) in English; translate explanatory text only.
+
 ## UltraThink Repository Scan

 When asked to analyze the repository, follow this structure and return a clear, actionable summary.
```
```diff
@@ -313,6 +313,10 @@ Generate PRD at `./.claude/specs/{feature_name}/01-product-requirements.md`:

 ## Important Behaviors

+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, Sprint, PRD, KPI, MVP, etc.) in English; translate explanatory text only.
+
 ### DO:
 - Start immediately with greeting and initial understanding
 - Show quality scores transparently
```
```diff
@@ -478,6 +478,10 @@ module.exports = {

 ## Important Testing Rules

+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, E2E, CI/CD, Mock, etc.) in English; translate explanatory text only.
+
 ### DO:
 - Test all acceptance criteria from PRD
 - Cover happy path, edge cases, and error scenarios
```
```diff
@@ -45,3 +45,7 @@ You are an independent code review agent responsible for conducting reviews betw
 - Focus on actionable findings
 - Provide specific QA guidance
 - Use clear, parseable output format
+
+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, PRD, Sprint, etc.) in English; translate explanatory text only.
```
```diff
@@ -351,6 +351,10 @@ So that [benefit]

 ## Important Behaviors

+### Language Rules:
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (Sprint, Epic, Story, Backlog, Velocity, etc.) in English; translate explanatory text only.
+
 ### DO:
 - Read both PRD and Architecture documents thoroughly
 - Create comprehensive task breakdown
```
```diff
@@ -683,8 +683,17 @@ func runCodexTaskWithContext(parentCtx context.Context, taskSpec TaskSpec, backe
 	if stderrLogger != nil {
 		stderrWriters = append(stderrWriters, stderrLogger)
 	}
+
+	// For gemini backend, filter noisy stderr output
+	var stderrFilter *filteringWriter
 	if !silent {
-		stderrWriters = append([]io.Writer{os.Stderr}, stderrWriters...)
+		stderrOut := io.Writer(os.Stderr)
+		if cfg.Backend == "gemini" {
+			stderrFilter = newFilteringWriter(os.Stderr, geminiNoisePatterns)
+			stderrOut = stderrFilter
+			defer stderrFilter.Flush()
+		}
+		stderrWriters = append([]io.Writer{stderrOut}, stderrWriters...)
 	}
 	if len(stderrWriters) == 1 {
 		cmd.SetStderr(stderrWriters[0])
```
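For readers skimming the hunk above, here is a minimal sketch of the pattern it introduces: the spawned backend's stderr is routed through a line-filtering writer instead of going straight to the terminal. The `runFiltered` helper and the `exec.Command` invocation are illustrative assumptions, not the wrapper's actual call path; only `newFilteringWriter`, `geminiNoisePatterns`, and `Flush` come from `codeagent-wrapper/filter.go`, added below.

```go
package main

import (
	"os"
	"os/exec"
)

// runFiltered is a hypothetical helper showing the wiring used above:
// wrap os.Stderr in a filteringWriter so known-noisy gemini lines are dropped,
// then flush any trailing partial line once the subprocess exits.
func runFiltered() error {
	filter := newFilteringWriter(os.Stderr, geminiNoisePatterns)
	defer filter.Flush()

	cmd := exec.Command("gemini", "--help") // placeholder command for illustration
	cmd.Stdout = os.Stdout
	cmd.Stderr = filter // lines matching geminiNoisePatterns never reach the terminal
	return cmd.Run()
}
```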
codeagent-wrapper/filter.go (new file, 66 additions)

```go
package main

import (
	"bytes"
	"io"
	"strings"
)

// geminiNoisePatterns contains stderr patterns to filter for gemini backend
var geminiNoisePatterns = []string{
	"[STARTUP]",
	"Session cleanup disabled",
	"Warning:",
	"(node:",
	"(Use `node --trace-warnings",
	"Loaded cached credentials",
	"Loading extension:",
	"YOLO mode is enabled",
}

// filteringWriter wraps an io.Writer and filters out lines matching patterns
type filteringWriter struct {
	w        io.Writer
	patterns []string
	buf      bytes.Buffer
}

func newFilteringWriter(w io.Writer, patterns []string) *filteringWriter {
	return &filteringWriter{w: w, patterns: patterns}
}

func (f *filteringWriter) Write(p []byte) (n int, err error) {
	f.buf.Write(p)
	for {
		line, err := f.buf.ReadString('\n')
		if err != nil {
			// incomplete line, put it back
			f.buf.WriteString(line)
			break
		}
		if !f.shouldFilter(line) {
			f.w.Write([]byte(line))
		}
	}
	return len(p), nil
}

func (f *filteringWriter) shouldFilter(line string) bool {
	for _, pattern := range f.patterns {
		if strings.Contains(line, pattern) {
			return true
		}
	}
	return false
}

// Flush writes any remaining buffered content
func (f *filteringWriter) Flush() {
	if f.buf.Len() > 0 {
		remaining := f.buf.String()
		if !f.shouldFilter(remaining) {
			f.w.Write([]byte(remaining))
		}
		f.buf.Reset()
	}
}
```
codeagent-wrapper/filter_test.go (new file, 73 additions)

```go
package main

import (
	"bytes"
	"testing"
)

func TestFilteringWriter(t *testing.T) {
	tests := []struct {
		name     string
		patterns []string
		input    string
		want     string
	}{
		{
			name:     "filter STARTUP lines",
			patterns: geminiNoisePatterns,
			input:    "[STARTUP] Recording metric\nHello World\n[STARTUP] Another line\n",
			want:     "Hello World\n",
		},
		{
			name:     "filter Warning lines",
			patterns: geminiNoisePatterns,
			input:    "Warning: something bad\nActual output\n",
			want:     "Actual output\n",
		},
		{
			name:     "filter multiple patterns",
			patterns: geminiNoisePatterns,
			input:    "YOLO mode is enabled\nSession cleanup disabled\nReal content\nLoading extension: foo\n",
			want:     "Real content\n",
		},
		{
			name:     "no filtering needed",
			patterns: geminiNoisePatterns,
			input:    "Line 1\nLine 2\nLine 3\n",
			want:     "Line 1\nLine 2\nLine 3\n",
		},
		{
			name:     "empty input",
			patterns: geminiNoisePatterns,
			input:    "",
			want:     "",
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			var buf bytes.Buffer
			fw := newFilteringWriter(&buf, tt.patterns)
			fw.Write([]byte(tt.input))
			fw.Flush()

			if got := buf.String(); got != tt.want {
				t.Errorf("got %q, want %q", got, tt.want)
			}
		})
	}
}

func TestFilteringWriterPartialLines(t *testing.T) {
	var buf bytes.Buffer
	fw := newFilteringWriter(&buf, geminiNoisePatterns)

	// Write partial line
	fw.Write([]byte("Hello "))
	fw.Write([]byte("World\n"))
	fw.Flush()

	if got := buf.String(); got != "Hello World\n" {
		t.Errorf("got %q, want %q", got, "Hello World\n")
	}
}
```
```diff
@@ -1582,6 +1582,34 @@ func TestBackendParseJSONStream_ClaudeEvents(t *testing.T) {
 	}
 }

+func TestBackendParseJSONStream_ClaudeEvents_ItemDoesNotForceCodex(t *testing.T) {
+	tests := []struct {
+		name  string
+		input string
+	}{
+		{
+			name:  "null item",
+			input: `{"type":"result","result":"OK","session_id":"abc123","item":null}`,
+		},
+		{
+			name:  "empty object item",
+			input: `{"type":"result","subtype":"x","result":"OK","session_id":"abc123","item":{}}`,
+		},
+	}
+
+	for _, tt := range tests {
+		t.Run(tt.name, func(t *testing.T) {
+			message, threadID := parseJSONStream(strings.NewReader(tt.input))
+			if message != "OK" {
+				t.Fatalf("message=%q, want %q", message, "OK")
+			}
+			if threadID != "abc123" {
+				t.Fatalf("threadID=%q, want %q", threadID, "abc123")
+			}
+		})
+	}
+}
+
 func TestBackendParseJSONStream_GeminiEvents(t *testing.T) {
 	input := `{"type":"init","session_id":"xyz789"}
 {"type":"message","role":"assistant","content":"Hi","delta":true,"session_id":"xyz789"}
@@ -1598,6 +1626,43 @@ func TestBackendParseJSONStream_GeminiEvents(t *testing.T) {
 	}
 }

+func TestBackendParseJSONStream_GeminiEvents_DeltaFalseStillDetected(t *testing.T) {
+	input := `{"type":"init","session_id":"xyz789"}
+{"type":"message","content":"Hi","delta":false,"session_id":"xyz789"}
+{"type":"result","status":"success","session_id":"xyz789"}`
+
+	message, threadID := parseJSONStream(strings.NewReader(input))
+
+	if message != "Hi" {
+		t.Fatalf("message=%q, want %q", message, "Hi")
+	}
+	if threadID != "xyz789" {
+		t.Fatalf("threadID=%q, want %q", threadID, "xyz789")
+	}
+}
+
+func TestBackendParseJSONStream_GeminiEvents_OnMessageTriggeredOnStatus(t *testing.T) {
+	input := `{"type":"init","session_id":"xyz789"}
+{"type":"message","role":"assistant","content":"Hi","delta":true,"session_id":"xyz789"}
+{"type":"message","content":" there","delta":true}
+{"type":"result","status":"success","session_id":"xyz789"}`
+
+	var called int
+	message, threadID := parseJSONStreamInternal(strings.NewReader(input), nil, nil, func() {
+		called++
+	})
+
+	if message != "Hi there" {
+		t.Fatalf("message=%q, want %q", message, "Hi there")
+	}
+	if threadID != "xyz789" {
+		t.Fatalf("threadID=%q, want %q", threadID, "xyz789")
+	}
+	if called != 1 {
+		t.Fatalf("onMessage called=%d, want %d", called, 1)
+	}
+}
+
 func TestBackendParseJSONStreamWithWarn_InvalidLine(t *testing.T) {
 	var warnings []string
 	warnFn := func(msg string) { warnings = append(warnings, msg) }
```
```diff
@@ -67,6 +67,34 @@ type codexHeader struct {
 	} `json:"item,omitempty"`
 }

+// UnifiedEvent combines all backend event formats into a single structure
+// to avoid multiple JSON unmarshal operations per event
+type UnifiedEvent struct {
+	// Common fields
+	Type string `json:"type"`
+
+	// Codex-specific fields
+	ThreadID string          `json:"thread_id,omitempty"`
+	Item     json.RawMessage `json:"item,omitempty"` // Lazy parse
+
+	// Claude-specific fields
+	Subtype   string `json:"subtype,omitempty"`
+	SessionID string `json:"session_id,omitempty"`
+	Result    string `json:"result,omitempty"`
+
+	// Gemini-specific fields
+	Role    string `json:"role,omitempty"`
+	Content string `json:"content,omitempty"`
+	Delta   *bool  `json:"delta,omitempty"`
+	Status  string `json:"status,omitempty"`
+}
+
+// ItemContent represents the parsed item.text field for Codex events
+type ItemContent struct {
+	Type string      `json:"type"`
+	Text interface{} `json:"text"`
+}
+
 func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(string), onMessage func()) (message, threadID string) {
 	reader := bufio.NewReaderSize(r, jsonLineReaderSize)

@@ -112,71 +140,77 @@ func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(strin
 			continue
 		}

-		var codex codexHeader
-		if err := json.Unmarshal(line, &codex); err == nil {
-			isCodex := codex.ThreadID != "" || (codex.Item != nil && codex.Item.Type != "")
-			if isCodex {
-				var details []string
-				if codex.ThreadID != "" {
-					details = append(details, fmt.Sprintf("thread_id=%s", codex.ThreadID))
-				}
-				if codex.Item != nil && codex.Item.Type != "" {
-					details = append(details, fmt.Sprintf("item_type=%s", codex.Item.Type))
-				}
-				if len(details) > 0 {
-					infoFn(fmt.Sprintf("Parsed event #%d type=%s (%s)", totalEvents, codex.Type, strings.Join(details, ", ")))
-				} else {
-					infoFn(fmt.Sprintf("Parsed event #%d type=%s", totalEvents, codex.Type))
-				}
-
-				switch codex.Type {
-				case "thread.started":
-					threadID = codex.ThreadID
-					infoFn(fmt.Sprintf("thread.started event thread_id=%s", threadID))
-				case "item.completed":
-					itemType := ""
-					if codex.Item != nil {
-						itemType = codex.Item.Type
-					}
-					if itemType == "agent_message" {
-						var event JSONEvent
-						if err := json.Unmarshal(line, &event); err != nil {
-							warnFn(fmt.Sprintf("Failed to parse Codex event: %s", truncateBytes(line, 100)))
-							continue
-						}
-						normalized := ""
-						if event.Item != nil {
-							normalized = normalizeText(event.Item.Text)
-						}
-						infoFn(fmt.Sprintf("item.completed event item_type=%s message_len=%d", itemType, len(normalized)))
-						if normalized != "" {
-							codexMessage = normalized
-							notifyMessage()
-						}
-					} else {
-						infoFn(fmt.Sprintf("item.completed event item_type=%s", itemType))
-					}
-				}
-				continue
-			}
-		}
-
-		var raw map[string]json.RawMessage
-		if err := json.Unmarshal(line, &raw); err != nil {
-			warnFn(fmt.Sprintf("Failed to parse line: %s", truncateBytes(line, 100)))
+		// Single unmarshal for all backend types
+		var event UnifiedEvent
+		if err := json.Unmarshal(line, &event); err != nil {
+			warnFn(fmt.Sprintf("Failed to parse event: %s", truncateBytes(line, 100)))
 			continue
 		}

-		switch {
-		case hasKey(raw, "subtype") || hasKey(raw, "result"):
-			var event ClaudeEvent
-			if err := json.Unmarshal(line, &event); err != nil {
-				warnFn(fmt.Sprintf("Failed to parse Claude event: %s", truncateBytes(line, 100)))
-				continue
-			}
+		// Detect backend type by field presence
+		isCodex := event.ThreadID != ""
+		if !isCodex && len(event.Item) > 0 {
+			var itemHeader struct {
+				Type string `json:"type"`
+			}
+			if json.Unmarshal(event.Item, &itemHeader) == nil && itemHeader.Type != "" {
+				isCodex = true
+			}
+		}
+		isClaude := event.Subtype != "" || event.Result != ""
+		isGemini := event.Role != "" || event.Delta != nil || event.Status != ""
+
+		// Handle Codex events
+		if isCodex {
+			var details []string
+			if event.ThreadID != "" {
+				details = append(details, fmt.Sprintf("thread_id=%s", event.ThreadID))
+			}
+
+			if len(details) > 0 {
+				infoFn(fmt.Sprintf("Parsed event #%d type=%s (%s)", totalEvents, event.Type, strings.Join(details, ", ")))
+			} else {
+				infoFn(fmt.Sprintf("Parsed event #%d type=%s", totalEvents, event.Type))
+			}
+
+			switch event.Type {
+			case "thread.started":
+				threadID = event.ThreadID
+				infoFn(fmt.Sprintf("thread.started event thread_id=%s", threadID))
+
+			case "item.completed":
+				var itemType string
+				if len(event.Item) > 0 {
+					var itemHeader struct {
+						Type string `json:"type"`
+					}
+					if err := json.Unmarshal(event.Item, &itemHeader); err == nil {
+						itemType = itemHeader.Type
+					}
+				}
+
+				if itemType == "agent_message" && len(event.Item) > 0 {
+					// Lazy parse: only parse item content when needed
+					var item ItemContent
+					if err := json.Unmarshal(event.Item, &item); err == nil {
+						normalized := normalizeText(item.Text)
+						infoFn(fmt.Sprintf("item.completed event item_type=%s message_len=%d", itemType, len(normalized)))
+						if normalized != "" {
+							codexMessage = normalized
+							notifyMessage()
+						}
+					} else {
+						warnFn(fmt.Sprintf("Failed to parse item content: %s", err.Error()))
+					}
+				} else {
+					infoFn(fmt.Sprintf("item.completed event item_type=%s", itemType))
+				}
+			}
+			continue
+		}
+
+		// Handle Claude events
+		if isClaude {
 			if event.SessionID != "" && threadID == "" {
 				threadID = event.SessionID
 			}
@@ -187,28 +221,34 @@ func parseJSONStreamInternal(r io.Reader, warnFn func(string), infoFn func(strin
 				claudeMessage = event.Result
 				notifyMessage()
 			}
+			continue
+		}

-		case hasKey(raw, "role") || hasKey(raw, "delta"):
-			var event GeminiEvent
-			if err := json.Unmarshal(line, &event); err != nil {
-				warnFn(fmt.Sprintf("Failed to parse Gemini event: %s", truncateBytes(line, 100)))
-				continue
-			}
+		// Handle Gemini events
+		if isGemini {
 			if event.SessionID != "" && threadID == "" {
 				threadID = event.SessionID
 			}
 			if event.Content != "" {
 				geminiBuffer.WriteString(event.Content)
+			}
+
+			if event.Status != "" {
 				notifyMessage()
 			}
-			infoFn(fmt.Sprintf("Parsed Gemini event #%d type=%s role=%s delta=%t status=%s content_len=%d", totalEvents, event.Type, event.Role, event.Delta, event.Status, len(event.Content)))

-		default:
-			warnFn(fmt.Sprintf("Unknown event format: %s", truncateBytes(line, 100)))
-		}
+			delta := false
+			if event.Delta != nil {
+				delta = *event.Delta
+			}
+			infoFn(fmt.Sprintf("Parsed Gemini event #%d type=%s role=%s delta=%t status=%s content_len=%d", totalEvents, event.Type, event.Role, delta, event.Status, len(event.Content)))
+			continue
+		}
+
+		// Unknown event format
+		warnFn(fmt.Sprintf("Unknown event format: %s", truncateBytes(line, 100)))
 	}

 	switch {
```
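As a rough illustration of the refactor above, the snippet below condenses the new detection logic into a standalone helper: one `json.Unmarshal` into `UnifiedEvent`, then backend inference from which fields are present. `classifyEvent` is a hypothetical function written for this summary, not part of the wrapper (the real logic lives inline in `parseJSONStreamInternal`, and its Codex branch additionally inspects `item.type`); the field checks themselves mirror the diff.

```go
package main

import "encoding/json"

// classifyEvent sketches the single-unmarshal detection: parse the line once
// into UnifiedEvent, then decide the backend from field presence instead of
// re-unmarshalling the same line for each backend type.
func classifyEvent(line []byte) (string, error) {
	var event UnifiedEvent
	if err := json.Unmarshal(line, &event); err != nil {
		return "", err
	}
	switch {
	case event.ThreadID != "": // Codex events carry thread_id
		return "codex", nil
	case event.Subtype != "" || event.Result != "": // Claude result events
		return "claude", nil
	case event.Role != "" || event.Delta != nil || event.Status != "": // Gemini stream events
		return "gemini", nil
	default:
		return "unknown", nil
	}
}
```

Under this ordering, a line like `{"type":"result","result":"OK","session_id":"abc123","item":null}` is treated as a Claude event even though it carries an `item` key, which is what the new `ItemDoesNotForceCodex` tests assert.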
```diff
@@ -5,7 +5,7 @@
 set -e

 # Get staged files
-STAGED_FILES=$(git diff --cached --name-only --diff-filter=ACM)
+STAGED_FILES="$(git diff --cached --name-only --diff-filter=ACM)"

 if [ -z "$STAGED_FILES" ]; then
     echo "No files to validate"
@@ -15,17 +15,32 @@ fi
 echo "Running pre-commit checks..."

 # Check Go files
-GO_FILES=$(echo "$STAGED_FILES" | grep '\.go$' || true)
+GO_FILES="$(printf '%s\n' "$STAGED_FILES" | grep '\.go$' || true)"
 if [ -n "$GO_FILES" ]; then
     echo "Checking Go files..."

+    if ! command -v gofmt &> /dev/null; then
+        echo "❌ gofmt not found. Please install Go (gofmt is included with the Go toolchain)."
+        exit 1
+    fi
+
     # Format check
-    gofmt -l $GO_FILES | while read -r file; do
+    GO_FILE_ARGS=()
+    while IFS= read -r file; do
         if [ -n "$file" ]; then
-            echo "❌ $file needs formatting (run: gofmt -w $file)"
+            GO_FILE_ARGS+=("$file")
+        fi
+    done <<< "$GO_FILES"
+
+    if [ "${#GO_FILE_ARGS[@]}" -gt 0 ]; then
+        UNFORMATTED="$(gofmt -l "${GO_FILE_ARGS[@]}")"
+        if [ -n "$UNFORMATTED" ]; then
+            echo "❌ The following files need formatting:"
+            echo "$UNFORMATTED"
+            echo "Run: gofmt -w <file>"
             exit 1
         fi
-    done
+    fi

 # Run tests
 if command -v go &> /dev/null; then
@@ -38,19 +53,26 @@ if [ -n "$GO_FILES" ]; then
 fi

 # Check JSON files
-JSON_FILES=$(echo "$STAGED_FILES" | grep '\.json$' || true)
+JSON_FILES="$(printf '%s\n' "$STAGED_FILES" | grep '\.json$' || true)"
 if [ -n "$JSON_FILES" ]; then
     echo "Validating JSON files..."
-    for file in $JSON_FILES; do
+    if ! command -v jq &> /dev/null; then
+        echo "❌ jq not found. Please install jq to validate JSON files."
+        exit 1
+    fi
+    while IFS= read -r file; do
+        if [ -z "$file" ]; then
+            continue
+        fi
         if ! jq empty "$file" 2>/dev/null; then
             echo "❌ Invalid JSON: $file"
             exit 1
         fi
-    done
+    done <<< "$JSON_FILES"
 fi

 # Check Markdown files
-MD_FILES=$(echo "$STAGED_FILES" | grep '\.md$' || true)
+MD_FILES="$(printf '%s\n' "$STAGED_FILES" | grep '\.md$' || true)"
 if [ -n "$MD_FILES" ]; then
     echo "Checking markdown files..."
     # Add markdown linting if needed
```
install.sh (15 changes)

```diff
@@ -1,12 +1,15 @@
 #!/bin/bash
 set -e

-echo "⚠️ WARNING: install.sh is LEGACY and will be removed in future versions."
-echo "Please use the new installation method:"
-echo "  python3 install.py --install-dir ~/.claude"
-echo ""
-echo "Continuing with legacy installation in 5 seconds..."
-sleep 5
+if [ -z "${SKIP_WARNING:-}" ]; then
+    echo "⚠️ WARNING: install.sh is LEGACY and will be removed in future versions."
+    echo "Please use the new installation method:"
+    echo "  python3 install.py --install-dir ~/.claude"
+    echo ""
+    echo "Set SKIP_WARNING=1 to bypass this message"
+    echo "Continuing with legacy installation in 5 seconds..."
+    sleep 5
+fi

 # Detect platform
 OS=$(uname -s | tr '[:upper:]' '[:lower:]')
```
```diff
@@ -104,6 +104,10 @@ You adhere to core software engineering principles like KISS (Keep It Simple, St

 ## Implementation Constraints

+### Language Rules
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, SQL, CRUD, etc.) in English; translate explanatory text only.
+
 ### MUST Requirements
 - **Working Solution**: Code must fully implement the specified functionality
 - **Integration Compatibility**: Must work seamlessly with existing codebase
```
```diff
@@ -88,6 +88,10 @@ Each phase should be independently deployable and testable.

 ## Key Constraints

+### Language Rules
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, SQL, CRUD, etc.) in English; translate explanatory text only.
+
 ### MUST Requirements
 - **Direct Implementability**: Every item must be directly translatable to code
 - **Specific Technical Details**: Include exact file paths, function names, table schemas
```
```diff
@@ -176,6 +176,10 @@ You adhere to core software engineering principles like KISS (Keep It Simple, St

 ## Key Constraints

+### Language Rules
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, E2E, CI/CD, etc.) in English; translate explanatory text only.
+
 ### MUST Requirements
 - **Functional Verification**: Verify all specified functionality works
 - **Integration Testing**: Ensure seamless integration with existing code
```
```diff
@@ -199,6 +199,10 @@ func TestAPIEndpoint(t *testing.T) {

 ## Key Constraints

+### Language Rules
+- **Language Matching**: Output language matches user input (Chinese input → Chinese doc, English input → English doc). When language is ambiguous, default to Chinese.
+- **Technical Terms**: Keep technical terms (API, E2E, CI/CD, Mock, etc.) in English; translate explanatory text only.
+
 ### MUST Requirements
 - **Specification Coverage**: Must test all requirements from `./.claude/specs/{feature_name}/requirements-spec.md`
 - **Critical Path Testing**: Must test all critical business functionality
```