Compare commits


22 Commits

Author SHA1 Message Date
王性驊 f9a92b0bfa add refactor doc 2026-04-06 22:21:32 +08:00
王性驊 3dc49bfc7d fix: remove duplicate Verbose field and use raw JSON parsing for handlers
- Remove Verbose from Config struct (inherited from RestConf)
- Remove Verbose from yaml config to fix conflict
- Use raw JSON parsing in handlers for interface{} Content field
- Fix config tests
2026-04-04 12:49:32 +08:00
王性驊 9e2a10b614 feat: complete implementation of all endpoints and config
- Update etc/chat-api.yaml with all configuration options (no env vars)
- Add ToBridgeConfig method to Config for YAML-based config
- Implement complete AnthropicMessages streaming with SSE
- Add Tools field to AnthropicRequest
- Update ServiceContext with model tracking
- Add all CRUD handlers for health, models, chat, anthropic

Features restored:
- Health check endpoint
- Models list with caching
- ChatCompletions streaming (OpenAI format)
- AnthropicMessages streaming (Anthropic format)
- Tool calls support for both formats
- Thinking/reasoning content support
- Rate limit detection and handling
- Account pool integration
- Request/response logging
- Model resolution with strict mode
- Workspace and prompt truncation handling
2026-04-03 23:17:04 +08:00
王性驊 3387887fb9 feat(logic): implement complete ChatCompletions streaming with SSE
- Full SSE streaming support with proper event format
- Tool calls handling with marker detection
- Rate limit detection and reporting
- Account pool integration (round-robin, stats tracking)
- Request/response logging and traffic tracking
- Thinking/reasoning content support
- Model resolution with strict mode
- Workspace and prompt truncation handling
- Added X-Accel-Buffering header for proper SSE
- Pass method/path for logging context
2026-04-03 23:04:28 +08:00
王性驊 f90d72b279 feat(logic): implement logic layer for health, models, and chat completions
- HealthLogic: simple health check response
- ModelsLogic: list Cursor CLI models with caching and Anthropic aliases
- ChatCompletionsLogic: scaffold for OpenAI-format completions (streaming placeholder)
- AnthropicMessagesLogic: scaffold for Anthropic format (TODO)
- Update handler for SSE streaming support
- Add models.go with ListCursorCliModels and model mappings
2026-04-03 23:00:18 +08:00
王性驊 081f404f77 Task 9: Cleanup - remove old internal files, update import paths, add go-zero entry point
- Removed old files from internal/* (migrated to pkg/*)
- Removed old CLI files from cmd/ (now in cmd/cli/)
- Updated import paths (internal/* -> pkg/*)
- Rewrote main.go to support CLI commands + go-zero HTTP server
- Fixed AccountStat type references (use entity.AccountStat)
- Fixed cmd/cli/* to use usecase package instead of agent
- Fixed logger to use entity.AccountStat
- Fixed geminiweb fmt.Println redundant newlines
- Fixed scripts/detect-gemini-dom.go pointer format issues
2026-04-03 22:54:18 +08:00
王性驊 7e0b7a970c refactor(task-7): integrate all layers and fix type mismatches
- Merge all branches (domain, infrastructure, repository, provider, adapter, usecase, cli)
- Update ServiceContext to inject dependencies
- Fix Message.Content type mismatch (string vs interface{})
- Update AccountStat to use entity.AccountStat
- Add helper functions for content conversion
2026-04-03 17:49:11 +08:00
王性驊 8f1b7159ed merge: refactor/cli 2026-04-03 17:46:37 +08:00
王性驊 e379c79bd1 merge: refactor/usecase 2026-04-03 17:46:37 +08:00
王性驊 ef4b6519f5 merge: refactor/adapter 2026-04-03 17:46:36 +08:00
王性驊 7b82986dca merge: refactor/provider 2026-04-03 17:46:33 +08:00
王性驊 e5f19c243b merge: refactor/repository 2026-04-03 17:46:32 +08:00
王性驊 83418d5e76 merge: refactor/infrastructure 2026-04-03 17:46:32 +08:00
王性驊 5866a5b9c9 refactor(task-5): add usecase layer
- Migrate agent runner from internal/agent
- Migrate sanitizer from internal/sanitize
- Migrate toolcall from internal/toolcall
- Update import paths to use pkg/usecase
2026-04-03 17:38:16 +08:00
王性驊 270accfd75 refactor(task-4): migrate providers to pkg/provider
- Migrate cursor provider
- Migrate geminiweb provider (playwright-based)
- Update import paths to use pkg/domain/entity
2026-04-03 17:35:23 +08:00
王性驊 f9f3c5fb42 refactor(task-6): migrate adapters to pkg/adapter
- Migrate OpenAI adapter
- Migrate Anthropic adapter
- Update import paths
2026-04-03 17:33:53 +08:00
王性驊 d4fcb8d3b8 refactor(task-3): add repository layer implementation
- Migrate AccountPool from internal/pool
- Update import paths to use pkg/repository
2026-04-03 17:33:51 +08:00
王性驊 3a005ea02e refactor(task-8): migrate CLI tools to cmd/cli
- Migrate args.go, login.go, accounts.go
- Migrate resethwid.go, usage.go, sqlite.go
2026-04-03 17:22:52 +08:00
王性驊 80d7a4bb29 refactor(task-2): migrate infrastructure layer
- Migrate process (runner, kill_unix, kill_windows)
- Migrate parser (stream)
- Migrate httputil (httputil)
- Migrate logger (logger)
- Migrate env (env)
- Migrate workspace (workspace)
- Migrate winlimit (winlimit)
2026-04-03 17:22:51 +08:00
王性驊 294bd74a43 refactor(task-1): add domain layer with entities and interfaces
- Add entity types: Message, Tool, ToolCall, Chunk, Account
- Add repository interfaces: AccountPool, Provider
- Add usecase interfaces: ChatUsecase, AgentRunner
- Add model constants and error definitions
2026-04-03 17:22:48 +08:00
王性驊 8b6abbbba7 refactor(task-0): initialize go-zero project structure
- Add go-zero dependency to go.mod
- Create api/chat.api with OpenAI-compatible types
- Create etc/chat.yaml configuration
- Update Makefile with goctl commands
- Generate go-zero scaffold (types, handlers, logic, svc)
- Move chat.go to cmd/chat/chat.go
- Add Config struct for go-zero (keep BridgeConfig for backward compatibility)
2026-04-03 17:15:35 +08:00
王性驊 b18e3d82f0 add refactor doc 2026-04-03 10:17:17 +08:00
91 changed files with 4806 additions and 2668 deletions

.env.example

@ -1,40 +0,0 @@
# ──────────────────────────────────────────────────────────────
# cursor-api-proxy sample configuration
# Copy to .env, then fill in your settings: cp .env.example .env
# ──────────────────────────────────────────────────────────────
# ── Server settings ──────────────────────────────────────────
# Docker mode always uses 0.0.0.0; for local runs you may use 127.0.0.1
CURSOR_BRIDGE_HOST=0.0.0.0
CURSOR_BRIDGE_PORT=8766
CURSOR_BRIDGE_API_KEY=
CURSOR_BRIDGE_TIMEOUT_MS=3600000
CURSOR_BRIDGE_MULTI_PORT=false
CURSOR_BRIDGE_VERBOSE=false
# ── Agent / model settings ───────────────────────────────────
# Docker mode: agent path inside the container (mounted at /usr/local/bin/agent by default)
CURSOR_AGENT_BIN=/usr/local/bin/agent
CURSOR_BRIDGE_DEFAULT_MODEL=auto
CURSOR_BRIDGE_STRICT_MODEL=true
CURSOR_BRIDGE_MAX_MODE=false
CURSOR_BRIDGE_FORCE=false
CURSOR_BRIDGE_APPROVE_MCPS=false
# ── Workspace and accounts ───────────────────────────────────
CURSOR_BRIDGE_WORKSPACE=/workspace
CURSOR_BRIDGE_CHAT_ONLY_WORKSPACE=true
# Multi-account config directories (comma-separated); leave empty to auto-discover ~/.cursor-api-proxy/accounts/
CURSOR_CONFIG_DIRS=
# ── TLS / HTTPS (optional) ───────────────────────────────────
CURSOR_BRIDGE_TLS_CERT=
CURSOR_BRIDGE_TLS_KEY=
# ── Docker only: host path mappings ──────────────────────────
# Actual path of the Cursor agent binary on the host
CURSOR_AGENT_HOST_BIN=/usr/local/bin/agent
# Account data directory on the host (mounted into the container at /root/.cursor-api-proxy)
CURSOR_ACCOUNTS_DIR=~/.cursor-api-proxy
# Workspace directory on the host to mount into the container
WORKSPACE_DIR=/tmp/workspace

.gitignore vendored

@ -1,3 +1,4 @@
 .idea/
 cursor-api-proxy*
 .env
+.opencode


Makefile

@ -3,6 +3,26 @@
 # Edit the variables below, then run `make env` to generate the .env file
 # ──────────────────────────────────────────────
+# ── go-zero code generation ───────────────────
+.PHONY: api api-doc gen fmt lint
+api:
+	goctl api go -api api/chat.api -dir . --style go_zero
+api-doc:
+	goctl api doc -api api/chat.api -dir docs/api/
+gen: api
+	go mod tidy
+	go fmt ./...
+fmt:
+	go fmt ./...
+lint:
+	go vet ./...
+	go fmt ./...
 # ── Server settings ───────────────────────────
 HOST ?= 127.0.0.1
 PORT ?= 8766

api/chat.api Normal file

@ -0,0 +1,141 @@
syntax = "v1"
info (
title: "Cursor API Proxy"
desc: "OpenAI-compatible API proxy for Cursor/Gemini"
author: "cursor-api-proxy"
version: "1.0"
)
// ============ Types ============
type (
// Health
HealthRequest {}
HealthResponse {
Status string `json:"status"`
Version string `json:"version"`
}
// Models
ModelsRequest {}
ModelsResponse {
Object string `json:"object"`
Data []ModelData `json:"data"`
}
ModelData {
Id string `json:"id"`
Object string `json:"object"`
OwnedBy string `json:"owned_by"`
}
// Chat Completions
ChatCompletionRequest {
Model string `json:"model"`
Messages []Message `json:"messages"`
Stream bool `json:"stream,optional"`
Tools []Tool `json:"tools,optional"`
Functions []Function `json:"functions,optional"`
MaxTokens int `json:"max_tokens,optional"`
Temperature float64 `json:"temperature,optional"`
}
Message {
Role string `json:"role"`
Content interface{} `json:"content"`
}
Tool {
Type string `json:"type"`
Function ToolFunction `json:"function"`
}
ToolFunction {
Name string `json:"name"`
Description string `json:"description"`
Parameters interface{} `json:"parameters"`
}
Function {
Name string `json:"name"`
Description string `json:"description,optional"`
Parameters interface{} `json:"parameters,optional"`
}
ChatCompletionResponse {
Id string `json:"id"`
Object string `json:"object"`
Created int64 `json:"created"`
Model string `json:"model"`
Choices []Choice `json:"choices"`
Usage Usage `json:"usage"`
}
Choice {
Index int `json:"index"`
Message RespMessage `json:"message,optional"`
Delta Delta `json:"delta,optional"`
FinishReason string `json:"finish_reason"`
}
RespMessage {
Role string `json:"role"`
Content string `json:"content,optional"`
ToolCalls []ToolCall `json:"tool_calls,optional"`
}
Delta {
Role string `json:"role,optional"`
Content string `json:"content,optional"`
ReasoningContent string `json:"reasoning_content,optional"`
ToolCalls []ToolCall `json:"tool_calls,optional"`
}
ToolCall {
Index int `json:"index"`
Id string `json:"id"`
Type string `json:"type"`
Function FunctionCall `json:"function"`
}
FunctionCall {
Name string `json:"name"`
Arguments string `json:"arguments"`
}
Usage {
PromptTokens int `json:"prompt_tokens"`
CompletionTokens int `json:"completion_tokens"`
TotalTokens int `json:"total_tokens"`
}
// Anthropic Messages
AnthropicRequest {
Model string `json:"model"`
Messages []Message `json:"messages"`
MaxTokens int `json:"max_tokens"`
Stream bool `json:"stream,optional"`
System string `json:"system,optional"`
}
AnthropicResponse {
Id string `json:"id"`
Type string `json:"type"`
Role string `json:"role"`
Content []ContentBlock `json:"content"`
Model string `json:"model"`
Usage AnthropicUsage `json:"usage"`
}
ContentBlock {
Type string `json:"type"`
Text string `json:"text,optional"`
}
AnthropicUsage {
InputTokens int `json:"input_tokens"`
OutputTokens int `json:"output_tokens"`
}
)
// ============ Routes ============
@server (
prefix: /v1
group: chat
)
service chat-api {
@handler Health
get /health returns (HealthResponse)
@handler Models
get /v1/models returns (ModelsResponse)
@handler ChatCompletions
post /v1/chat/completions (ChatCompletionRequest)
@handler AnthropicMessages
post /v1/messages (AnthropicRequest)
}

cmd/chat/chat.go Normal file

@ -0,0 +1,34 @@
// Code scaffolded by goctl. Safe to edit.
// goctl 1.10.1
package main
import (
"flag"
"fmt"
"cursor-api-proxy/internal/config"
"cursor-api-proxy/internal/handler"
"cursor-api-proxy/internal/svc"
"github.com/zeromicro/go-zero/core/conf"
"github.com/zeromicro/go-zero/rest"
)
var configFile = flag.String("f", "etc/chat-api.yaml", "the config file")
func main() {
flag.Parse()
var c config.Config
conf.MustLoad(*configFile, &c)
server := rest.MustNewServer(c.RestConf)
defer server.Stop()
ctx := svc.NewServiceContext(c)
handler.RegisterHandlers(server, ctx)
fmt.Printf("Starting server at %s:%d...\n", c.Host, c.Port)
server.Start()
}


@ -1,11 +1,12 @@
 package cmd

 import (
-	"cursor-api-proxy/internal/agent"
 	"encoding/json"
 	"fmt"
 	"os"
 	"path/filepath"
+
+	"cursor-api-proxy/pkg/usecase"
 )

 type AccountInfo struct {
@ -86,7 +87,7 @@ func ReadAccountInfo(name, configDir string) AccountInfo {
 }

 func HandleAccountsList() error {
-	accountsDir := agent.AccountsDir()
+	accountsDir := usecase.AccountsDir()
 	entries, err := os.ReadDir(accountsDir)
 	if err != nil {
@ -108,7 +109,7 @@ func HandleAccountsList() error {
 	fmt.Print("Cursor Accounts:\n\n")
-	keychainToken := agent.ReadKeychainToken()
+	keychainToken := usecase.ReadKeychainToken()
 	for i, name := range names {
 		configDir := filepath.Join(accountsDir, name)
@ -117,7 +118,7 @@ func HandleAccountsList() error {
 		fmt.Printf(" %d. %s\n", i+1, name)
 		if info.Authenticated {
-			cachedToken := agent.ReadCachedToken(configDir)
+			cachedToken := usecase.ReadCachedToken(configDir)
 			keychainMatchesAccount := keychainToken != "" && info.AuthID != "" && TokenSub(keychainToken) == info.AuthID
 			token := cachedToken
 			if token == "" && keychainMatchesAccount {
@ -178,7 +179,7 @@ func HandleLogout(accountName string) error {
 		os.Exit(1)
 	}
-	accountsDir := agent.AccountsDir()
+	accountsDir := usecase.AccountsDir()
 	configDir := filepath.Join(accountsDir, accountName)
 	if _, err := os.Stat(configDir); os.IsNotExist(err) {


@ -2,8 +2,6 @@ package cmd
 import (
 	"bufio"
-	"cursor-api-proxy/internal/agent"
-	"cursor-api-proxy/internal/env"
 	"fmt"
 	"os"
 	"os/exec"
@ -12,6 +10,9 @@ import (
 	"regexp"
 	"syscall"
 	"time"
+
+	"cursor-api-proxy/pkg/infrastructure/env"
+	"cursor-api-proxy/pkg/usecase"
 )

 var loginURLRe = regexp.MustCompile(`https://cursor\.com/loginDeepControl.*?redirectTarget=cli`)
@ -25,7 +26,7 @@ func HandleLogin(accountName string, proxies []string) error {
 		accountName = fmt.Sprintf("account-%d", time.Now().UnixMilli()%10000)
 	}
-	accountsDir := agent.AccountsDir()
+	accountsDir := usecase.AccountsDir()
 	configDir := filepath.Join(accountsDir, accountName)
 	dirWasNew := !fileExists(configDir)
@ -110,9 +111,9 @@ func HandleLogin(accountName string, proxies []string) error {
 	}
 	// Cache keychain token for this account
-	token := agent.ReadKeychainToken()
+	token := usecase.ReadKeychainToken()
 	if token != "" {
-		agent.WriteCachedToken(configDir, token)
+		usecase.WriteCachedToken(configDir, token)
 	}
 	fmt.Printf("\nAccount '%s' saved — it will be auto-discovered when you start the proxy.\n", accountName)


@ -2,8 +2,8 @@ package main
 import (
 	"cursor-api-proxy/internal/config"
-	"cursor-api-proxy/internal/env"
-	"cursor-api-proxy/internal/providers/geminiweb"
+	"cursor-api-proxy/pkg/infrastructure/env"
+	"cursor-api-proxy/pkg/provider/geminiweb"
 	"fmt"
 	"os"
 	"strings"

docs/MIGRATION_PLAN.md Normal file (+1,069 lines)

File diff suppressed because it is too large

docs/REFACTOR_TASKS.md Normal file

@ -0,0 +1,462 @@
# REFACTOR TASKS
重構任務拆分,支援 git worktree 並行開發。
---
## Task Overview
### 並行策略
```
時間軸 ──────────────────────────────────────────────────────────────►
Task 0: Init (必須先完成)
├── Task 1: Domain Layer ─────────────────────────┐
│ │
│ ┌── Task 2: Infrastructure Layer ────────────┤── 並行
│ │ │
│ └── Task 3: Repository Layer ────────────────┘
│ (依賴 Task 1)
├── Task 4: Provider Layer ──────────────────────┐
│ (依賴 Task 1) │
│ │── 可並行
├── Task 5: Usecase Layer ───────────────────────┤
│ (依賴 Task 3) │
│ │
├── Task 6: Adapter Layer ───────────────────────┘
│ (依賴 Task 1)
├── Task 7: Internal Layer ──────────────────────┐
│ (整合所有,必須最後) │
│ │── 序列
├── Task 8: CLI Tools │
│ │
└── Task 9: Cleanup & Tests ────────────────────┘
```
### Worktree 分支規劃
| 分支名稱 | 基於 | 任務 | 可並行 |
|---------|------|------|--------|
| `refactor/init` | `master` | Task 0 | ❌ |
| `refactor/domain` | `refactor/init` | Task 1 | ✅ |
| `refactor/infrastructure` | `refactor/init` | Task 2 | ✅ |
| `refactor/repository` | `refactor/domain` | Task 3 | ✅ |
| `refactor/provider` | `refactor/domain` | Task 4 | ✅ |
| `refactor/usecase` | `refactor/repository` | Task 5 | ✅ |
| `refactor/adapter` | `refactor/domain` | Task 6 | ✅ |
| `refactor/internal` | 合併所有 | Task 7 | ❌ |
| `refactor/cli` | `refactor/init` | Task 8 | ✅ |
| `refactor/cleanup` | 合併所有 | Task 9 | ❌ |
---

## Task 0: Initialization

### Branch
`refactor/init`

### Dependencies
None (must complete first)

### Subtasks
- [ ] **0.1** Update go.mod (5min)
  - `go get github.com/zeromicro/go-zero@latest`
  - `go mod tidy`
- [ ] **0.2** Create directories (1min)
  - `mkdir -p api etc`
- [ ] **0.3** Create `api/chat.api` (15min)
  - Define API types
  - Define routes
- [ ] **0.4** Create `etc/chat.yaml` (5min)
  - Configuration parameters
- [ ] **0.5** Update the Makefile (10min)
  - Add goctl commands
- [ ] **0.6** Commit (2min)

**Estimated time**: ~30min

---

## Task 1: Domain Layer

### Branch
`refactor/domain`

### Dependencies
Task 0 complete

### Subtasks
- [ ] **1.1** Create the directory structure (1min)
  - `pkg/domain/entity`
  - `pkg/domain/repository`
  - `pkg/domain/usecase`
  - `pkg/domain/const`
- [ ] **1.2** `entity/message.go` (10min)
  - Message, Tool, ToolFunction, ToolCall
- [ ] **1.3** `entity/chunk.go` (5min)
  - StreamChunk, ChunkType
- [ ] **1.4** `entity/account.go` (5min)
  - Account, AccountStat
- [ ] **1.5** `repository/account.go` (10min)
  - AccountPool interface
- [ ] **1.6** `repository/provider.go` (5min)
  - Provider interface
- [ ] **1.7** `usecase/chat.go` (15min)
  - ChatUsecase interface
- [ ] **1.8** `usecase/agent.go` (5min)
  - AgentRunner interface
- [ ] **1.9** `const/models.go` (10min)
  - Model constants
- [ ] **1.10** `const/errors.go` (5min)
  - Error definitions
- [ ] **1.11** Commit (2min)

**Estimated time**: ~2h
---

## Task 2: Infrastructure Layer

### Branch
`refactor/infrastructure`

### Dependencies
Task 0 complete (can run in parallel with Task 1)

### Subtasks
- [ ] **2.1** Create directories (2min)
  - `pkg/infrastructure/{process,parser,httputil,logger,env,workspace,winlimit}`
- [ ] **2.2** Migrate process (10min)
  - runner.go, kill_unix.go, kill_windows.go, process_test.go
- [ ] **2.3** Migrate parser (5min)
  - stream.go, stream_test.go
- [ ] **2.4** Migrate httputil (5min)
  - httputil.go, httputil_test.go
- [ ] **2.5** Migrate logger (5min)
  - logger.go
- [ ] **2.6** Migrate env (5min)
  - env.go, env_test.go
- [ ] **2.7** Migrate workspace (5min)
  - workspace.go
- [ ] **2.8** Migrate winlimit (5min)
  - winlimit.go, winlimit_test.go
- [ ] **2.9** Verify the build (5min)
- [ ] **2.10** Commit (2min)

**Estimated time**: ~1h

---

## Task 3: Repository Layer

### Branch
`refactor/repository`

### Dependencies
Task 1 complete

### Subtasks
- [ ] **3.1** Create directories (1min)
- [ ] **3.2** Migrate account.go (20min)
  - AccountPool implementation
  - Remove global variables
- [ ] **3.3** Migrate provider.go (10min)
  - Provider factory
- [ ] **3.4** Migrate tests (5min)
- [ ] **3.5** Verify the build (5min)
- [ ] **3.6** Commit (2min)

**Estimated time**: ~1h

---

## Task 4: Provider Layer

### Branch
`refactor/provider`

### Dependencies
Task 1 complete

### Subtasks
- [ ] **4.1** Create directories (1min)
  - `pkg/provider/cursor`
  - `pkg/provider/geminiweb`
- [ ] **4.2** Migrate the cursor provider (5min)
- [ ] **4.3** Migrate the geminiweb provider (10min)
- [ ] **4.4** Update imports (5min)
- [ ] **4.5** Verify the build (5min)
- [ ] **4.6** Commit (2min)

**Estimated time**: ~30min

---

## Task 5: Usecase Layer

### Branch
`refactor/usecase`

### Dependencies
Task 3 complete

### Subtasks
- [ ] **5.1** Create directories (1min)
- [ ] **5.2** Create chat.go (30min)
  - Core chat logic
- [ ] **5.3** Migrate agent.go (20min)
  - runner, token, cmdargs, maxmode
- [ ] **5.4** Migrate sanitizer (10min)
- [ ] **5.5** Migrate toolcall (10min)
- [ ] **5.6** Verify the build (5min)
- [ ] **5.7** Commit (2min)

**Estimated time**: ~2h
---

## Task 6: Adapter Layer

### Branch
`refactor/adapter`

### Dependencies
Task 1 complete

### Subtasks
- [ ] **6.1** Create directories (1min)
- [ ] **6.2** Migrate the openai adapter (10min)
- [ ] **6.3** Migrate the anthropic adapter (10min)
- [ ] **6.4** Update imports (5min)
- [ ] **6.5** Verify the build (5min)
- [ ] **6.6** Commit (2min)

**Estimated time**: ~30min

---

## Task 7: Internal Layer

### Branch
`refactor/internal`

### Dependencies
Tasks 1-6 all complete

### Subtasks
- [ ] **7.1** Merge all branches (5min)
- [ ] **7.2** Update config/config.go (15min)
  - Use rest.RestConf
- [ ] **7.3** Create svc/servicecontext.go (30min)
  - DI container
- [ ] **7.4** Create logic/ (1h)
  - chatcompletionlogic.go
  - geminichatlogic.go
  - anthropiclogic.go
  - healthlogic.go
  - modelslogic.go
- [ ] **7.5** Create handler/ (1h)
  - Custom SSE handler
- [ ] **7.6** Create middleware/ (20min)
  - auth.go
  - recovery.go
- [ ] **7.7** Create types/ (5min)
  - Generated by goctl
- [ ] **7.8** Update imports (30min)
  - Bulk update
- [ ] **7.9** Verify the build (10min)
- [ ] **7.10** Commit (2min)

**Estimated time**: ~4h

---

## Task 8: CLI Tools

### Branch
`refactor/cli`

### Dependencies
Task 0 complete

### Subtasks
- [ ] **8.1** Create directories (1min)
- [ ] **8.2** Migrate the CLI tools (10min)
- [ ] **8.3** Migrate gemini-login (5min)
- [ ] **8.4** Update imports (5min)
- [ ] **8.5** Commit (2min)

**Estimated time**: ~30min

---

## Task 9: Cleanup & Tests

### Branch
`refactor/cleanup`

### Dependencies
Task 7 complete

### Subtasks
- [ ] **9.1** Remove old directories (5min)
- [ ] **9.2** Update imports (30min)
  - Bulk sed
- [ ] **9.3** Create cmd/chat/chat.go (10min)
- [ ] **9.4** SSE integration tests (2h)
- [ ] **9.5** Regression tests (1h)
- [ ] **9.6** Update the README (15min)
- [ ] **9.7** Commit (2min)

**Estimated time**: ~4h
---

## Parallel execution plan

### Wave 1
```
Terminal 1: Task 0 (init) → 30min
Terminal 2: (wait for Task 0)
```

### Wave 2 (fully parallel)
```
Terminal 1: Task 1 (domain) → 2h
Terminal 2: Task 2 (infrastructure) → 1h
Terminal 3: Task 8 (cli) → 30min
```

### Wave 3 (partially parallel)
```
Terminal 1: Task 3 (repository) → 1h (depends on Task 1)
Terminal 2: Task 4 (provider) → 30min (depends on Task 1)
Terminal 3: Task 6 (adapter) → 30min (depends on Task 1)
Terminal 4: (wait for Task 3)
```

### Wave 4 (partially parallel)
```
Terminal 1: Task 5 (usecase) → 2h (depends on Task 3)
Terminal 2: (wait for Task 5)
```

### Wave 5 (sequential)
```
Task 7 (internal) → 4h
Task 9 (cleanup) → 4h
```

**Total time estimate**:
- Fully sequential: ~15h
- Parallel: ~9h
- Savings: ~40%
---

## Git worktree commands

```bash
# Create the worktrees
git worktree add ../worktrees/init -b refactor/init
git worktree add ../worktrees/domain -b refactor/domain
git worktree add ../worktrees/infrastructure -b refactor/infrastructure
git worktree add ../worktrees/repository -b refactor/repository
git worktree add ../worktrees/provider -b refactor/provider
git worktree add ../worktrees/usecase -b refactor/usecase
git worktree add ../worktrees/adapter -b refactor/adapter
git worktree add ../worktrees/cli -b refactor/cli

# Work in parallel
cd ../worktrees/domain          # Terminal 1
cd ../worktrees/infrastructure  # Terminal 2
cd ../worktrees/cli             # Terminal 3

# Clean up the worktrees
git worktree remove ../worktrees/init
git worktree remove ../worktrees/domain
# ... and so on
```

---

**Document version**: v1.0
**Created**: 2026-04-03

docs/TODOS.md Normal file

@ -0,0 +1,312 @@
# TODOS

To-do items for refactoring cursor-api-proxy to a go-zero + DDD architecture.

---

## Phase 1: API definition and scaffold generation

### DONE
- [x] Create the `api/chat.api` definition file
- [x] Create the `etc/chat.yaml` configuration file
- [x] Generate the code scaffold
- [x] Move `chat.go` to `cmd/chat/`

### TODO

#### TODO-1: Global-variable migration checklist
- **What**: Build a checklist of global variables to migrate into ServiceContext
- **Why**: The existing code has several globals, and it is easy to miss one during migration
- **Files**:
  - `internal/pool/pool.go:36-38`: `globalPool`, `globalMu` → ServiceContext
  - `internal/process/process.go:117`: `MaxModeFn` → ServiceContext
  - `internal/handlers/chat.go:28-29`: `rateLimitRe`, `retryAfterRe` → ServiceContext or constants
  - `internal/models/cursormap.go:8,47,51` → turn the regular expressions into constants
- **Decision**: inject via ServiceContext
- **Effort**: human ~2h / CC ~30min
- **Depends on**: Phase 2 (domain layer)
- **Status**: pending
#### TODO-2: go.mod update
- **What**: Add the go-zero dependency to go.mod
- **Why**: The current go.mod has no go-zero dependency
- **Command**: `go get github.com/zeromicro/go-zero@latest`
- **Decision**: use the latest stable release
- **Effort**: human ~5min / CC ~1min
- **Depends on**: before Phase 1 starts
- **Status**: pending

#### TODO-3: Makefile update
- **What**: Update the Makefile to support the go-zero build flow
- **Why**: Need to add goctl commands and integrate them with the existing env/run targets
- **Commands to add**:
```makefile
.PHONY: api
api:
	goctl api go -api api/chat.api -dir . --style go_zero

.PHONY: api-doc
api-doc:
	goctl api doc -api api/chat.api -dir docs/

.PHONY: gen
gen: api
	go mod tidy
```
- **Decision**: needs tracking
- **Effort**: human ~30min / CC ~10min
- **Depends on**: Phase 1 (API definition and scaffold generation)
- **Status**: pending

---

## Phase 2: Domain layer

### DONE
- [ ] Create `pkg/domain/entity/`
- [ ] Create `pkg/domain/repository/`
- [ ] Create `pkg/domain/usecase/`
- [ ] Create `pkg/domain/const/`

### TODO

#### TODO-4: Import-cycle detection
- **What**: Run `go build ./...` after each phase to detect import cycles
- **Why**: A layered DDD architecture makes circular imports easy to introduce
- **Potential cycles**:
  - `pkg/usecase` ↔ `pkg/domain/usecase`
  - `pkg/repository` ↔ `pkg/domain/repository`
- **Command**: `go build ./... && go test ./... -run=none`
- **Depends on**: after each phase completes
- **Status**: pending
---
## Phase 8: Internal layer reorganization
### DONE
- [ ] Update `internal/config/config.go`
- [ ] Create `internal/svc/servicecontext.go`
- [ ] Create `internal/logic/`
- [ ] Create `internal/handler/`
- [ ] Create `internal/middleware/`
### TODO
#### TODO-5: SSE integration tests
- **What**: Add end-to-end tests for SSE streaming
- **Why**: SSE is core functionality; the custom handler is error-prone and currently has no test coverage
- **Test cases**:
  1. SSE streaming request returns correctly
  2. SSE client disconnect is handled correctly
  3. SSE timeout is handled correctly
  4. Non-streaming request is converted to SSE format
- **Implementation**:
```go
// tests/integration/sse_test.go
func TestSSEStreaming(t *testing.T) {
	// Simulate an SSE client with httptest
	// Verify that data: [DONE] is returned correctly
}
```
- **Decision**: route via `rest.WithCustom`
- **Effort**: human ~2h / CC ~30min
- **Depends on**: Phase 8 complete (Internal layer reorganization)
- **Status**: pending
#### TODO-6: SSE handler implementation
- **What**: Implement the SSE streaming handler with `rest.WithCustom`
- **Why**: go-zero's standard handlers do not support SSE, so a custom handler is needed
- **Implementation**:
```go
// internal/handler/chat_handler.go
func NewChatHandler(svcCtx *svc.ServiceContext) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		// SSE headers
		w.Header().Set("Content-Type", "text/event-stream")
		w.Header().Set("Cache-Control", "no-cache")
		w.Header().Set("Connection", "keep-alive")
		// Delegate to the usecase
		svcCtx.ChatUsecase.Stream(r.Context(), input, callback)
	}
}
```
- **Decision**: route via `rest.WithCustom`
- **Effort**: human ~2h / CC ~30min
- **Depends on**: Phase 6 (Usecase layer)
- **Status**: pending
---
## Phase 10: Cleanup and testing
### DONE
- [ ] Remove old directories
- [ ] Update import paths
- [ ] Run the tests
### TODO
#### TODO-7: Test file migration
- **What**: Move test files into pkg/ alongside the source they cover
- **Why**: Tests should live in the same directory as the code under test
- **Files to migrate**:
  - `internal/httputil/httputil_test.go` → `pkg/infrastructure/httputil/`
  - `internal/config/config_test.go` → `internal/config/` (keep in place)
  - `internal/sanitize/sanitize_test.go` → `pkg/usecase/`
  - `internal/models/cursormap_test.go` → `pkg/domain/const/`
  - `internal/models/cursorcli_test.go` → `pkg/domain/const/`
  - `internal/parser/stream_test.go` → `pkg/infrastructure/parser/`
  - `internal/env/env_test.go` → `pkg/infrastructure/env/`
  - `internal/winlimit/winlimit_test.go` → `pkg/infrastructure/winlimit/`
  - `internal/anthropic/anthropic_test.go` → `pkg/adapter/anthropic/`
  - `internal/pool/pool_test.go` → `pkg/repository/`
  - `internal/openai/openai_test.go` → `pkg/adapter/openai/`
  - `internal/process/process_test.go` → `pkg/infrastructure/process/`
- **Decision**: move tests into pkg/
- **Effort**: human ~1h / CC ~10min
- **Depends on**: Phases 3-7 complete
- **Status**: pending
#### TODO-8: ServiceContext singleton Pool
- **What**: Make AccountPool a singleton, initialized exactly once via `sync.Once`
- **Why**: Avoid the overhead of creating a new Pool on every request
- **Implementation**:
```go
// pkg/repository/account.go
var (
	globalPool     *AccountPool
	globalPoolOnce sync.Once
)

func GetAccountPool(configDirs []string) *AccountPool {
	globalPoolOnce.Do(func() {
		globalPool = NewAccountPool(configDirs)
	})
	return globalPool
}
```
- **Decision**: use a singleton Pool
- **Effort**: human ~30min / CC ~10min
- **Depends on**: Phase 4 (Repository layer)
- **Status**: pending
---
## Standalone TODOs
### TODO-9: Automated regression tests
- **What**: Create an automated regression-test script
- **Why**: Verify that everything still works after each migration step
- **Script**:
```bash
#!/bin/bash
# scripts/regression-test.sh
set -e
echo "=== Health check ==="
curl -s http://localhost:8080/health | jq .
echo "=== Models list ==="
curl -s http://localhost:8080/v1/models | jq .
echo "=== Chat completion (non-streaming) ==="
curl -s -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"test","messages":[{"role":"user","content":"hi"}],"stream":false}' | jq .
echo "=== Chat completion (streaming) ==="
curl -s -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"test","messages":[{"role":"user","content":"hi"}],"stream":true}'
```
- **Depends on**: Phase 10 complete
- **Status**: pending
---
## Summary
| TODO | Phase | Effort | Status |
|------|-------|--------|--------|
| TODO-1: Global variable migration checklist | Phase 2 | 2h | pending |
| TODO-2: go.mod update | Phase 1 | 5min | pending |
| TODO-3: Makefile update | Phase 1 | 30min | pending |
| TODO-4: Import cycle detection | Each phase | 5min | pending |
| TODO-5: SSE integration tests | Phase 8 | 2h | pending |
| TODO-6: SSE handler implementation | Phase 8 | 2h | pending |
| TODO-7: Test file migration | Phase 10 | 1h | pending |
| TODO-8: ServiceContext singleton Pool | Phase 4 | 30min | pending |
| TODO-9: Automated regression tests | Phase 10 | 30min | pending |
---
## Dependencies Graph
```
Phase 1 (API definition)
├── TODO-2: go.mod update (must be completed before starting)
├── TODO-3: Makefile update
Phase 2 (Domain layer)
├── TODO-1: Global variable migration checklist
├── TODO-4: Import cycle detection
Phase 3 (Infrastructure layer)
├── TODO-4: Import cycle detection
Phase 4 (Repository layer)
├── TODO-8: ServiceContext singleton Pool
├── TODO-4: Import cycle detection
Phase 5 (Provider layer)
├── TODO-4: Import cycle detection
Phase 6 (Usecase layer)
├── TODO-4: Import cycle detection
Phase 7 (Adapter layer)
├── TODO-4: Import cycle detection
Phase 8 (Internal layer)
├── TODO-5: SSE integration tests
├── TODO-6: SSE handler implementation
├── TODO-4: Import cycle detection
Phase 9 (CLI tooling)
├── TODO-4: Import cycle detection
Phase 10 (Cleanup and testing)
├── TODO-7: Test file migration
├── TODO-9: Automated regression tests
├── TODO-4: Import cycle detection
Done
```
---
**Document version**: v1.0
**Created**: 2026-04-03
**Last updated**: 2026-04-03

docs/refactor.md Normal file

@@ -0,0 +1,29 @@
```text
.
├── build/               # [Infrastructure] Dockerfile and build scripts
├── docker-compose.yml   # [Infrastructure] Local dev environment orchestration (Mongo, Redis, etc.)
├── etc/                 # [Configuration] Per-environment yaml config templates
├── generate/            # [Contract] Interface-first definition area
│   ├── api/             # .api source definitions; the communication contract between services
│   └── database/        # DB creation files, if needed
├── internal/            # [Framework Layer] Implementations that depend on go-zero
│   ├── config/          # Framework-level Config mapping
│   ├── logic/           # Adapter: bridges framework Requests to pkg/usecase
│   ├── server/          # Transport: gRPC/HTTP server implementation (protocol handling only)
│   └── svc/             # DI Center: dependency injection hub managing global resources (DB, Client)
├── pkg/                 # [Core Domain] Core business logic (no go-zero dependency)
│   ├── domain/          # <Domain Layer>
│   │   ├── entity/      # Pure business objects (POJO/POCO), no database tags
│   │   ├── repository/  # Repository interfaces: data-access contracts (DIP)
│   │   ├── const/       # Constants
│   │   └── usecase/     # Usecase interfaces: API contracts for business features
│   ├── repository/      # <Infrastructure Layer>
│   │   ├── *_test.go    # Integration tests with Testcontainers (real DB)
│   │   └── *.go         # Implements domain/repository interfaces (MongoDB/Redis)
│   └── usecase/         # <Application Layer>
│       ├── *_test.go    # Unit tests for core business logic (with mocks)
│       └── *.go         # Implements business flows, coordinating Repository and Utils
├── Makefile             # [Automation] Wraps protoc, test, build commands
├── go.mod               # [Dependency]
└── main.go              # [Entry] Service entry point
```

etc/chat-api.yaml Normal file

@@ -0,0 +1,42 @@
Name: api-proxy
Host: 0.0.0.0
Port: 8766
Timeout: 3600000
# Cursor Agent configuration
AgentBin: /Users/daniel/.local/bin/agent
DefaultModel: claude-4.5-sonnet
Provider: cursor
TimeoutMs: 3600000
# Multi-account pool configuration
ConfigDirs:
- ~/.cursor-api-proxy/accounts/default
MultiPort: false
# TLS certificates (optional)
TLSCertPath: ""
TLSKeyPath: ""
# Logging
SessionsLogPath: ""
# Verbose uses the RestConf default value
# Gemini Web Provider configuration
GeminiAccountDir: ~/.cursor-api-proxy/gemini-accounts
GeminiBrowserVisible: false
GeminiMaxSessions: 10
# Workspace configuration
Workspace: ""
ChatOnlyWorkspace: true
WinCmdlineMax: 32768
# Agent behavior
Force: false
ApproveMcps: false
MaxMode: false
StrictModel: true
# API key (optional; leave empty to skip validation)
RequiredKey: ""

etc/chat.yaml Normal file

@@ -0,0 +1,45 @@
Name: chat-api
Host: ${CURSOR_BRIDGE_HOST:0.0.0.0}
Port: ${CURSOR_BRIDGE_PORT:8080}
# API key validation (optional)
# Auth:
# AccessSecret: ${CURSOR_API_KEY:}
# AccessExpire: 86400
# Cursor configuration
AgentBin: ${CURSOR_AGENT_BIN:cursor-agent}
DefaultModel: ${CURSOR_DEFAULT_MODEL:claude-3.5-sonnet}
Provider: ${CURSOR_PROVIDER:cursor}
# Timeout settings
TimeoutMs: ${CURSOR_TIMEOUT_MS:300000}
# Multi-account pool
ConfigDirs:
- ${HOME}/.cursor-api-proxy/accounts/default
MultiPort: false
# TLS
TLSCertPath: ${CURSOR_TLS_CERT_PATH:}
TLSKeyPath: ${CURSOR_TLS_KEY_PATH:}
# Logging
SessionsLogPath: ${CURSOR_SESSIONS_LOG_PATH:}
Verbose: ${CURSOR_VERBOSE:false}
# Gemini settings
GeminiAccountDir: ${GEMINI_ACCOUNT_DIR:}
GeminiBrowserVisible: ${GEMINI_BROWSER_VISIBLE:false}
GeminiMaxSessions: ${GEMINI_MAX_SESSIONS:10}
# Workspace settings
Workspace: ${CURSOR_WORKSPACE:}
ChatOnlyWorkspace: ${CURSOR_CHAT_ONLY_WORKSPACE:true}
WinCmdlineMax: ${CURSOR_WIN_CMDLINE_MAX:32768}
# Agent settings
Force: ${CURSOR_FORCE:false}
ApproveMcps: ${CURSOR_APPROVE_MCPS:false}
MaxMode: ${CURSOR_MAX_MODE:false}
StrictModel: ${CURSOR_STRICT_MODEL:true}

go.mod

@@ -3,26 +3,68 @@ module cursor-api-proxy
go 1.25.0

require (
	github.com/go-rod/rod v0.116.2
	github.com/google/uuid v1.6.0
	github.com/playwright-community/playwright-go v0.5700.1
	github.com/zeromicro/go-zero v1.10.1
	modernc.org/sqlite v1.48.0
)

require (
	github.com/beorn7/perks v1.0.1 // indirect
	github.com/cenkalti/backoff/v5 v5.0.3 // indirect
	github.com/cespare/xxhash/v2 v2.3.0 // indirect
	github.com/deckarep/golang-set/v2 v2.8.0 // indirect
	github.com/dustin/go-humanize v1.0.1 // indirect
	github.com/fatih/color v1.18.0 // indirect
	github.com/go-jose/go-jose/v3 v3.0.5 // indirect
	github.com/go-logr/logr v1.4.3 // indirect
	github.com/go-logr/stdr v1.2.2 // indirect
	github.com/go-stack/stack v1.8.1 // indirect
	github.com/golang-jwt/jwt/v4 v4.5.2 // indirect
	github.com/grafana/pyroscope-go v1.2.8 // indirect
	github.com/grafana/pyroscope-go/godeltaprof v0.1.9 // indirect
	github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.7 // indirect
	github.com/klauspost/compress v1.18.0 // indirect
	github.com/mattn/go-colorable v0.1.13 // indirect
	github.com/mattn/go-isatty v0.0.20 // indirect
	github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 // indirect
	github.com/ncruces/go-strftime v1.0.0 // indirect
	github.com/openzipkin/zipkin-go v0.4.3 // indirect
	github.com/pelletier/go-toml/v2 v2.3.0 // indirect
	github.com/prometheus/client_golang v1.23.2 // indirect
	github.com/prometheus/client_model v0.6.2 // indirect
	github.com/prometheus/common v0.66.1 // indirect
	github.com/prometheus/procfs v0.16.1 // indirect
	github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec // indirect
	github.com/spaolacci/murmur3 v1.1.0 // indirect
	github.com/titanous/json5 v1.0.0 // indirect
	github.com/ysmood/fetchup v0.2.3 // indirect
	github.com/ysmood/goob v0.4.0 // indirect
	github.com/ysmood/got v0.40.0 // indirect
	github.com/ysmood/gson v0.7.3 // indirect
	github.com/ysmood/leakless v0.9.0 // indirect
	go.opentelemetry.io/auto/sdk v1.2.1 // indirect
	go.opentelemetry.io/otel v1.40.0 // indirect
	go.opentelemetry.io/otel/exporters/otlp/otlptrace v1.40.0 // indirect
	go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc v1.40.0 // indirect
	go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.40.0 // indirect
	go.opentelemetry.io/otel/exporters/stdout/stdouttrace v1.40.0 // indirect
	go.opentelemetry.io/otel/exporters/zipkin v1.40.0 // indirect
	go.opentelemetry.io/otel/metric v1.40.0 // indirect
	go.opentelemetry.io/otel/sdk v1.40.0 // indirect
	go.opentelemetry.io/otel/trace v1.40.0 // indirect
	go.opentelemetry.io/proto/otlp v1.9.0 // indirect
	go.uber.org/automaxprocs v1.6.0 // indirect
	go.yaml.in/yaml/v2 v2.4.2 // indirect
	golang.org/x/net v0.50.0 // indirect
	golang.org/x/sys v0.42.0 // indirect
	golang.org/x/text v0.34.0 // indirect
	google.golang.org/genproto/googleapis/api v0.0.0-20260128011058-8636f8732409 // indirect
	google.golang.org/genproto/googleapis/rpc v0.0.0-20260128011058-8636f8732409 // indirect
	google.golang.org/grpc v1.79.3 // indirect
	google.golang.org/protobuf v1.36.11 // indirect
	gopkg.in/yaml.v2 v2.4.0 // indirect
	modernc.org/libc v1.70.0 // indirect
	modernc.org/mathutil v1.7.1 // indirect
	modernc.org/memory v1.11.0 // indirect

go.sum

@@ -1,44 +1,148 @@
github.com/beorn7/perks v1.0.1 h1:VlbKKnNfV8bJzeqoa4cOKqO6bYr3WgKZxO8Z16+hsOM=
github.com/beorn7/perks v1.0.1/go.mod h1:G2ZrVWU2WbWT9wwq4/hrbKbnv/1ERSJQ0ibhJ6rlkpw=
github.com/cenkalti/backoff/v5 v5.0.3 h1:ZN+IMa753KfX5hd8vVaMixjnqRZ3y8CuJKRKj1xcsSM=
github.com/cenkalti/backoff/v5 v5.0.3/go.mod h1:rkhZdG3JZukswDf7f0cwqPNk4K0sa+F97BxZthm/crw=
github.com/cespare/xxhash/v2 v2.3.0 h1:UL815xU9SqsFlibzuggzjXhog7bL6oX9BbNZnL2UFvs=
github.com/cespare/xxhash/v2 v2.3.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/deckarep/golang-set/v2 v2.8.0 h1:swm0rlPCmdWn9mESxKOjWk8hXSqoxOp+ZlfuyaAdFlQ=
github.com/deckarep/golang-set/v2 v2.8.0/go.mod h1:VAky9rY/yGXJOLEDv3OMci+7wtDpOF4IN+y82NBOac4=
github.com/dustin/go-humanize v1.0.1 h1:GzkhY7T5VNhEkwH0PVJgjz+fX1rhBrR7pRT3mDkpeCY=
github.com/dustin/go-humanize v1.0.1/go.mod h1:Mu1zIs6XwVuF/gI1OepvI0qD18qycQx+mFykh5fBlto=
github.com/fatih/color v1.18.0 h1:S8gINlzdQ840/4pfAwic/ZE0djQEH3wM94VfqLTZcOM=
github.com/fatih/color v1.18.0/go.mod h1:4FelSpRwEGDpQ12mAdzqdOukCy4u8WUtOY6lkT/6HfU=
github.com/go-jose/go-jose/v3 v3.0.5 h1:BLLJWbC4nMZOfuPVxoZIxeYsn6Nl2r1fITaJ78UQlVQ=
github.com/go-jose/go-jose/v3 v3.0.5/go.mod h1:5b+7YgP7ZICgJDBdfjZaIt+H/9L9T/YQrVfLAMboGkQ=
github.com/go-logr/logr v1.2.2/go.mod h1:jdQByPbusPIv2/zmleS9BjJVeZ6kBagPoEUsqbVz/1A=
github.com/go-logr/logr v1.4.3 h1:CjnDlHq8ikf6E492q6eKboGOC0T8CDaOvkHCIg8idEI=
github.com/go-logr/logr v1.4.3/go.mod h1:9T104GzyrTigFIr8wt5mBrctHMim0Nb2HLGrmQ40KvY=
github.com/go-logr/stdr v1.2.2 h1:hSWxHoqTgW2S2qGc0LTAI563KZ5YKYRhT3MFKZMbjag=
github.com/go-logr/stdr v1.2.2/go.mod h1:mMo/vtBO5dYbehREoey6XUKy/eSumjCCveDpRre4VKE=
github.com/go-rod/rod v0.116.2 h1:A5t2Ky2A+5eD/ZJQr1EfsQSe5rms5Xof/qj296e+ZqA=
github.com/go-rod/rod v0.116.2/go.mod h1:H+CMO9SCNc2TJ2WfrG+pKhITz57uGNYU43qYHh438Mg=
github.com/go-stack/stack v1.8.1 h1:ntEHSVwIt7PNXNpgPmVfMrNhLtgjlmnZha2kOpuRiDw=
github.com/go-stack/stack v1.8.1/go.mod h1:dcoOX6HbPZSZptuspn9bctJ+N/CnF5gGygcUP3XYfe4=
github.com/golang-jwt/jwt/v4 v4.5.2 h1:YtQM7lnr8iZ+j5q71MGKkNw9Mn7AjHM68uc9g5fXeUI=
github.com/golang-jwt/jwt/v4 v4.5.2/go.mod h1:m21LjoU+eqJr34lmDMbreY2eSTRJ1cv77w39/MY0Ch0=
github.com/golang/protobuf v1.5.4 h1:i7eJL8qZTpSEXOPTxNKhASYpMn+8e5Q6AdndVa1dWek=
github.com/golang/protobuf v1.5.4/go.mod h1:lnTiLA8Wa4RWRcIUkrtSVa5nRhsEGBg48fD6rSs7xps=
github.com/google/go-cmp v0.5.9/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/pprof v0.0.0-20250317173921-a4b03ec1a45e h1:ijClszYn+mADRFY17kjQEVQ1XRhq2/JR1M3sGqeJoxs=
github.com/google/pprof v0.0.0-20250317173921-a4b03ec1a45e/go.mod h1:boTsfXsheKC2y+lKOCMpSfarhxDeIzfZG1jqGcPl3cA=
github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/grafana/pyroscope-go v1.2.8 h1:UvCwIhlx9DeV7F6TW/z8q1Mi4PIm3vuUJ2ZlCEvmA4M=
github.com/grafana/pyroscope-go v1.2.8/go.mod h1:SSi59eQ1/zmKoY/BKwa5rSFsJaq+242Bcrr4wPix1g8=
github.com/grafana/pyroscope-go/godeltaprof v0.1.9 h1:c1Us8i6eSmkW+Ez05d3co8kasnuOY813tbMN8i/a3Og=
github.com/grafana/pyroscope-go/godeltaprof v0.1.9/go.mod h1:2+l7K7twW49Ct4wFluZD3tZ6e0SjanjcUUBPVD/UuGU=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.7 h1:X+2YciYSxvMQK0UZ7sg45ZVabVZBeBuvMkmuI2V3Fak=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.7/go.mod h1:lW34nIZuQ8UDPdkon5fmfp2l3+ZkQ2me/+oecHYLOII=
github.com/h2non/parth v0.0.0-20190131123155-b4df798d6542 h1:2VTzZjLZBgl62/EtslCrtky5vbi9dd7HrQPQIx6wqiw=
github.com/h2non/parth v0.0.0-20190131123155-b4df798d6542/go.mod h1:Ow0tF8D4Kplbc8s8sSb3V2oUCygFHVp8gC3Dn6U4MNI=
github.com/hashicorp/golang-lru/v2 v2.0.7 h1:a+bsQ5rvGLjzHuww6tVxozPZFVghXaHOwFs4luLUK2k=
github.com/hashicorp/golang-lru/v2 v2.0.7/go.mod h1:QeFd9opnmA6QUJc5vARoKUSoFhyfM2/ZepoAG6RGpeM=
github.com/klauspost/compress v1.18.0 h1:c/Cqfb0r+Yi+JtIEq73FWXVkRonBlf0CRNYc8Zttxdo=
github.com/klauspost/compress v1.18.0/go.mod h1:2Pp+KzxcywXVXMr50+X0Q/Lsb43OQHYWRCY2AiWywWQ=
github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/kylelemons/godebug v1.1.0 h1:RPNrshWIDI6G2gRW9EHilWtl7Z6Sb1BR0xunSBf0SNc=
github.com/kylelemons/godebug v1.1.0/go.mod h1:9/0rRGxNHcop5bhtWyNeEfOS8JIWk580+fNqagV/RAw=
github.com/mattn/go-colorable v0.1.13 h1:fFA4WZxdEF4tXPZVKMLwD8oUnCTTo08duU7wxecdEvA=
github.com/mattn/go-colorable v0.1.13/go.mod h1:7S9/ev0klgBDR4GtXTXX8a3vIGJpMovkB8vQcUbaXHg=
github.com/mattn/go-isatty v0.0.16/go.mod h1:kYGgaQfpe5nmfYZH+SKPsOc2e4SrIfOl2e/yFXSvRLM=
github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY=
github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 h1:C3w9PqII01/Oq1c1nUAm88MOHcQC9l5mIlSMApZMrHA=
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822/go.mod h1:+n7T8mK8HuQTcFwEeznm/DIxMOiR9yIdICNftLE1DvQ=
github.com/ncruces/go-strftime v1.0.0 h1:HMFp8mLCTPp341M/ZnA4qaf7ZlsbTc+miZjCLOFAw7w=
github.com/ncruces/go-strftime v1.0.0/go.mod h1:Fwc5htZGVVkseilnfgOVb9mKy6w1naJmn9CehxcKcls=
github.com/openzipkin/zipkin-go v0.4.3 h1:9EGwpqkgnwdEIJ+Od7QVSEIH+ocmm5nPat0G7sjsSdg=
github.com/openzipkin/zipkin-go v0.4.3/go.mod h1:M9wCJZFWCo2RiY+o1eBCEMe0Dp2S5LDHcMZmk3RmK7c=
github.com/pelletier/go-toml/v2 v2.3.0 h1:k59bC/lIZREW0/iVaQR8nDHxVq8OVlIzYCOJf421CaM=
github.com/pelletier/go-toml/v2 v2.3.0/go.mod h1:2gIqNv+qfxSVS7cM2xJQKtLSTLUE9V8t9Stt+h56mCY=
github.com/playwright-community/playwright-go v0.5700.1 h1:PNFb1byWqrTT720rEO0JL88C6Ju0EmUnR5deFLvtP/U=
github.com/playwright-community/playwright-go v0.5700.1/go.mod h1:MlSn1dZrx8rszbCxY6x3qK89ZesJUYVx21B2JnkoNF0=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/prashantv/gostub v1.1.0 h1:BTyx3RfQjRHnUWaGF9oQos79AlQ5k8WNktv7VGvVH4g=
github.com/prashantv/gostub v1.1.0/go.mod h1:A5zLQHz7ieHGG7is6LLXLz7I8+3LZzsrV0P1IAHhP5U=
github.com/prometheus/client_golang v1.23.2 h1:Je96obch5RDVy3FDMndoUsjAhG5Edi49h0RJWRi/o0o=
github.com/prometheus/client_golang v1.23.2/go.mod h1:Tb1a6LWHB3/SPIzCoaDXI4I8UHKeFTEQ1YCr+0Gyqmg=
github.com/prometheus/client_model v0.6.2 h1:oBsgwpGs7iVziMvrGhE53c/GrLUsZdHnqNwqPLxwZyk=
github.com/prometheus/client_model v0.6.2/go.mod h1:y3m2F6Gdpfy6Ut/GBsUqTWZqCUvMVzSfMLjcu6wAwpE=
github.com/prometheus/common v0.66.1 h1:h5E0h5/Y8niHc5DlaLlWLArTQI7tMrsfQjHV+d9ZoGs=
github.com/prometheus/common v0.66.1/go.mod h1:gcaUsgf3KfRSwHY4dIMXLPV0K/Wg1oZ8+SbZk/HH/dA=
github.com/prometheus/procfs v0.16.1 h1:hZ15bTNuirocR6u0JZ6BAHHmwS1p8B4P6MRqxtzMyRg=
github.com/prometheus/procfs v0.16.1/go.mod h1:teAbpZRB1iIAJYREa1LsoWUXykVXA1KlTmWl8x/U+Is=
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec h1:W09IVJc94icq4NjY3clb7Lk8O1qJ8BdBEF8z0ibU0rE=
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec/go.mod h1:qqbHyh8v60DhA7CoWK5oRCqLrMHRGoxYCSS9EjAz6Eo=
github.com/robertkrimen/otto v0.2.1 h1:FVP0PJ0AHIjC+N4pKCG9yCDz6LHNPCwi/GKID5pGGF0=
github.com/robertkrimen/otto v0.2.1/go.mod h1:UPwtJ1Xu7JrLcZjNWN8orJaM5n5YEtqL//farB5FlRY=
github.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=
github.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=
github.com/spaolacci/murmur3 v1.1.0 h1:7c1g84S4BPRrfL5Xrdp6fOJ206sU9y293DDHaoy0bLI=
github.com/spaolacci/murmur3 v1.1.0/go.mod h1:JwIasOWyU6f++ZhiEuf87xNszmSA2myDM2Kzu9HwQUA=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/objx v0.5.2 h1:xuMeJ0Sdp5ZMRXx/aWO6RZxdr3beISkG5/G/aIRr3pY=
github.com/stretchr/objx v0.5.2/go.mod h1:FRsXN1f5AsAjCGJKqEizvkpNtU+EGNCLh3NxZ/8L+MA=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu7U=
github.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=
github.com/titanous/json5 v1.0.0 h1:hJf8Su1d9NuI/ffpxgxQfxh/UiBFZX7bMPid0rIL/7s=
github.com/titanous/json5 v1.0.0/go.mod h1:7JH1M8/LHKc6cyP5o5g3CSaRj+mBrIimTxzpvmckH8c=
github.com/ysmood/fetchup v0.2.3 h1:ulX+SonA0Vma5zUFXtv52Kzip/xe7aj4vqT5AJwQ+ZQ=
github.com/ysmood/fetchup v0.2.3/go.mod h1:xhibcRKziSvol0H1/pj33dnKrYyI2ebIvz5cOOkYGns=
github.com/ysmood/goob v0.4.0 h1:HsxXhyLBeGzWXnqVKtmT9qM7EuVs/XOgkX7T6r1o1AQ=
github.com/ysmood/goob v0.4.0/go.mod h1:u6yx7ZhS4Exf2MwciFr6nIM8knHQIE22lFpWHnfql18=
github.com/ysmood/gop v0.2.0 h1:+tFrG0TWPxT6p9ZaZs+VY+opCvHU8/3Fk6BaNv6kqKg=
github.com/ysmood/gop v0.2.0/go.mod h1:rr5z2z27oGEbyB787hpEcx4ab8cCiPnKxn0SUHt6xzk=
github.com/ysmood/got v0.40.0 h1:ZQk1B55zIvS7zflRrkGfPDrPG3d7+JOza1ZkNxcc74Q=
github.com/ysmood/got v0.40.0/go.mod h1:W7DdpuX6skL3NszLmAsC5hT7JAhuLZhByVzHTq874Qg=
github.com/ysmood/gotrace v0.6.0 h1:SyI1d4jclswLhg7SWTL6os3L1WOKeNn/ZtzVQF8QmdY=
github.com/ysmood/gotrace v0.6.0/go.mod h1:TzhIG7nHDry5//eYZDYcTzuJLYQIkykJzCRIo4/dzQM=
github.com/ysmood/gson v0.7.3 h1:QFkWbTH8MxyUTKPkVWAENJhxqdBa4lYTQWqZCiLG6kE=
github.com/ysmood/gson v0.7.3/go.mod h1:3Kzs5zDl21g5F/BlLTNcuAGAYLKt2lV5G8D1zF3RNmg=
github.com/ysmood/leakless v0.9.0 h1:qxCG5VirSBvmi3uynXFkcnLMzkphdh3xx5FtrORwDCU=
github.com/ysmood/leakless v0.9.0/go.mod h1:R8iAXPRaG97QJwqxs74RdwzcRHT1SWCGTNqY8q0JvMQ=
github.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=
github.com/zeromicro/go-zero v1.10.1 h1:1nM3ilvYx97GUqyaNH2IQPtfNyK7tp5JvN63c7m6QKU=
github.com/zeromicro/go-zero v1.10.1/go.mod h1:z41DXmO6gx/Se7Ow5UIwPxcUmpVj3ebhoNCcZ1gfp5k=
go.opentelemetry.io/auto/sdk v1.2.1 h1:jXsnJ4Lmnqd11kwkBV2LgLoFMZKizbCi5fNZ/ipaZ64=
go.opentelemetry.io/auto/sdk v1.2.1/go.mod h1:KRTj+aOaElaLi+wW1kO/DZRXwkF4C5xPbEe3ZiIhN7Y=
go.opentelemetry.io/otel v1.40.0 h1:oA5YeOcpRTXq6NN7frwmwFR0Cn3RhTVZvXsP4duvCms=
go.opentelemetry.io/otel v1.40.0/go.mod h1:IMb+uXZUKkMXdPddhwAHm6UfOwJyh4ct1ybIlV14J0g=
go.opentelemetry.io/otel/exporters/otlp/otlptrace v1.40.0 h1:QKdN8ly8zEMrByybbQgv8cWBcdAarwmIPZ6FThrWXJs=
go.opentelemetry.io/otel/exporters/otlp/otlptrace v1.40.0/go.mod h1:bTdK1nhqF76qiPoCCdyFIV+N/sRHYXYCTQc+3VCi3MI=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc v1.40.0 h1:DvJDOPmSWQHWywQS6lKL+pb8s3gBLOZUtw4N+mavW1I=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc v1.40.0/go.mod h1:EtekO9DEJb4/jRyN4v4Qjc2yA7AtfCBuz2FynRUWTXs=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.40.0 h1:wVZXIWjQSeSmMoxF74LzAnpVQOAFDo3pPji9Y4SOFKc=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.40.0/go.mod h1:khvBS2IggMFNwZK/6lEeHg/W57h/IX6J4URh57fuI40=
go.opentelemetry.io/otel/exporters/stdout/stdouttrace v1.40.0 h1:MzfofMZN8ulNqobCmCAVbqVL5syHw+eB2qPRkCMA/fQ=
go.opentelemetry.io/otel/exporters/stdout/stdouttrace v1.40.0/go.mod h1:E73G9UFtKRXrxhBsHtG00TB5WxX57lpsQzogDkqBTz8=
go.opentelemetry.io/otel/exporters/zipkin v1.40.0 h1:zu+I4j+FdO6xIxBVPeuncQVbjxUM4LiMgv6GwGe9REE=
go.opentelemetry.io/otel/exporters/zipkin v1.40.0/go.mod h1:zS6cC4nFBYXbu18e7aLfMzubBjOiN7ZcROu477qtMf8=
go.opentelemetry.io/otel/metric v1.40.0 h1:rcZe317KPftE2rstWIBitCdVp89A2HqjkxR3c11+p9g=
go.opentelemetry.io/otel/metric v1.40.0/go.mod h1:ib/crwQH7N3r5kfiBZQbwrTge743UDc7DTFVZrrXnqc=
go.opentelemetry.io/otel/sdk v1.40.0 h1:KHW/jUzgo6wsPh9At46+h4upjtccTmuZCFAc9OJ71f8=
go.opentelemetry.io/otel/sdk v1.40.0/go.mod h1:Ph7EFdYvxq72Y8Li9q8KebuYUr2KoeyHx0DRMKrYBUE=
go.opentelemetry.io/otel/sdk/metric v1.40.0 h1:mtmdVqgQkeRxHgRv4qhyJduP3fYJRMX4AtAlbuWdCYw=
go.opentelemetry.io/otel/sdk/metric v1.40.0/go.mod h1:4Z2bGMf0KSK3uRjlczMOeMhKU2rhUqdWNoKcYrtcBPg=
go.opentelemetry.io/otel/trace v1.40.0 h1:WA4etStDttCSYuhwvEa8OP8I5EWu24lkOzp+ZYblVjw=
go.opentelemetry.io/otel/trace v1.40.0/go.mod h1:zeAhriXecNGP/s2SEG3+Y8X9ujcJOTqQ5RgdEJcawiA=
go.opentelemetry.io/proto/otlp v1.9.0 h1:l706jCMITVouPOqEnii2fIAuO3IVGBRPV5ICjceRb/A=
go.opentelemetry.io/proto/otlp v1.9.0/go.mod h1:xE+Cx5E/eEHw+ISFkwPLwCZefwVjY+pqKg1qcK03+/4=
go.uber.org/automaxprocs v1.6.0 h1:O3y2/QNTOdbF+e/dpXNNW7Rx2hZ4sTIPyybbxyNqTUs=
go.uber.org/automaxprocs v1.6.0/go.mod h1:ifeIMSnPZuznNm6jmdzmU3/bfk01Fe2fotchwEFJ8r8=
go.uber.org/goleak v1.3.0 h1:2K3zAYmnTNqV73imy9J1T3WC+gmCePx2hEGkimedGto=
go.uber.org/goleak v1.3.0/go.mod h1:CoHD4mav9JJNrW/WLlf7HGZPjdw8EucARQHekz1X6bE=
go.yaml.in/yaml/v2 v2.4.2 h1:DzmwEr2rDGHl7lsFgAHxmNz/1NlQ7xLIrlN2h5d1eGI=
go.yaml.in/yaml/v2 v2.4.2/go.mod h1:081UH+NErpNdqlCXm3TtEran0rJZGxAYx9hb/ELlsPU=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.19.0/go.mod h1:Iy9bg/ha4yyC70EfRS8jz+B6ybOBKMaSxLj6P6oBDfU=
@@ -51,6 +155,8 @@ golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v
golang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=
golang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
golang.org/x/net v0.10.0/go.mod h1:0qNGK6F8kojg2nk9dLZ2mShWaEBan6FAoqfSigmmuDg=
golang.org/x/net v0.50.0 h1:ucWh9eiCGyDR3vtzso0WMQinm2Dnt8cFMuQa9K33J60=
golang.org/x/net v0.50.0/go.mod h1:UgoSli3F/pBgdJBHCTc+tp3gmrU4XswgGRgtnwWTfyM=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
@@ -61,6 +167,7 @@ golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7w
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220811171246-fbc7d0a398ab/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.8.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
@@ -78,6 +185,8 @@ golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
golang.org/x/text v0.7.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
golang.org/x/text v0.9.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=
golang.org/x/text v0.14.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=
golang.org/x/text v0.34.0 h1:oL/Qq0Kdaqxa1KbNeMKwQq0reLCCaFtqu2eNuSeNHbk=
golang.org/x/text v0.34.0/go.mod h1:homfLqTYRFyVYemLBFl5GgL/DWEiH5wcsQ5gSh1yziA=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
@@ -85,8 +194,30 @@ golang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=
golang.org/x/tools v0.42.0 h1:uNgphsn75Tdz5Ji2q36v/nsFSfR/9BRFvqhGBaJGd5k=
golang.org/x/tools v0.42.0/go.mod h1:Ma6lCIwGZvHK6XtgbswSoWroEkhugApmsXyrUmBhfr0=
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
gonum.org/v1/gonum v0.16.0 h1:5+ul4Swaf3ESvrOnidPp4GZbzf0mxVQpDCYUQE7OJfk=
gonum.org/v1/gonum v0.16.0/go.mod h1:fef3am4MQ93R2HHpKnLk4/Tbh/s0+wqD5nfa6Pnwy4E=
google.golang.org/genproto/googleapis/api v0.0.0-20260128011058-8636f8732409 h1:merA0rdPeUV3YIIfHHcH4qBkiQAc1nfCKSI7lB4cV2M=
google.golang.org/genproto/googleapis/api v0.0.0-20260128011058-8636f8732409/go.mod h1:fl8J1IvUjCilwZzQowmw2b7HQB2eAuYBabMXzWurF+I=
google.golang.org/genproto/googleapis/rpc v0.0.0-20260128011058-8636f8732409 h1:H86B94AW+VfJWDqFeEbBPhEtHzJwJfTbgE2lZa54ZAQ=
google.golang.org/genproto/googleapis/rpc v0.0.0-20260128011058-8636f8732409/go.mod h1:j9x/tPzZkyxcgEFkiKEEGxfvyumM01BEtsW8xzOahRQ=
google.golang.org/grpc v1.79.3 h1:sybAEdRIEtvcD68Gx7dmnwjZKlyfuc61Dyo9pGXXkKE=
google.golang.org/grpc v1.79.3/go.mod h1:KmT0Kjez+0dde/v2j9vzwoAScgEPx/Bw1CYChhHLrHQ=
google.golang.org/protobuf v1.36.11 h1:fV6ZwhNocDyBLK0dj+fg8ektcVegBBuEolpbTQyBNVE=
google.golang.org/protobuf v1.36.11/go.mod h1:HTf+CrKn2C3g5S8VImy6tdcUvCska2kB7j23XfzDpco=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c/go.mod h1:JHkPIbrfpd72SG/EVd6muEfDQjcINNoR0C8j2r3qZ4Q=
gopkg.in/h2non/gock.v1 v1.1.2 h1:jBbHXgGBK/AoPVfJh5x4r/WxIrElvbLel8TCZkkZJoY=
gopkg.in/h2non/gock.v1 v1.1.2/go.mod h1:n7UGz/ckNChHiK05rDoiC4MYSunEC/lyaUm2WWaDva0=
gopkg.in/sourcemap.v1 v1.0.5 h1:inv58fC9f9J3TK2Y2R1NPntXEn3/wjWHkonhIUODNTI=
gopkg.in/sourcemap.v1 v1.0.5/go.mod h1:2RlvNNSMglmRrcvhfuzp4hQHwOtjxlbjX7UPY/GXb78=
gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=
gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
k8s.io/utils v0.0.0-20260319190234-28399d86e0b5 h1:kBawHLSnx/mYHmRnNUf9d4CpjREbeZuxoSGOX/J+aYM=
k8s.io/utils v0.0.0-20260319190234-28399d86e0b5/go.mod h1:xDxuJ0whA3d0I4mf/C4ppKHxXynQ+fxnkmQH0vTHnuk=
modernc.org/cc/v4 v4.27.1 h1:9W30zRlYrefrDV2JE2O8VDtJ1yPGownxciz5rrbQZis=
modernc.org/cc/v4 v4.27.1/go.mod h1:uVtb5OGqUKpoLWhqwNQo/8LwvoiEBLvZXIQ/SmO6mL0=
modernc.org/ccgo/v4 v4.32.0 h1:hjG66bI/kqIPX1b2yT6fr/jt+QedtP2fqojG2VrFuVw=

View File

@@ -1,40 +0,0 @@
package apitypes
type Message struct {
Role string
Content string
}
type Tool struct {
Type string
Function ToolFunction
}
type ToolFunction struct {
Name string
Description string
Parameters interface{}
}
type ToolCall struct {
ID string
Name string
Arguments string
}
type StreamChunk struct {
Type ChunkType
Text string
Thinking string
ToolCall *ToolCall
Done bool
}
type ChunkType int
const (
ChunkText ChunkType = iota
ChunkThinking
ChunkToolCall
ChunkDone
)

View File

@@ -1,9 +1,57 @@
package config

import (
-	"cursor-api-proxy/internal/env"
+	"os"
+	"path/filepath"
+	"cursor-api-proxy/pkg/infrastructure/env"
+	"github.com/zeromicro/go-zero/rest"
)
// Config for go-zero (generated by goctl)
type Config struct {
rest.RestConf
// Cursor configuration
AgentBin string
DefaultModel string
Provider string
TimeoutMs int
// Multi-account pool
ConfigDirs []string
MultiPort bool
// TLS
TLSCertPath string
TLSKeyPath string
// Logging
SessionsLogPath string
// Verbose is inherited from rest.RestConf
// Gemini
GeminiAccountDir string
GeminiBrowserVisible bool
GeminiMaxSessions int
// Workspace
Workspace string
ChatOnlyWorkspace bool
WinCmdlineMax int
// Agent
Force bool
ApproveMcps bool
MaxMode bool
StrictModel bool
// API Key
RequiredKey string
}
// BridgeConfig for backward compatibility with existing code
type BridgeConfig struct {
AgentBin string
Host string
@@ -31,6 +79,58 @@ type BridgeConfig struct {
GeminiMaxSessions int
}
// ToBridgeConfig converts Config to BridgeConfig
func (c Config) ToBridgeConfig() BridgeConfig {
home := os.Getenv("HOME")
if home == "" {
home = os.Getenv("USERPROFILE")
}
configDirs := c.ConfigDirs
if len(configDirs) == 0 {
configDirs = []string{filepath.Join(home, ".cursor-api-proxy", "accounts", "default")}
} else {
for i, dir := range configDirs {
if len(dir) > 0 && dir[0] == '~' {
configDirs[i] = filepath.Join(home, dir[1:])
}
}
}
geminiDir := c.GeminiAccountDir
if geminiDir != "" && geminiDir[0] == '~' {
geminiDir = filepath.Join(home, geminiDir[1:])
}
return BridgeConfig{
AgentBin: c.AgentBin,
Host: c.Host,
Port: c.Port,
RequiredKey: c.RequiredKey,
DefaultModel: c.DefaultModel,
Mode: "ask",
Provider: c.Provider,
Force: c.Force,
ApproveMcps: c.ApproveMcps,
StrictModel: c.StrictModel,
Workspace: c.Workspace,
TimeoutMs: c.TimeoutMs,
TLSCertPath: c.TLSCertPath,
TLSKeyPath: c.TLSKeyPath,
SessionsLogPath: c.SessionsLogPath,
ChatOnlyWorkspace: c.ChatOnlyWorkspace,
Verbose: c.Verbose,
MaxMode: c.MaxMode,
ConfigDirs: configDirs,
MultiPort: c.MultiPort,
WinCmdlineMax: c.WinCmdlineMax,
GeminiAccountDir: geminiDir,
GeminiBrowserVisible: c.GeminiBrowserVisible,
GeminiMaxSessions: c.GeminiMaxSessions,
}
}
// LoadBridgeConfig loads config from environment (for backward compatibility)
func LoadBridgeConfig(e env.EnvSource, cwd string) BridgeConfig {
loaded := env.LoadEnvConfig(e, cwd)
return BridgeConfig{

View File

@@ -1,123 +1,60 @@
-package config_test
-
-import (
-	"cursor-api-proxy/internal/config"
-	"cursor-api-proxy/internal/env"
-	"path/filepath"
-	"strings"
-	"testing"
-)
-
-func TestLoadBridgeConfig_Defaults(t *testing.T) {
-	cfg := config.LoadBridgeConfig(env.EnvSource{}, "/workspace")
-	if cfg.AgentBin != "agent" {
-		t.Errorf("AgentBin = %q, want %q", cfg.AgentBin, "agent")
-	}
-	if cfg.Host != "127.0.0.1" {
-		t.Errorf("Host = %q, want %q", cfg.Host, "127.0.0.1")
-	}
-	if cfg.Port != 8765 {
-		t.Errorf("Port = %d, want 8765", cfg.Port)
-	}
-	if cfg.RequiredKey != "" {
-		t.Errorf("RequiredKey = %q, want empty", cfg.RequiredKey)
-	}
-	if cfg.DefaultModel != "auto" {
-		t.Errorf("DefaultModel = %q, want %q", cfg.DefaultModel, "auto")
-	}
-	if cfg.Force {
-		t.Error("Force should be false")
-	}
-	if cfg.ApproveMcps {
-		t.Error("ApproveMcps should be false")
-	}
-	if !cfg.StrictModel {
-		t.Error("StrictModel should be true")
-	}
-	if cfg.Mode != "ask" {
-		t.Errorf("Mode = %q, want %q", cfg.Mode, "ask")
-	}
-	if cfg.Workspace != "/workspace" {
-		t.Errorf("Workspace = %q, want /workspace", cfg.Workspace)
-	}
-	if !cfg.ChatOnlyWorkspace {
-		t.Error("ChatOnlyWorkspace should be true")
-	}
-	if cfg.WinCmdlineMax != 30000 {
-		t.Errorf("WinCmdlineMax = %d, want 30000", cfg.WinCmdlineMax)
-	}
-}
-
-func TestLoadBridgeConfig_FromEnv(t *testing.T) {
-	e := env.EnvSource{
-		"CURSOR_AGENT_BIN": "/usr/bin/agent",
-		"CURSOR_BRIDGE_HOST": "0.0.0.0",
-		"CURSOR_BRIDGE_PORT": "9999",
-		"CURSOR_BRIDGE_API_KEY": "sk-secret",
-		"CURSOR_BRIDGE_DEFAULT_MODEL": "org/claude-3-opus",
-		"CURSOR_BRIDGE_FORCE": "true",
-		"CURSOR_BRIDGE_APPROVE_MCPS": "yes",
-		"CURSOR_BRIDGE_STRICT_MODEL": "false",
-		"CURSOR_BRIDGE_WORKSPACE": "./my-workspace",
-		"CURSOR_BRIDGE_TIMEOUT_MS": "60000",
-		"CURSOR_BRIDGE_CHAT_ONLY_WORKSPACE": "false",
-		"CURSOR_BRIDGE_VERBOSE": "1",
-		"CURSOR_BRIDGE_TLS_CERT": "./certs/test.crt",
-		"CURSOR_BRIDGE_TLS_KEY": "./certs/test.key",
-	}
-	cfg := config.LoadBridgeConfig(e, "/tmp/project")
-	if cfg.AgentBin != "/usr/bin/agent" {
-		t.Errorf("AgentBin = %q, want /usr/bin/agent", cfg.AgentBin)
-	}
-	if cfg.Host != "0.0.0.0" {
-		t.Errorf("Host = %q, want 0.0.0.0", cfg.Host)
-	}
-	if cfg.Port != 9999 {
-		t.Errorf("Port = %d, want 9999", cfg.Port)
-	}
-	if cfg.RequiredKey != "sk-secret" {
-		t.Errorf("RequiredKey = %q, want sk-secret", cfg.RequiredKey)
-	}
-	if cfg.DefaultModel != "claude-3-opus" {
-		t.Errorf("DefaultModel = %q, want claude-3-opus", cfg.DefaultModel)
-	}
-	if !cfg.Force {
-		t.Error("Force should be true")
-	}
-	if !cfg.ApproveMcps {
-		t.Error("ApproveMcps should be true")
-	}
-	if cfg.StrictModel {
-		t.Error("StrictModel should be false")
-	}
-	if !filepath.IsAbs(cfg.Workspace) {
-		t.Errorf("Workspace should be absolute, got %q", cfg.Workspace)
-	}
-	if !strings.Contains(cfg.Workspace, "my-workspace") {
-		t.Errorf("Workspace %q should contain 'my-workspace'", cfg.Workspace)
-	}
-	if cfg.TimeoutMs != 60000 {
-		t.Errorf("TimeoutMs = %d, want 60000", cfg.TimeoutMs)
-	}
-	if cfg.ChatOnlyWorkspace {
-		t.Error("ChatOnlyWorkspace should be false")
-	}
-	if !cfg.Verbose {
-		t.Error("Verbose should be true")
-	}
-	if cfg.TLSCertPath != "/tmp/project/certs/test.crt" {
-		t.Errorf("TLSCertPath = %q, want /tmp/project/certs/test.crt", cfg.TLSCertPath)
-	}
-	if cfg.TLSKeyPath != "/tmp/project/certs/test.key" {
-		t.Errorf("TLSKeyPath = %q, want /tmp/project/certs/test.key", cfg.TLSKeyPath)
-	}
-}
-
-func TestLoadBridgeConfig_WideHost(t *testing.T) {
-	cfg := config.LoadBridgeConfig(env.EnvSource{"CURSOR_BRIDGE_HOST": "0.0.0.0"}, "/workspace")
-	if cfg.Host != "0.0.0.0" {
-		t.Errorf("Host = %q, want 0.0.0.0", cfg.Host)
-	}
-}
+package config_test
+
+import (
+	"testing"
+
+	"cursor-api-proxy/internal/config"
+)
+
+func TestConfigToBridgeConfig(t *testing.T) {
+	cfg := config.Config{}
+	bc := cfg.ToBridgeConfig()
+	if bc.Host != "" {
+		t.Errorf("Host = %q, want empty", bc.Host)
+	}
+	if bc.Mode != "ask" {
+		t.Errorf("Mode = %q, want ask", bc.Mode)
+	}
+}
+
+func TestConfigToBridgeConfigWithValues(t *testing.T) {
+	cfg := config.Config{
+		AgentBin: "cursor",
+		DefaultModel: "claude-3.5-sonnet",
+		Provider: "cursor",
+		TimeoutMs: 300000,
+		Force: true,
+		ApproveMcps: true,
+		StrictModel: true,
+		Workspace: "/tmp/test",
+		ChatOnlyWorkspace: true,
+		GeminiAccountDir: "/tmp/gemini",
+		GeminiMaxSessions: 5,
+	}
+	bc := cfg.ToBridgeConfig()
+	if bc.AgentBin != "cursor" {
+		t.Errorf("AgentBin = %q, want cursor", bc.AgentBin)
+	}
+	if bc.DefaultModel != "claude-3.5-sonnet" {
+		t.Errorf("DefaultModel = %q, want claude-3.5-sonnet", bc.DefaultModel)
+	}
+	if bc.TimeoutMs != 300000 {
+		t.Errorf("TimeoutMs = %d, want 300000", bc.TimeoutMs)
+	}
+	if !bc.Force {
+		t.Error("Force should be true")
+	}
+	if !bc.ApproveMcps {
+		t.Error("ApproveMcps should be true")
+	}
+	if !bc.StrictModel {
+		t.Error("StrictModel should be true")
+	}
+	if bc.Mode != "ask" {
+		t.Errorf("Mode = %q, want ask", bc.Mode)
+	}
+}

View File

@@ -0,0 +1,53 @@
// Code scaffolded by goctl. Safe to edit.
// goctl 1.10.1
package chat
import (
"encoding/json"
"io"
"net/http"
"cursor-api-proxy/internal/logic/chat"
"cursor-api-proxy/internal/svc"
"cursor-api-proxy/internal/types"
"cursor-api-proxy/pkg/infrastructure/httputil"
)
func AnthropicMessagesHandler(svcCtx *svc.ServiceContext) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
// Read raw body first
rawBody, err := io.ReadAll(r.Body)
if err != nil {
httputil.WriteJSON(w, 400, map[string]interface{}{
"error": map[string]string{"type": "invalid_request_error", "message": "failed to read body"},
}, nil)
return
}
var req types.AnthropicRequest
if err := json.Unmarshal(rawBody, &req); err != nil {
httputil.WriteJSON(w, 400, map[string]interface{}{
"error": map[string]string{"type": "invalid_request_error", "message": "invalid JSON body"},
}, nil)
return
}
l := chat.NewAnthropicMessagesLogic(r.Context(), svcCtx)
if req.Stream {
w.Header().Set("Content-Type", "text/event-stream")
w.Header().Set("Cache-Control", "no-cache")
w.Header().Set("Connection", "keep-alive")
w.Header().Set("X-Accel-Buffering", "no")
_ = l.AnthropicMessagesStream(&req, w, r.Method, r.URL.Path)
} else {
err := l.AnthropicMessages(&req, w, r.Method, r.URL.Path)
if err != nil {
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"type": "api_error", "message": err.Error()},
}, nil)
}
}
}
}

View File

@@ -0,0 +1,55 @@
// Code scaffolded by goctl. Safe to edit.
// goctl 1.10.1
package chat
import (
"encoding/json"
"io"
"net/http"
"cursor-api-proxy/internal/logic/chat"
"cursor-api-proxy/internal/svc"
"cursor-api-proxy/internal/types"
"cursor-api-proxy/pkg/infrastructure/httputil"
)
func ChatCompletionsHandler(svcCtx *svc.ServiceContext) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
// Read raw body first
rawBody, err := io.ReadAll(r.Body)
if err != nil {
httputil.WriteJSON(w, 400, map[string]interface{}{
"error": map[string]string{"message": "failed to read body", "code": "bad_request"},
}, nil)
return
}
var req types.ChatCompletionRequest
if err := json.Unmarshal(rawBody, &req); err != nil {
httputil.WriteJSON(w, 400, map[string]interface{}{
"error": map[string]string{"message": "invalid JSON body", "code": "bad_request"},
}, nil)
return
}
l := chat.NewChatCompletionsLogic(r.Context(), svcCtx)
if req.Stream {
w.Header().Set("Content-Type", "text/event-stream")
w.Header().Set("Cache-Control", "no-cache")
w.Header().Set("Connection", "keep-alive")
w.Header().Set("X-Accel-Buffering", "no")
_ = l.ChatCompletionsStream(&req, w, r.Method, r.URL.Path)
} else {
resp, err := l.ChatCompletions(&req)
if err != nil {
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"message": err.Error(), "code": "internal_error"},
}, nil)
} else {
httputil.WriteJSON(w, 200, resp, nil)
}
}
}
}

View File

@@ -0,0 +1,24 @@
// Code scaffolded by goctl. Safe to edit.
// goctl 1.10.1
package chat
import (
"net/http"
"cursor-api-proxy/internal/logic/chat"
"cursor-api-proxy/internal/svc"
"github.com/zeromicro/go-zero/rest/httpx"
)
func HealthHandler(svcCtx *svc.ServiceContext) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
l := chat.NewHealthLogic(r.Context(), svcCtx)
resp, err := l.Health()
if err != nil {
httpx.ErrorCtx(r.Context(), w, err)
} else {
httpx.OkJsonCtx(r.Context(), w, resp)
}
}
}

View File

@@ -0,0 +1,24 @@
// Code scaffolded by goctl. Safe to edit.
// goctl 1.10.1
package chat
import (
"net/http"
"cursor-api-proxy/internal/logic/chat"
"cursor-api-proxy/internal/svc"
"github.com/zeromicro/go-zero/rest/httpx"
)
func ModelsHandler(svcCtx *svc.ServiceContext) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
l := chat.NewModelsLogic(r.Context(), svcCtx)
resp, err := l.Models()
if err != nil {
httpx.ErrorCtx(r.Context(), w, err)
} else {
httpx.OkJsonCtx(r.Context(), w, resp)
}
}
}

View File

@@ -0,0 +1,40 @@
// Code generated by goctl. DO NOT EDIT.
// goctl 1.10.1
package handler
import (
"net/http"
chat "cursor-api-proxy/internal/handler/chat"
"cursor-api-proxy/internal/svc"
"github.com/zeromicro/go-zero/rest"
)
func RegisterHandlers(server *rest.Server, serverCtx *svc.ServiceContext) {
server.AddRoutes(
[]rest.Route{
{
Method: http.MethodGet,
Path: "/health",
Handler: chat.HealthHandler(serverCtx),
},
{
Method: http.MethodGet,
Path: "/v1/models",
Handler: chat.ModelsHandler(serverCtx),
},
{
Method: http.MethodPost,
Path: "/v1/chat/completions",
Handler: chat.ChatCompletionsHandler(serverCtx),
},
{
Method: http.MethodPost,
Path: "/v1/messages",
Handler: chat.AnthropicMessagesHandler(serverCtx),
},
},
)
}

View File

@@ -1,577 +0,0 @@
package handlers
import (
"context"
"cursor-api-proxy/internal/agent"
"cursor-api-proxy/internal/anthropic"
"cursor-api-proxy/internal/config"
"cursor-api-proxy/internal/httputil"
"cursor-api-proxy/internal/logger"
"cursor-api-proxy/internal/models"
"cursor-api-proxy/internal/openai"
"cursor-api-proxy/internal/parser"
"cursor-api-proxy/internal/pool"
"cursor-api-proxy/internal/sanitize"
"cursor-api-proxy/internal/toolcall"
"cursor-api-proxy/internal/winlimit"
"cursor-api-proxy/internal/workspace"
"encoding/json"
"fmt"
"net/http"
"regexp"
"strings"
"time"
"github.com/google/uuid"
)
func HandleAnthropicMessages(w http.ResponseWriter, r *http.Request, cfg config.BridgeConfig, ph pool.PoolHandle, lastModelRef *string, rawBody, method, pathname, remoteAddress string) {
var req anthropic.MessagesRequest
if err := json.Unmarshal([]byte(rawBody), &req); err != nil {
httputil.WriteJSON(w, 400, map[string]interface{}{
"error": map[string]string{"type": "invalid_request_error", "message": "invalid JSON body"},
}, nil)
return
}
requested := openai.NormalizeModelID(req.Model)
model := ResolveModel(requested, lastModelRef, cfg)
var rawMap map[string]interface{}
_ = json.Unmarshal([]byte(rawBody), &rawMap)
cleanSystem := sanitize.SanitizeSystem(req.System)
rawMessages := make([]interface{}, len(req.Messages))
for i, m := range req.Messages {
rawMessages[i] = map[string]interface{}{"role": m.Role, "content": m.Content}
}
cleanRawMessages := sanitize.SanitizeMessages(rawMessages)
var cleanMessages []anthropic.MessageParam
for _, raw := range cleanRawMessages {
if m, ok := raw.(map[string]interface{}); ok {
role, _ := m["role"].(string)
cleanMessages = append(cleanMessages, anthropic.MessageParam{Role: role, Content: m["content"]})
}
}
toolsText := openai.ToolsToSystemText(req.Tools, nil)
var systemWithTools interface{}
if toolsText != "" {
sysStr := ""
switch v := cleanSystem.(type) {
case string:
sysStr = v
}
if sysStr != "" {
systemWithTools = sysStr + "\n\n" + toolsText
} else {
systemWithTools = toolsText
}
} else {
systemWithTools = cleanSystem
}
prompt := anthropic.BuildPromptFromAnthropicMessages(cleanMessages, systemWithTools)
if req.MaxTokens == 0 {
httputil.WriteJSON(w, 400, map[string]interface{}{
"error": map[string]string{"type": "invalid_request_error", "message": "max_tokens is required"},
}, nil)
return
}
cursorModel := models.ResolveToCursorModel(model)
if cursorModel == "" {
cursorModel = model
}
var trafficMsgs []logger.TrafficMessage
if s := systemToString(cleanSystem); s != "" {
trafficMsgs = append(trafficMsgs, logger.TrafficMessage{Role: "system", Content: s})
}
for _, m := range cleanMessages {
text := contentToString(m.Content)
if text != "" {
trafficMsgs = append(trafficMsgs, logger.TrafficMessage{Role: m.Role, Content: text})
}
}
logger.LogTrafficRequest(cfg.Verbose, model, trafficMsgs, req.Stream)
headerWs := r.Header.Get("x-cursor-workspace")
ws := workspace.ResolveWorkspace(cfg, headerWs)
fixedArgs := agent.BuildAgentFixedArgs(cfg, ws.WorkspaceDir, cursorModel, req.Stream)
fit := winlimit.FitPromptToWinCmdline(cfg.AgentBin, fixedArgs, prompt, cfg.WinCmdlineMax, ws.WorkspaceDir)
if cfg.Verbose {
if len(prompt) > 200 {
logger.LogDebug("model=%s prompt_len=%d prompt_preview=%q", cursorModel, len(prompt), prompt[:200]+"...")
} else {
logger.LogDebug("model=%s prompt_len=%d prompt=%q", cursorModel, len(prompt), prompt)
}
logger.LogDebug("cmd_args=%v", fit.Args)
}
if !fit.OK {
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"type": "api_error", "message": fit.Error},
}, nil)
return
}
if fit.Truncated {
logger.LogTruncation(fit.OriginalLength, fit.FinalPromptLength)
}
cmdArgs := fit.Args
msgID := "msg_" + uuid.New().String()
var truncatedHeaders map[string]string
if fit.Truncated {
truncatedHeaders = map[string]string{"X-Cursor-Proxy-Prompt-Truncated": "true"}
}
hasTools := len(req.Tools) > 0
var toolNames map[string]bool
if hasTools {
toolNames = toolcall.CollectToolNames(req.Tools)
}
if req.Stream {
httputil.WriteSSEHeaders(w, truncatedHeaders)
flusher, _ := w.(http.Flusher)
writeEvent := func(evt interface{}) {
data, _ := json.Marshal(evt)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
}
var accumulated string
var accumulatedThinking string
var chunkNum int
var p parser.Parser
writeEvent(map[string]interface{}{
"type": "message_start",
"message": map[string]interface{}{
"id": msgID,
"type": "message",
"role": "assistant",
"model": model,
"content": []interface{}{},
},
})
if hasTools {
toolCallMarkerRe := regexp.MustCompile(`行政法规|<function_calls>`)
var toolCallMode bool
textBlockOpen := false
textBlockIndex := 0
thinkingOpen := false
thinkingBlockIndex := 0
blockCount := 0
p = parser.CreateStreamParserWithThinking(
func(text string) {
accumulated += text
chunkNum++
logger.LogStreamChunk(model, text, chunkNum)
if toolCallMode {
return
}
if toolCallMarkerRe.MatchString(text) {
if textBlockOpen {
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": textBlockIndex})
textBlockOpen = false
}
if thinkingOpen {
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": thinkingBlockIndex})
thinkingOpen = false
}
toolCallMode = true
return
}
if !textBlockOpen && !thinkingOpen {
textBlockIndex = blockCount
writeEvent(map[string]interface{}{
"type": "content_block_start",
"index": textBlockIndex,
"content_block": map[string]string{"type": "text", "text": ""},
})
textBlockOpen = true
blockCount++
}
if thinkingOpen {
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": thinkingBlockIndex})
thinkingOpen = false
}
writeEvent(map[string]interface{}{
"type": "content_block_delta",
"index": textBlockIndex,
"delta": map[string]string{"type": "text_delta", "text": text},
})
},
func(thinking string) {
accumulatedThinking += thinking
chunkNum++
if toolCallMode {
return
}
if !thinkingOpen {
thinkingBlockIndex = blockCount
writeEvent(map[string]interface{}{
"type": "content_block_start",
"index": thinkingBlockIndex,
"content_block": map[string]string{"type": "thinking", "thinking": ""},
})
thinkingOpen = true
blockCount++
}
writeEvent(map[string]interface{}{
"type": "content_block_delta",
"index": thinkingBlockIndex,
"delta": map[string]string{"type": "thinking_delta", "thinking": thinking},
})
},
func() {
logger.LogTrafficResponse(cfg.Verbose, model, accumulated, true)
parsed := toolcall.ExtractToolCalls(accumulated, toolNames)
blockIndex := 0
if thinkingOpen {
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": thinkingBlockIndex})
thinkingOpen = false
}
if parsed.HasToolCalls() {
if textBlockOpen {
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": textBlockIndex})
blockIndex = textBlockIndex + 1
}
if parsed.TextContent != "" && !textBlockOpen && !toolCallMode {
writeEvent(map[string]interface{}{
"type": "content_block_start", "index": blockIndex,
"content_block": map[string]string{"type": "text", "text": ""},
})
writeEvent(map[string]interface{}{
"type": "content_block_delta", "index": blockIndex,
"delta": map[string]string{"type": "text_delta", "text": parsed.TextContent},
})
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": blockIndex})
blockIndex++
}
for _, tc := range parsed.ToolCalls {
toolID := "toolu_" + uuid.New().String()[:12]
var inputObj interface{}
_ = json.Unmarshal([]byte(tc.Arguments), &inputObj)
if inputObj == nil {
inputObj = map[string]interface{}{}
}
writeEvent(map[string]interface{}{
"type": "content_block_start", "index": blockIndex,
"content_block": map[string]interface{}{
"type": "tool_use", "id": toolID, "name": tc.Name, "input": map[string]interface{}{},
},
})
writeEvent(map[string]interface{}{
"type": "content_block_delta", "index": blockIndex,
"delta": map[string]interface{}{
"type": "input_json_delta", "partial_json": tc.Arguments,
},
})
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": blockIndex})
blockIndex++
}
writeEvent(map[string]interface{}{
"type": "message_delta",
"delta": map[string]interface{}{"stop_reason": "tool_use", "stop_sequence": nil},
"usage": map[string]int{"output_tokens": 0},
})
writeEvent(map[string]interface{}{"type": "message_stop"})
} else {
if textBlockOpen {
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": textBlockIndex})
} else if accumulated != "" {
writeEvent(map[string]interface{}{
"type": "content_block_start", "index": blockIndex,
"content_block": map[string]string{"type": "text", "text": ""},
})
writeEvent(map[string]interface{}{
"type": "content_block_delta", "index": blockIndex,
"delta": map[string]string{"type": "text_delta", "text": accumulated},
})
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": blockIndex})
blockIndex++
} else {
writeEvent(map[string]interface{}{
"type": "content_block_start", "index": blockIndex,
"content_block": map[string]string{"type": "text", "text": ""},
})
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": blockIndex})
blockIndex++
}
writeEvent(map[string]interface{}{
"type": "message_delta",
"delta": map[string]interface{}{"stop_reason": "end_turn", "stop_sequence": nil},
"usage": map[string]int{"output_tokens": 0},
})
writeEvent(map[string]interface{}{"type": "message_stop"})
}
},
)
} else {
// Non-tools mode: stream thinking and text in real time
blockCount := 0
thinkingOpen := false
textOpen := false
p = parser.CreateStreamParserWithThinking(
func(text string) {
accumulated += text
chunkNum++
logger.LogStreamChunk(model, text, chunkNum)
if thinkingOpen {
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": blockCount - 1})
thinkingOpen = false
}
if !textOpen {
writeEvent(map[string]interface{}{
"type": "content_block_start",
"index": blockCount,
"content_block": map[string]string{"type": "text", "text": ""},
})
textOpen = true
blockCount++
}
writeEvent(map[string]interface{}{
"type": "content_block_delta",
"index": blockCount - 1,
"delta": map[string]string{"type": "text_delta", "text": text},
})
},
func(thinking string) {
accumulatedThinking += thinking
chunkNum++
if !thinkingOpen {
writeEvent(map[string]interface{}{
"type": "content_block_start",
"index": blockCount,
"content_block": map[string]string{"type": "thinking", "thinking": ""},
})
thinkingOpen = true
blockCount++
}
writeEvent(map[string]interface{}{
"type": "content_block_delta",
"index": blockCount - 1,
"delta": map[string]string{"type": "thinking_delta", "thinking": thinking},
})
},
func() {
logger.LogTrafficResponse(cfg.Verbose, model, accumulated, true)
if thinkingOpen {
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": blockCount - 1})
thinkingOpen = false
}
if !textOpen {
writeEvent(map[string]interface{}{
"type": "content_block_start",
"index": blockCount,
"content_block": map[string]string{"type": "text", "text": ""},
})
blockCount++
}
writeEvent(map[string]interface{}{"type": "content_block_stop", "index": blockCount - 1})
writeEvent(map[string]interface{}{
"type": "message_delta",
"delta": map[string]interface{}{"stop_reason": "end_turn", "stop_sequence": nil},
"usage": map[string]int{"output_tokens": 0},
})
writeEvent(map[string]interface{}{"type": "message_stop"})
},
)
}
configDir := ph.GetNextConfigDir()
logger.LogAccountAssigned(configDir)
ph.ReportRequestStart(configDir)
logger.LogRequestStart(method, pathname, model, cfg.TimeoutMs, true)
streamStart := time.Now().UnixMilli()
ctx := r.Context()
wrappedParser := func(line string) {
logger.LogRawLine(line)
p.Parse(line)
}
result, err := agent.RunAgentStreamWithContext(cfg, ws.WorkspaceDir, cmdArgs, wrappedParser, ws.TempDir, configDir, ctx)
if ctx.Err() == nil {
p.Flush()
}
latencyMs := time.Now().UnixMilli() - streamStart
ph.ReportRequestEnd(configDir)
if ctx.Err() == context.DeadlineExceeded {
logger.LogRequestTimeout(method, pathname, model, cfg.TimeoutMs)
} else if ctx.Err() == context.Canceled {
logger.LogClientDisconnect(method, pathname, model, latencyMs)
} else if err == nil && isRateLimited(result.Stderr) {
ph.ReportRateLimit(configDir, extractRetryAfterMs(result.Stderr))
}
if err != nil || (result.Code != 0 && ctx.Err() == nil) {
ph.ReportRequestError(configDir, latencyMs)
if err != nil {
logger.LogAgentError(cfg.SessionsLogPath, method, pathname, remoteAddress, -1, err.Error())
} else {
logger.LogAgentError(cfg.SessionsLogPath, method, pathname, remoteAddress, result.Code, result.Stderr)
}
logger.LogRequestDone(method, pathname, model, latencyMs, result.Code)
} else if ctx.Err() == nil {
ph.ReportRequestSuccess(configDir, latencyMs)
logger.LogRequestDone(method, pathname, model, latencyMs, 0)
}
logger.LogAccountStats(cfg.Verbose, ph.GetStats())
return
}
configDir := ph.GetNextConfigDir()
logger.LogAccountAssigned(configDir)
ph.ReportRequestStart(configDir)
logger.LogRequestStart(method, pathname, model, cfg.TimeoutMs, false)
syncStart := time.Now().UnixMilli()
out, err := agent.RunAgentSync(cfg, ws.WorkspaceDir, cmdArgs, ws.TempDir, configDir, r.Context())
syncLatency := time.Now().UnixMilli() - syncStart
ph.ReportRequestEnd(configDir)
ctx := r.Context()
if ctx.Err() == context.DeadlineExceeded {
logger.LogRequestTimeout(method, pathname, model, cfg.TimeoutMs)
httputil.WriteJSON(w, 504, map[string]interface{}{
"error": map[string]string{"type": "api_error", "message": fmt.Sprintf("request timed out after %dms", cfg.TimeoutMs)},
}, nil)
return
}
if ctx.Err() == context.Canceled {
logger.LogClientDisconnect(method, pathname, model, syncLatency)
return
}
if err != nil {
ph.ReportRequestError(configDir, syncLatency)
logger.LogAccountStats(cfg.Verbose, ph.GetStats())
logger.LogRequestDone(method, pathname, model, syncLatency, -1)
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"type": "api_error", "message": err.Error()},
}, nil)
return
}
if isRateLimited(out.Stderr) {
ph.ReportRateLimit(configDir, extractRetryAfterMs(out.Stderr))
}
if out.Code != 0 {
ph.ReportRequestError(configDir, syncLatency)
logger.LogAccountStats(cfg.Verbose, ph.GetStats())
errMsg := logger.LogAgentError(cfg.SessionsLogPath, method, pathname, remoteAddress, out.Code, out.Stderr)
logger.LogRequestDone(method, pathname, model, syncLatency, out.Code)
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"type": "api_error", "message": errMsg},
}, nil)
return
}
ph.ReportRequestSuccess(configDir, syncLatency)
content := strings.TrimSpace(out.Stdout)
logger.LogTrafficResponse(cfg.Verbose, model, content, false)
logger.LogAccountStats(cfg.Verbose, ph.GetStats())
logger.LogRequestDone(method, pathname, model, syncLatency, 0)
if hasTools {
parsed := toolcall.ExtractToolCalls(content, toolNames)
if parsed.HasToolCalls() {
var contentBlocks []map[string]interface{}
if parsed.TextContent != "" {
contentBlocks = append(contentBlocks, map[string]interface{}{
"type": "text", "text": parsed.TextContent,
})
}
for _, tc := range parsed.ToolCalls {
toolID := "toolu_" + uuid.New().String()[:12]
var inputObj interface{}
_ = json.Unmarshal([]byte(tc.Arguments), &inputObj)
if inputObj == nil {
inputObj = map[string]interface{}{}
}
contentBlocks = append(contentBlocks, map[string]interface{}{
"type": "tool_use", "id": toolID, "name": tc.Name, "input": inputObj,
})
}
httputil.WriteJSON(w, 200, map[string]interface{}{
"id": msgID,
"type": "message",
"role": "assistant",
"content": contentBlocks,
"model": model,
"stop_reason": "tool_use",
"usage": map[string]int{"input_tokens": 0, "output_tokens": 0},
}, truncatedHeaders)
return
}
}
httputil.WriteJSON(w, 200, map[string]interface{}{
"id": msgID,
"type": "message",
"role": "assistant",
"content": []map[string]string{{"type": "text", "text": content}},
"model": model,
"stop_reason": "end_turn",
"usage": map[string]int{"input_tokens": 0, "output_tokens": 0},
}, truncatedHeaders)
}
func systemToString(system interface{}) string {
switch v := system.(type) {
case string:
return v
case []interface{}:
result := ""
for _, p := range v {
if m, ok := p.(map[string]interface{}); ok && m["type"] == "text" {
if t, ok := m["text"].(string); ok {
result += t
}
}
}
return result
}
return ""
}
func contentToString(content interface{}) string {
switch v := content.(type) {
case string:
return v
case []interface{}:
result := ""
for _, p := range v {
if m, ok := p.(map[string]interface{}); ok && m["type"] == "text" {
if t, ok := m["text"].(string); ok {
result += t
}
}
}
return result
}
return ""
}
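The two helpers above flatten Anthropic-style `system` and `content` fields (either a plain string or an array of typed blocks) into a single string. A standalone sketch of the same logic, with a hypothetical `contentToText` name and made-up sample blocks:

```go
package main

import "fmt"

// contentToText mirrors contentToString above: a plain string passes
// through unchanged, while a []interface{} of {"type":"text","text":...}
// blocks is concatenated; non-text blocks are skipped.
func contentToText(content interface{}) string {
	switch v := content.(type) {
	case string:
		return v
	case []interface{}:
		out := ""
		for _, p := range v {
			if m, ok := p.(map[string]interface{}); ok && m["type"] == "text" {
				if t, ok := m["text"].(string); ok {
					out += t
				}
			}
		}
		return out
	}
	return ""
}

func main() {
	blocks := []interface{}{
		map[string]interface{}{"type": "text", "text": "Hello, "},
		map[string]interface{}{"type": "image", "source": "skipped"},
		map[string]interface{}{"type": "text", "text": "world"},
	}
	fmt.Println(contentToText("plain"))  // plain
	fmt.Println(contentToText(blocks))   // Hello, world
}
```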


@ -1,471 +0,0 @@
package handlers
import (
"context"
"cursor-api-proxy/internal/agent"
"cursor-api-proxy/internal/config"
"cursor-api-proxy/internal/httputil"
"cursor-api-proxy/internal/logger"
"cursor-api-proxy/internal/models"
"cursor-api-proxy/internal/openai"
"cursor-api-proxy/internal/parser"
"cursor-api-proxy/internal/pool"
"cursor-api-proxy/internal/sanitize"
"cursor-api-proxy/internal/toolcall"
"cursor-api-proxy/internal/winlimit"
"cursor-api-proxy/internal/workspace"
"encoding/json"
"fmt"
"net/http"
"regexp"
"strconv"
"strings"
"time"
"github.com/google/uuid"
)
var rateLimitRe = regexp.MustCompile(`(?i)\b429\b|rate.?limit|too many requests`)
var retryAfterRe = regexp.MustCompile(`(?i)retry-after:\s*(\d+)`)
func isRateLimited(stderr string) bool {
return rateLimitRe.MatchString(stderr)
}
func extractRetryAfterMs(stderr string) int64 {
if m := retryAfterRe.FindStringSubmatch(stderr); len(m) > 1 {
if secs, err := strconv.ParseInt(m[1], 10, 64); err == nil && secs > 0 {
return secs * 1000
}
}
return 60000
}
func HandleChatCompletions(w http.ResponseWriter, r *http.Request, cfg config.BridgeConfig, ph pool.PoolHandle, lastModelRef *string, rawBody, method, pathname, remoteAddress string) {
var bodyMap map[string]interface{}
if err := json.Unmarshal([]byte(rawBody), &bodyMap); err != nil {
httputil.WriteJSON(w, 400, map[string]interface{}{
"error": map[string]string{"message": "invalid JSON body", "code": "bad_request"},
}, nil)
return
}
rawModel, _ := bodyMap["model"].(string)
requested := openai.NormalizeModelID(rawModel)
model := ResolveModel(requested, lastModelRef, cfg)
cursorModel := models.ResolveToCursorModel(model)
if cursorModel == "" {
cursorModel = model
}
var messages []interface{}
if m, ok := bodyMap["messages"].([]interface{}); ok {
messages = m
}
var tools []interface{}
if t, ok := bodyMap["tools"].([]interface{}); ok {
tools = t
}
var funcs []interface{}
if f, ok := bodyMap["functions"].([]interface{}); ok {
funcs = f
}
cleanMessages := sanitize.SanitizeMessages(messages)
toolsText := openai.ToolsToSystemText(tools, funcs)
messagesWithTools := cleanMessages
if toolsText != "" {
messagesWithTools = append([]interface{}{map[string]interface{}{"role": "system", "content": toolsText}}, cleanMessages...)
}
prompt := openai.BuildPromptFromMessages(messagesWithTools)
var trafficMsgs []logger.TrafficMessage
for _, raw := range cleanMessages {
if m, ok := raw.(map[string]interface{}); ok {
role, _ := m["role"].(string)
content := openai.MessageContentToText(m["content"])
trafficMsgs = append(trafficMsgs, logger.TrafficMessage{Role: role, Content: content})
}
}
isStream := false
if s, ok := bodyMap["stream"].(bool); ok {
isStream = s
}
logger.LogTrafficRequest(cfg.Verbose, model, trafficMsgs, isStream)
headerWs := r.Header.Get("x-cursor-workspace")
ws := workspace.ResolveWorkspace(cfg, headerWs)
promptLen := len(prompt)
if cfg.Verbose {
if promptLen > 200 {
logger.LogDebug("model=%s prompt_len=%d prompt_start=%q", cursorModel, promptLen, prompt[:200])
} else {
logger.LogDebug("model=%s prompt_len=%d prompt=%q", cursorModel, promptLen, prompt)
}
}
fixedArgs := agent.BuildAgentFixedArgs(cfg, ws.WorkspaceDir, cursorModel, isStream)
fit := winlimit.FitPromptToWinCmdline(cfg.AgentBin, fixedArgs, prompt, cfg.WinCmdlineMax, ws.WorkspaceDir)
if cfg.Verbose {
logger.LogDebug("cmd=%s args=%v", cfg.AgentBin, fit.Args)
}
if !fit.OK {
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"message": fit.Error, "code": "windows_cmdline_limit"},
}, nil)
return
}
if fit.Truncated {
logger.LogTruncation(fit.OriginalLength, fit.FinalPromptLength)
}
cmdArgs := fit.Args
id := "chatcmpl_" + uuid.New().String()
created := time.Now().Unix()
var truncatedHeaders map[string]string
if fit.Truncated {
truncatedHeaders = map[string]string{"X-Cursor-Proxy-Prompt-Truncated": "true"}
}
hasTools := len(tools) > 0 || len(funcs) > 0
var toolNames map[string]bool
if hasTools {
toolNames = toolcall.CollectToolNames(tools)
for _, f := range funcs {
if fm, ok := f.(map[string]interface{}); ok {
if name, ok := fm["name"].(string); ok {
toolNames[name] = true
}
}
}
}
if isStream {
httputil.WriteSSEHeaders(w, truncatedHeaders)
flusher, _ := w.(http.Flusher)
var accumulated string
var chunkNum int
var p parser.Parser
// toolCallMarkerRe detects the opening marker of a tool call; once it appears, stop live output and switch to accumulation mode
toolCallMarkerRe := regexp.MustCompile(`<tool_call>|<function_calls>`)
if hasTools {
var toolCallMode bool // whether tool call accumulation mode has been entered
p = parser.CreateStreamParserWithThinking(
func(text string) {
accumulated += text
chunkNum++
logger.LogStreamChunk(model, text, chunkNum)
if toolCallMode {
// already in accumulation mode; suppress live output
return
}
if toolCallMarkerRe.MatchString(text) {
// tool call marker detected; switch to accumulation mode
toolCallMode = true
return
}
chunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]string{"content": text}, "finish_reason": nil},
},
}
data, _ := json.Marshal(chunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
},
func(_ string) {}, // thinking ignored in tools mode
func() {
logger.LogTrafficResponse(cfg.Verbose, model, accumulated, true)
parsed := toolcall.ExtractToolCalls(accumulated, toolNames)
if parsed.HasToolCalls() {
if parsed.TextContent != "" && toolCallMode {
// part of the text was already streamed live; emit only the remaining text content
chunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{"role": "assistant", "content": parsed.TextContent}, "finish_reason": nil},
},
}
data, _ := json.Marshal(chunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
}
for i, tc := range parsed.ToolCalls {
callID := "call_" + uuid.New().String()[:8]
chunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{
"tool_calls": []map[string]interface{}{
{
"index": i,
"id": callID,
"type": "function",
"function": map[string]interface{}{
"name": tc.Name,
"arguments": tc.Arguments,
},
},
},
}, "finish_reason": nil},
},
}
data, _ := json.Marshal(chunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
}
stopChunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{}, "finish_reason": "tool_calls"},
},
}
data, _ := json.Marshal(stopChunk)
fmt.Fprintf(w, "data: %s\n\n", data)
fmt.Fprintf(w, "data: [DONE]\n\n")
if flusher != nil {
flusher.Flush()
}
} else {
stopChunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{}, "finish_reason": "stop"},
},
}
data, _ := json.Marshal(stopChunk)
fmt.Fprintf(w, "data: %s\n\n", data)
fmt.Fprintf(w, "data: [DONE]\n\n")
if flusher != nil {
flusher.Flush()
}
}
},
)
} else {
p = parser.CreateStreamParserWithThinking(
func(text string) {
accumulated += text
chunkNum++
logger.LogStreamChunk(model, text, chunkNum)
chunk := map[string]interface{}{
"id": id,
"object": "chat.completion.chunk",
"created": created,
"model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]string{"content": text}, "finish_reason": nil},
},
}
data, _ := json.Marshal(chunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
},
func(thinking string) {
chunk := map[string]interface{}{
"id": id,
"object": "chat.completion.chunk",
"created": created,
"model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{"reasoning_content": thinking}, "finish_reason": nil},
},
}
data, _ := json.Marshal(chunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
},
func() {
logger.LogTrafficResponse(cfg.Verbose, model, accumulated, true)
stopChunk := map[string]interface{}{
"id": id,
"object": "chat.completion.chunk",
"created": created,
"model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{}, "finish_reason": "stop"},
},
}
data, _ := json.Marshal(stopChunk)
fmt.Fprintf(w, "data: %s\n\n", data)
fmt.Fprintf(w, "data: [DONE]\n\n")
if flusher != nil {
flusher.Flush()
}
},
)
}
configDir := ph.GetNextConfigDir()
logger.LogAccountAssigned(configDir)
ph.ReportRequestStart(configDir)
logger.LogRequestStart(method, pathname, model, cfg.TimeoutMs, true)
streamStart := time.Now().UnixMilli()
ctx := r.Context()
wrappedParser := func(line string) {
logger.LogRawLine(line)
p.Parse(line)
}
result, err := agent.RunAgentStreamWithContext(cfg, ws.WorkspaceDir, cmdArgs, wrappedParser, ws.TempDir, configDir, ctx)
// agent 結束後,若未收到 result/success 訊號,強制 flush 以確保 SSE stream 正確結尾
if ctx.Err() == nil {
p.Flush()
}
latencyMs := time.Now().UnixMilli() - streamStart
ph.ReportRequestEnd(configDir)
if ctx.Err() == context.DeadlineExceeded {
logger.LogRequestTimeout(method, pathname, model, cfg.TimeoutMs)
} else if ctx.Err() == context.Canceled {
logger.LogClientDisconnect(method, pathname, model, latencyMs)
} else if err == nil && isRateLimited(result.Stderr) {
ph.ReportRateLimit(configDir, extractRetryAfterMs(result.Stderr))
}
if err != nil || (result.Code != 0 && ctx.Err() == nil) {
ph.ReportRequestError(configDir, latencyMs)
if err != nil {
logger.LogAgentError(cfg.SessionsLogPath, method, pathname, remoteAddress, -1, err.Error())
} else {
logger.LogAgentError(cfg.SessionsLogPath, method, pathname, remoteAddress, result.Code, result.Stderr)
}
logger.LogRequestDone(method, pathname, model, latencyMs, result.Code)
} else if ctx.Err() == nil {
ph.ReportRequestSuccess(configDir, latencyMs)
logger.LogRequestDone(method, pathname, model, latencyMs, 0)
}
logger.LogAccountStats(cfg.Verbose, ph.GetStats())
return
}
configDir := ph.GetNextConfigDir()
logger.LogAccountAssigned(configDir)
ph.ReportRequestStart(configDir)
logger.LogRequestStart(method, pathname, model, cfg.TimeoutMs, false)
syncStart := time.Now().UnixMilli()
out, err := agent.RunAgentSync(cfg, ws.WorkspaceDir, cmdArgs, ws.TempDir, configDir, r.Context())
syncLatency := time.Now().UnixMilli() - syncStart
ph.ReportRequestEnd(configDir)
ctx := r.Context()
if ctx.Err() == context.DeadlineExceeded {
logger.LogRequestTimeout(method, pathname, model, cfg.TimeoutMs)
httputil.WriteJSON(w, 504, map[string]interface{}{
"error": map[string]string{"message": fmt.Sprintf("request timed out after %dms", cfg.TimeoutMs), "code": "timeout"},
}, nil)
return
}
if ctx.Err() == context.Canceled {
logger.LogClientDisconnect(method, pathname, model, syncLatency)
return
}
if err != nil {
ph.ReportRequestError(configDir, syncLatency)
logger.LogAccountStats(cfg.Verbose, ph.GetStats())
logger.LogRequestDone(method, pathname, model, syncLatency, -1)
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"message": err.Error(), "code": "cursor_cli_error"},
}, nil)
return
}
if isRateLimited(out.Stderr) {
ph.ReportRateLimit(configDir, extractRetryAfterMs(out.Stderr))
}
if out.Code != 0 {
ph.ReportRequestError(configDir, syncLatency)
logger.LogAccountStats(cfg.Verbose, ph.GetStats())
errMsg := logger.LogAgentError(cfg.SessionsLogPath, method, pathname, remoteAddress, out.Code, out.Stderr)
logger.LogRequestDone(method, pathname, model, syncLatency, out.Code)
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"message": errMsg, "code": "cursor_cli_error"},
}, nil)
return
}
ph.ReportRequestSuccess(configDir, syncLatency)
content := strings.TrimSpace(out.Stdout)
logger.LogTrafficResponse(cfg.Verbose, model, content, false)
logger.LogAccountStats(cfg.Verbose, ph.GetStats())
logger.LogRequestDone(method, pathname, model, syncLatency, 0)
if hasTools {
parsed := toolcall.ExtractToolCalls(content, toolNames)
if parsed.HasToolCalls() {
msg := map[string]interface{}{"role": "assistant"}
if parsed.TextContent != "" {
msg["content"] = parsed.TextContent
} else {
msg["content"] = nil
}
var tcArr []map[string]interface{}
for _, tc := range parsed.ToolCalls {
callID := "call_" + uuid.New().String()[:8]
tcArr = append(tcArr, map[string]interface{}{
"id": callID,
"type": "function",
"function": map[string]interface{}{
"name": tc.Name,
"arguments": tc.Arguments,
},
})
}
msg["tool_calls"] = tcArr
httputil.WriteJSON(w, 200, map[string]interface{}{
"id": id,
"object": "chat.completion",
"created": created,
"model": model,
"choices": []map[string]interface{}{
{"index": 0, "message": msg, "finish_reason": "tool_calls"},
},
"usage": map[string]int{"prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0},
}, truncatedHeaders)
return
}
}
httputil.WriteJSON(w, 200, map[string]interface{}{
"id": id,
"object": "chat.completion",
"created": created,
"model": model,
"choices": []map[string]interface{}{
{
"index": 0,
"message": map[string]string{"role": "assistant", "content": content},
"finish_reason": "stop",
},
},
"usage": map[string]int{"prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0},
}, truncatedHeaders)
}
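This handler's rate-limit handling hinges on the two regexes declared at the top of the file. A self-contained sketch re-declaring them, with a hypothetical `retryAfterMs` wrapper showing the 60-second fallback when stderr carries no usable `Retry-After` hint:

```go
package main

import (
	"fmt"
	"regexp"
	"strconv"
)

// Same patterns as the handler: case-insensitive match on a bare 429,
// "rate limit" variants, or "too many requests"; and a Retry-After header.
var rateLimitRe = regexp.MustCompile(`(?i)\b429\b|rate.?limit|too many requests`)
var retryAfterRe = regexp.MustCompile(`(?i)retry-after:\s*(\d+)`)

// retryAfterMs returns the Retry-After hint in milliseconds,
// defaulting to 60s when no positive integer value is found.
func retryAfterMs(stderr string) int64 {
	if m := retryAfterRe.FindStringSubmatch(stderr); len(m) > 1 {
		if secs, err := strconv.ParseInt(m[1], 10, 64); err == nil && secs > 0 {
			return secs * 1000
		}
	}
	return 60000
}

func main() {
	stderr := "HTTP 429 Too Many Requests\nRetry-After: 12"
	fmt.Println(rateLimitRe.MatchString(stderr)) // true
	fmt.Println(retryAfterMs(stderr))            // 12000
	fmt.Println(retryAfterMs("boom"))            // 60000
}
```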


@ -1,203 +0,0 @@
package handlers
import (
"context"
"cursor-api-proxy/internal/apitypes"
"cursor-api-proxy/internal/config"
"cursor-api-proxy/internal/httputil"
"cursor-api-proxy/internal/logger"
"cursor-api-proxy/internal/providers/geminiweb"
"encoding/json"
"fmt"
"net/http"
"time"
"github.com/google/uuid"
)
func HandleGeminiChatCompletions(w http.ResponseWriter, r *http.Request, cfg config.BridgeConfig, rawBody, method, pathname, remoteAddress string) {
_ = context.Background() // keep the context import in use
var bodyMap map[string]interface{}
if err := json.Unmarshal([]byte(rawBody), &bodyMap); err != nil {
httputil.WriteJSON(w, 400, map[string]interface{}{
"error": map[string]string{"message": "invalid JSON body", "code": "bad_request"},
}, nil)
return
}
rawModel, _ := bodyMap["model"].(string)
if rawModel == "" {
rawModel = "gemini-2.0-flash"
}
var messages []interface{}
if m, ok := bodyMap["messages"].([]interface{}); ok {
messages = m
}
isStream := false
if s, ok := bodyMap["stream"].(bool); ok {
isStream = s
}
// Convert messages to apitypes.Message
var apiMessages []apitypes.Message
for _, m := range messages {
if msgMap, ok := m.(map[string]interface{}); ok {
role, _ := msgMap["role"].(string)
content := ""
if c, ok := msgMap["content"].(string); ok {
content = c
}
apiMessages = append(apiMessages, apitypes.Message{
Role: role,
Content: content,
})
}
}
logger.LogRequestStart(method, pathname, rawModel, cfg.TimeoutMs, isStream)
start := time.Now().UnixMilli()
// Create the Gemini provider (Playwright-backed)
provider, provErr := geminiweb.NewPlaywrightProvider(cfg)
if provErr != nil {
logger.LogAgentError(cfg.SessionsLogPath, method, pathname, remoteAddress, -1, provErr.Error())
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"message": provErr.Error(), "code": "provider_error"},
}, nil)
return
}
if isStream {
httputil.WriteSSEHeaders(w, nil)
flusher, _ := w.(http.Flusher)
id := "chatcmpl_" + uuid.New().String()
created := time.Now().Unix()
var accumulated string
err := provider.Generate(r.Context(), rawModel, apiMessages, nil, func(chunk apitypes.StreamChunk) {
if chunk.Type == apitypes.ChunkText {
accumulated += chunk.Text
respChunk := map[string]interface{}{
"id": id,
"object": "chat.completion.chunk",
"created": created,
"model": rawModel,
"choices": []map[string]interface{}{
{
"index": 0,
"delta": map[string]string{"content": chunk.Text},
"finish_reason": nil,
},
},
}
data, _ := json.Marshal(respChunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
} else if chunk.Type == apitypes.ChunkThinking {
respChunk := map[string]interface{}{
"id": id,
"object": "chat.completion.chunk",
"created": created,
"model": rawModel,
"choices": []map[string]interface{}{
{
"index": 0,
"delta": map[string]interface{}{"reasoning_content": chunk.Thinking},
"finish_reason": nil,
},
},
}
data, _ := json.Marshal(respChunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
} else if chunk.Type == apitypes.ChunkDone {
stopChunk := map[string]interface{}{
"id": id,
"object": "chat.completion.chunk",
"created": created,
"model": rawModel,
"choices": []map[string]interface{}{
{
"index": 0,
"delta": map[string]interface{}{},
"finish_reason": "stop",
},
},
}
data, _ := json.Marshal(stopChunk)
fmt.Fprintf(w, "data: %s\n\n", data)
fmt.Fprintf(w, "data: [DONE]\n\n")
if flusher != nil {
flusher.Flush()
}
}
})
latencyMs := time.Now().UnixMilli() - start
if err != nil {
logger.LogAgentError(cfg.SessionsLogPath, method, pathname, remoteAddress, -1, err.Error())
logger.LogRequestDone(method, pathname, rawModel, latencyMs, -1)
return
}
logger.LogTrafficResponse(cfg.Verbose, rawModel, accumulated, true)
logger.LogRequestDone(method, pathname, rawModel, latencyMs, 0)
return
}
// Non-streaming mode
var resultText string
var resultThinking string
err := provider.Generate(r.Context(), rawModel, apiMessages, nil, func(chunk apitypes.StreamChunk) {
if chunk.Type == apitypes.ChunkText {
resultText += chunk.Text
} else if chunk.Type == apitypes.ChunkThinking {
resultThinking += chunk.Thinking
}
})
latencyMs := time.Now().UnixMilli() - start
if err != nil {
logger.LogAgentError(cfg.SessionsLogPath, method, pathname, remoteAddress, -1, err.Error())
logger.LogRequestDone(method, pathname, rawModel, latencyMs, -1)
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"message": err.Error(), "code": "gemini_error"},
}, nil)
return
}
logger.LogTrafficResponse(cfg.Verbose, rawModel, resultText, false)
logger.LogRequestDone(method, pathname, rawModel, latencyMs, 0)
id := "chatcmpl_" + uuid.New().String()
created := time.Now().Unix()
resp := map[string]interface{}{
"id": id,
"object": "chat.completion",
"created": created,
"model": rawModel,
"choices": []map[string]interface{}{
{
"index": 0,
"message": map[string]interface{}{
"role": "assistant",
"content": resultText,
},
"finish_reason": "stop",
},
},
"usage": map[string]int{"prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0},
}
httputil.WriteJSON(w, 200, resp, nil)
}


@ -1,20 +0,0 @@
package handlers
import (
"cursor-api-proxy/internal/config"
"cursor-api-proxy/internal/httputil"
"net/http"
)
func HandleHealth(w http.ResponseWriter, r *http.Request, version string, cfg config.BridgeConfig) {
httputil.WriteJSON(w, 200, map[string]interface{}{
"ok": true,
"version": version,
"workspace": cfg.Workspace,
"mode": cfg.Mode,
"defaultModel": cfg.DefaultModel,
"force": cfg.Force,
"approveMcps": cfg.ApproveMcps,
"strictModel": cfg.StrictModel,
}, nil)
}


@ -1,107 +0,0 @@
package handlers
import (
"cursor-api-proxy/internal/config"
"cursor-api-proxy/internal/httputil"
"cursor-api-proxy/internal/models"
"net/http"
"sync"
"time"
)
const modelCacheTTLMs = 5 * 60 * 1000
type ModelCache struct {
At int64
Models []models.CursorCliModel
}
type ModelCacheRef struct {
mu sync.Mutex
cache *ModelCache
inflight bool
waiters []chan struct{}
}
func (ref *ModelCacheRef) HandleModels(w http.ResponseWriter, r *http.Request, cfg config.BridgeConfig) {
now := time.Now().UnixMilli()
ref.mu.Lock()
if ref.cache != nil && now-ref.cache.At <= modelCacheTTLMs {
cache := ref.cache
ref.mu.Unlock()
writeModels(w, cache.Models)
return
}
if ref.inflight {
// Wait for the in-flight fetch
ch := make(chan struct{}, 1)
ref.waiters = append(ref.waiters, ch)
ref.mu.Unlock()
<-ch
ref.mu.Lock()
cache := ref.cache
ref.mu.Unlock()
if cache == nil {
// the in-flight fetch failed and left no cache; surface the error instead of dereferencing nil
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"message": "models fetch failed", "code": "models_fetch_error"},
}, nil)
return
}
writeModels(w, cache.Models)
return
}
ref.inflight = true
ref.mu.Unlock()
fetched, err := models.ListCursorCliModels(cfg.AgentBin, 60000)
ref.mu.Lock()
ref.inflight = false
if err == nil {
ref.cache = &ModelCache{At: time.Now().UnixMilli(), Models: fetched}
}
waiters := ref.waiters
ref.waiters = nil
ref.mu.Unlock()
for _, ch := range waiters {
ch <- struct{}{}
}
if err != nil {
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"message": err.Error(), "code": "models_fetch_error"},
}, nil)
return
}
writeModels(w, fetched)
}
func writeModels(w http.ResponseWriter, mods []models.CursorCliModel) {
cursorModels := make([]map[string]interface{}, len(mods))
for i, m := range mods {
cursorModels[i] = map[string]interface{}{
"id": m.ID,
"object": "model",
"owned_by": "cursor",
"name": m.Name,
}
}
ids := make([]string, len(mods))
for i, m := range mods {
ids[i] = m.ID
}
aliases := models.GetAnthropicModelAliases(ids)
for _, a := range aliases {
cursorModels = append(cursorModels, map[string]interface{}{
"id": a.ID,
"object": "model",
"owned_by": "cursor",
"name": a.Name,
})
}
httputil.WriteJSON(w, 200, map[string]interface{}{
"object": "list",
"data": cursorModels,
}, nil)
}
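`ModelCacheRef.HandleModels` coalesces concurrent cache misses: the first caller fetches while later callers park as waiters and reuse the result. A self-contained sketch of that single-flight pattern, with a hypothetical `singleFlight` type standing in for `ModelCacheRef` (strings instead of model lists, and a `calls` counter to show the fetch runs once):

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// singleFlight coalesces concurrent fetches: the first caller runs fn,
// later callers queue as waiters and reuse the cached result.
type singleFlight struct {
	mu       sync.Mutex
	cached   string
	inflight bool
	waiters  []chan struct{}
	calls    int
}

func (s *singleFlight) get(fn func() string) string {
	s.mu.Lock()
	if s.cached != "" { // cache hit
		v := s.cached
		s.mu.Unlock()
		return v
	}
	if s.inflight { // someone else is fetching; wait for their signal
		ch := make(chan struct{}, 1)
		s.waiters = append(s.waiters, ch)
		s.mu.Unlock()
		<-ch
		s.mu.Lock()
		v := s.cached
		s.mu.Unlock()
		return v
	}
	s.inflight = true
	s.mu.Unlock()
	v := fn() // fetch outside the lock
	s.mu.Lock()
	s.inflight = false
	s.cached = v
	s.calls++
	ws := s.waiters
	s.waiters = nil
	s.mu.Unlock()
	for _, ch := range ws { // wake everyone who queued behind us
		ch <- struct{}{}
	}
	return v
}

func main() {
	var sf singleFlight
	var wg sync.WaitGroup
	fetch := func() string { time.Sleep(10 * time.Millisecond); return "models" }
	for i := 0; i < 5; i++ {
		wg.Add(1)
		go func() { defer wg.Done(); _ = sf.get(fetch) }()
	}
	wg.Wait()
	fmt.Println(sf.cached, sf.calls) // models 1
}
```

Unlike the sketch, the real handler also applies a TTL and must cope with a failed fetch leaving the cache empty.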


@ -1,27 +0,0 @@
package handlers
import "cursor-api-proxy/internal/config"
func ResolveModel(requested string, lastModelRef *string, cfg config.BridgeConfig) string {
isAuto := requested == "auto"
var explicitModel string
if requested != "" && !isAuto {
explicitModel = requested
}
if explicitModel != "" {
*lastModelRef = explicitModel
}
if isAuto {
return "auto"
}
if explicitModel != "" {
return explicitModel
}
if cfg.StrictModel && *lastModelRef != "" {
return *lastModelRef
}
if *lastModelRef != "" {
return *lastModelRef
}
return cfg.DefaultModel
}
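The precedence `ResolveModel` implements is: explicit model wins and is remembered in `lastModelRef`, `auto` passes through untouched, otherwise fall back to the last explicitly requested model, then to the configured default. A simplified standalone sketch (it drops the `StrictModel` branch, which in the code above returns the same value as the plain last-model fallback):

```go
package main

import "fmt"

// resolveModel mirrors ResolveModel above: explicit > "auto" passthrough >
// sticky last model > configured default.
func resolveModel(requested string, last *string, defaultModel string) string {
	if requested == "auto" {
		return "auto" // pass through without updating the sticky model
	}
	if requested != "" {
		*last = requested // remember the explicit choice
		return requested
	}
	if *last != "" {
		return *last // reuse the last explicitly requested model
	}
	return defaultModel
}

func main() {
	last := ""
	fmt.Println(resolveModel("gpt-5", &last, "sonnet")) // gpt-5
	fmt.Println(resolveModel("auto", &last, "sonnet"))  // auto
	fmt.Println(resolveModel("", &last, "sonnet"))      // gpt-5
	last = ""
	fmt.Println(resolveModel("", &last, "sonnet"))      // sonnet
}
```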


@ -0,0 +1,459 @@
// Code scaffolded by goctl. Safe to edit.
// goctl 1.10.1
package chat
import (
"context"
"encoding/json"
"fmt"
"net/http"
"regexp"
"time"
"cursor-api-proxy/internal/svc"
apitypes "cursor-api-proxy/internal/types"
"cursor-api-proxy/pkg/adapter/anthropic"
"cursor-api-proxy/pkg/adapter/openai"
"cursor-api-proxy/pkg/domain/types"
"cursor-api-proxy/pkg/infrastructure/httputil"
"cursor-api-proxy/pkg/infrastructure/logger"
"cursor-api-proxy/pkg/infrastructure/parser"
"cursor-api-proxy/pkg/infrastructure/winlimit"
"cursor-api-proxy/pkg/infrastructure/workspace"
"cursor-api-proxy/pkg/usecase"
"github.com/google/uuid"
"github.com/zeromicro/go-zero/core/logx"
)
type AnthropicMessagesLogic struct {
logx.Logger
ctx context.Context
svcCtx *svc.ServiceContext
}
func NewAnthropicMessagesLogic(ctx context.Context, svcCtx *svc.ServiceContext) *AnthropicMessagesLogic {
return &AnthropicMessagesLogic{
Logger: logx.WithContext(ctx),
ctx: ctx,
svcCtx: svcCtx,
}
}
func (l *AnthropicMessagesLogic) resolveModel(requested string, lastModelRef *string) string {
cfg := l.svcCtx.Config
isAuto := requested == "auto"
var explicitModel string
if requested != "" && !isAuto {
explicitModel = requested
}
if explicitModel != "" {
*lastModelRef = explicitModel
}
if isAuto {
return "auto"
}
if explicitModel != "" {
return explicitModel
}
if cfg.StrictModel && *lastModelRef != "" {
return *lastModelRef
}
if *lastModelRef != "" {
return *lastModelRef
}
return cfg.DefaultModel
}
func (l *AnthropicMessagesLogic) AnthropicMessages(req *apitypes.AnthropicRequest, w http.ResponseWriter, method, pathname string) error {
return fmt.Errorf("non-streaming not implemented for Anthropic Messages API, use stream=true")
}
func (l *AnthropicMessagesLogic) AnthropicMessagesStream(req *apitypes.AnthropicRequest, w http.ResponseWriter, method, pathname string) error {
cfg := l.svcCtx.Config.ToBridgeConfig()
requested := openai.NormalizeModelID(req.Model)
model := l.resolveModel(requested, l.svcCtx.LastModel)
cursorModel := types.ResolveToCursorModel(model)
if cursorModel == "" {
cursorModel = model
}
// Convert messages
cleanMessages := convertAnthropicMessagesToInterface(req.Messages)
cleanMessages = usecase.SanitizeMessages(cleanMessages)
// Build prompt
systemText := req.System
var systemWithTools interface{} = systemText
if len(req.Tools) > 0 {
toolsText := openai.ToolsToSystemText(convertToolsToInterface(req.Tools), nil)
if systemText != "" {
systemWithTools = systemText + "\n\n" + toolsText
} else {
systemWithTools = toolsText
}
}
prompt := anthropic.BuildPromptFromAnthropicMessages(convertToAnthropicParams(cleanMessages), systemWithTools)
// Validate max_tokens
if req.MaxTokens == 0 {
httputil.WriteJSON(w, 400, map[string]interface{}{
"error": map[string]string{"type": "invalid_request_error", "message": "max_tokens is required"},
}, nil)
return nil
}
// Log traffic
var trafficMsgs []logger.TrafficMessage
if systemText != "" {
trafficMsgs = append(trafficMsgs, logger.TrafficMessage{Role: "system", Content: systemText})
}
for _, m := range cleanMessages {
if mm, ok := m.(map[string]interface{}); ok {
role, _ := mm["role"].(string)
content := openai.MessageContentToText(mm["content"])
trafficMsgs = append(trafficMsgs, logger.TrafficMessage{Role: role, Content: content})
}
}
logger.LogTrafficRequest(cfg.Verbose, model, trafficMsgs, true)
// Resolve workspace
ws := workspace.ResolveWorkspace(cfg, "")
// Build command args
if cfg.Verbose {
logger.LogDebug("model=%s prompt_len=%d", cursorModel, len(prompt))
}
maxCmdline := cfg.WinCmdlineMax
if maxCmdline == 0 {
maxCmdline = 32768
}
fixedArgs := usecase.BuildAgentFixedArgs(cfg, ws.WorkspaceDir, cursorModel, true)
fit := winlimit.FitPromptToWinCmdline(cfg.AgentBin, fixedArgs, prompt, maxCmdline, ws.WorkspaceDir)
if cfg.Verbose {
logger.LogDebug("cmd_args=%v", fit.Args)
}
if !fit.OK {
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"type": "api_error", "message": fit.Error},
}, nil)
return nil
}
if fit.Truncated {
logger.LogTruncation(fit.OriginalLength, fit.FinalPromptLength)
}
cmdArgs := fit.Args
msgID := "msg_" + uuid.New().String()
var truncatedHeaders map[string]string
if fit.Truncated {
truncatedHeaders = map[string]string{"X-Cursor-Proxy-Prompt-Truncated": "true"}
}
hasTools := len(req.Tools) > 0
var toolNames map[string]bool
if hasTools {
toolNames = usecase.CollectToolNames(convertToolsToInterface(req.Tools))
}
// Write SSE headers
httputil.WriteSSEHeaders(w, truncatedHeaders)
flusher, _ := w.(http.Flusher)
var p parser.Parser
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "message_start",
"message": map[string]interface{}{
"id": msgID,
"type": "message",
"role": "assistant",
"model": model,
"content": []interface{}{},
},
})
if hasTools {
p = createAnthropicToolParser(w, flusher, model, toolNames, cfg.Verbose)
} else {
p = createAnthropicStreamParser(w, flusher, model, cfg.Verbose)
}
configDir := l.svcCtx.AccountPool.GetNextConfigDir()
logger.LogAccountAssigned(configDir)
l.svcCtx.AccountPool.ReportRequestStart(configDir)
logger.LogRequestStart(method, pathname, model, cfg.TimeoutMs, true)
streamStart := time.Now().UnixMilli()
wrappedParser := func(line string) {
logger.LogRawLine(line)
p.Parse(line)
}
result, err := usecase.RunAgentStreamWithContext(cfg, ws.WorkspaceDir, cmdArgs, wrappedParser, ws.TempDir, configDir, l.ctx)
if l.ctx.Err() == nil {
p.Flush()
}
latencyMs := time.Now().UnixMilli() - streamStart
l.svcCtx.AccountPool.ReportRequestEnd(configDir)
if l.ctx.Err() == context.DeadlineExceeded {
logger.LogRequestTimeout(method, pathname, model, cfg.TimeoutMs)
} else if l.ctx.Err() == context.Canceled {
logger.LogClientDisconnect(method, pathname, model, latencyMs)
} else if err == nil && isRateLimited(result.Stderr) {
l.svcCtx.AccountPool.ReportRateLimit(configDir, extractRetryAfterMs(result.Stderr))
}
if err != nil || (result.Code != 0 && l.ctx.Err() == nil) {
l.svcCtx.AccountPool.ReportRequestError(configDir, latencyMs)
errMsg := "unknown error"
if err != nil {
errMsg = err.Error()
logger.LogAgentError(cfg.SessionsLogPath, method, pathname, "", -1, errMsg)
} else {
errMsg = result.Stderr
logger.LogAgentError(cfg.SessionsLogPath, method, pathname, "", result.Code, result.Stderr)
}
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "error",
"error": map[string]interface{}{"type": "api_error", "message": errMsg},
})
logger.LogRequestDone(method, pathname, model, latencyMs, result.Code)
} else if l.ctx.Err() == nil {
l.svcCtx.AccountPool.ReportRequestSuccess(configDir, latencyMs)
logger.LogRequestDone(method, pathname, model, latencyMs, 0)
}
logger.LogAccountStats(cfg.Verbose, l.svcCtx.AccountPool.GetStats())
return nil
}
func createAnthropicStreamParser(w http.ResponseWriter, flusher http.Flusher, model string, verbose bool) parser.Parser {
var textBlockOpen bool
var textBlockIndex int
var thinkingOpen bool
var thinkingBlockIndex int
var blockCount int
return parser.CreateStreamParserWithThinking(
func(text string) {
if verbose {
logger.LogStreamChunk(model, text, 0)
}
if !textBlockOpen && !thinkingOpen {
textBlockIndex = blockCount
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_start",
"index": textBlockIndex,
"content_block": map[string]string{"type": "text", "text": ""},
})
textBlockOpen = true
blockCount++
}
if thinkingOpen {
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_stop", "index": thinkingBlockIndex,
})
thinkingOpen = false
}
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_delta",
"index": textBlockIndex,
"delta": map[string]string{"type": "text_delta", "text": text},
})
},
func(thinking string) {
if verbose {
logger.LogStreamChunk(model, thinking, 0)
}
if !thinkingOpen {
thinkingBlockIndex = blockCount
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_start",
"index": thinkingBlockIndex,
"content_block": map[string]string{"type": "thinking", "thinking": ""},
})
thinkingOpen = true
blockCount++
}
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_delta",
"index": thinkingBlockIndex,
"delta": map[string]string{"type": "thinking_delta", "thinking": thinking},
})
},
func() {
// Close whichever block is still open before finishing the message.
if thinkingOpen {
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_stop", "index": thinkingBlockIndex,
})
}
if textBlockOpen {
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_stop", "index": textBlockIndex,
})
}
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "message_delta",
"delta": map[string]interface{}{"stop_reason": "end_turn", "stop_sequence": nil},
"usage": map[string]int{"output_tokens": 0},
})
writeAnthropicEvent(w, flusher, map[string]interface{}{"type": "message_stop"})
if flusher != nil {
flusher.Flush()
}
},
)
}
func createAnthropicToolParser(w http.ResponseWriter, flusher http.Flusher, model string, toolNames map[string]bool, verbose bool) parser.Parser {
var accumulated string
toolCallMarkerRe := regexp.MustCompile(`\x1e|<function_calls>`)
var toolCallMode bool
var textBlockOpen bool
var textBlockIndex int
var blockCount int
return parser.CreateStreamParserWithThinking(
func(text string) {
accumulated += text
if verbose {
logger.LogStreamChunk(model, text, 0)
}
if toolCallMode {
return
}
if toolCallMarkerRe.MatchString(text) {
if textBlockOpen {
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_stop", "index": textBlockIndex,
})
textBlockOpen = false
}
toolCallMode = true
return
}
if !textBlockOpen {
textBlockIndex = blockCount
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_start",
"index": textBlockIndex,
"content_block": map[string]string{"type": "text", "text": ""},
})
textBlockOpen = true
blockCount++
}
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_delta",
"index": textBlockIndex,
"delta": map[string]string{"type": "text_delta", "text": text},
})
},
func(thinking string) {},
func() {
if verbose {
logger.LogTrafficResponse(verbose, model, accumulated, true)
}
parsed := usecase.ExtractToolCalls(accumulated, toolNames)
blockIndex := 0
if textBlockOpen {
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_stop", "index": textBlockIndex,
})
blockIndex = textBlockIndex + 1
}
if parsed.HasToolCalls() {
for _, tc := range parsed.ToolCalls {
toolID := "toolu_" + uuid.New().String()[:12]
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_start", "index": blockIndex,
"content_block": map[string]interface{}{
"type": "tool_use", "id": toolID, "name": tc.Name, "input": map[string]interface{}{},
},
})
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_delta", "index": blockIndex,
"delta": map[string]interface{}{
"type": "input_json_delta", "partial_json": tc.Arguments,
},
})
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "content_block_stop", "index": blockIndex,
})
blockIndex++
}
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "message_delta",
"delta": map[string]interface{}{"stop_reason": "tool_use", "stop_sequence": nil},
"usage": map[string]int{"output_tokens": 0},
})
} else {
writeAnthropicEvent(w, flusher, map[string]interface{}{
"type": "message_delta",
"delta": map[string]interface{}{"stop_reason": "end_turn", "stop_sequence": nil},
"usage": map[string]int{"output_tokens": 0},
})
}
writeAnthropicEvent(w, flusher, map[string]interface{}{"type": "message_stop"})
if flusher != nil {
flusher.Flush()
}
},
)
}
func writeAnthropicEvent(w http.ResponseWriter, flusher http.Flusher, evt interface{}) {
data, _ := json.Marshal(evt)
// Anthropic SSE clients expect an "event: <type>" line before each data line.
if m, ok := evt.(map[string]interface{}); ok {
if t, ok := m["type"].(string); ok {
fmt.Fprintf(w, "event: %s\n", t)
}
}
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
}
func convertAnthropicMessagesToInterface(msgs []apitypes.Message) []interface{} {
result := make([]interface{}, len(msgs))
for i, m := range msgs {
result[i] = map[string]interface{}{
"role": m.Role,
"content": m.Content,
}
}
return result
}
func convertToAnthropicParams(msgs []interface{}) []anthropic.MessageParam {
result := make([]anthropic.MessageParam, len(msgs))
for i, m := range msgs {
if mm, ok := m.(map[string]interface{}); ok {
// Use the comma-ok form so a non-string role cannot panic.
role, _ := mm["role"].(string)
result[i] = anthropic.MessageParam{
Role: role,
Content: mm["content"],
}
}
return result
}
func convertToolsToInterface(tools []apitypes.Tool) []interface{} {
if tools == nil {
return nil
}
result := make([]interface{}, len(tools))
for i, t := range tools {
result[i] = map[string]interface{}{
"type": t.Type,
"function": map[string]interface{}{
"name": t.Function.Name,
"description": t.Function.Description,
"parameters": t.Function.Parameters,
},
}
}
return result
}

@@ -0,0 +1,483 @@
// Code scaffolded by goctl. Safe to edit.
// goctl 1.10.1
package chat
import (
"context"
"encoding/json"
"fmt"
"net/http"
"regexp"
"strconv"
"strings"
"time"
"cursor-api-proxy/internal/config"
"cursor-api-proxy/internal/svc"
apitypes "cursor-api-proxy/internal/types"
"cursor-api-proxy/pkg/adapter/openai"
"cursor-api-proxy/pkg/domain/types"
"cursor-api-proxy/pkg/infrastructure/httputil"
"cursor-api-proxy/pkg/infrastructure/logger"
"cursor-api-proxy/pkg/infrastructure/parser"
"cursor-api-proxy/pkg/infrastructure/winlimit"
"cursor-api-proxy/pkg/infrastructure/workspace"
"cursor-api-proxy/pkg/usecase"
"github.com/google/uuid"
"github.com/zeromicro/go-zero/core/logx"
)
var rateLimitRe = regexp.MustCompile(`(?i)\b429\b|rate.?limit|too many requests`)
var retryAfterRe = regexp.MustCompile(`(?i)retry-after:\s*(\d+)`)
func isRateLimited(stderr string) bool {
return rateLimitRe.MatchString(stderr)
}
func extractRetryAfterMs(stderr string) int64 {
if m := retryAfterRe.FindStringSubmatch(stderr); len(m) > 1 {
if secs, err := strconv.ParseInt(m[1], 10, 64); err == nil && secs > 0 {
return secs * 1000
}
}
return 60000
}
type ChatCompletionsLogic struct {
logx.Logger
ctx context.Context
svcCtx *svc.ServiceContext
}
func NewChatCompletionsLogic(ctx context.Context, svcCtx *svc.ServiceContext) *ChatCompletionsLogic {
return &ChatCompletionsLogic{
Logger: logx.WithContext(ctx),
ctx: ctx,
svcCtx: svcCtx,
}
}
func (l *ChatCompletionsLogic) resolveModel(requested string, lastModelRef *string) string {
cfg := l.svcCtx.Config
isAuto := requested == "auto"
var explicitModel string
if requested != "" && !isAuto {
explicitModel = requested
}
if explicitModel != "" {
*lastModelRef = explicitModel
}
if isAuto {
return "auto"
}
if explicitModel != "" {
return explicitModel
}
if cfg.StrictModel && *lastModelRef != "" {
return *lastModelRef
}
if *lastModelRef != "" {
return *lastModelRef
}
return cfg.DefaultModel
}
func (l *ChatCompletionsLogic) ChatCompletions(req *apitypes.ChatCompletionRequest) (*apitypes.ChatCompletionResponse, error) {
return nil, fmt.Errorf("non-streaming not yet implemented, use stream=true")
}
func (l *ChatCompletionsLogic) ChatCompletionsStream(req *apitypes.ChatCompletionRequest, w http.ResponseWriter, method, pathname string) error {
cfg := configToBridge(l.svcCtx.Config)
rawModel := req.Model
requested := openai.NormalizeModelID(rawModel)
lastModelRef := new(string)
model := l.resolveModel(requested, lastModelRef)
cursorModel := types.ResolveToCursorModel(model)
if cursorModel == "" {
cursorModel = model
}
messages := convertMessages(req.Messages)
tools := convertTools(req.Tools)
functions := convertFunctions(req.Functions)
cleanMessages := usecase.SanitizeMessages(messages)
toolsText := openai.ToolsToSystemText(tools, functions)
messagesWithTools := cleanMessages
if toolsText != "" {
messagesWithTools = append([]interface{}{
map[string]interface{}{"role": "system", "content": toolsText},
}, cleanMessages...)
}
prompt := openai.BuildPromptFromMessages(messagesWithTools)
var trafficMsgs []logger.TrafficMessage
for _, raw := range cleanMessages {
if m, ok := raw.(map[string]interface{}); ok {
role, _ := m["role"].(string)
content := openai.MessageContentToText(m["content"])
trafficMsgs = append(trafficMsgs, logger.TrafficMessage{Role: role, Content: content})
}
}
logger.LogTrafficRequest(cfg.Verbose, model, trafficMsgs, true)
ws := workspace.ResolveWorkspace(cfg, "")
promptLen := len(prompt)
if cfg.Verbose {
if promptLen > 200 {
logger.LogDebug("model=%s prompt_len=%d prompt_start=%q", cursorModel, promptLen, prompt[:200])
} else {
logger.LogDebug("model=%s prompt_len=%d prompt=%q", cursorModel, promptLen, prompt)
}
}
maxCmdline := cfg.WinCmdlineMax
if maxCmdline == 0 {
maxCmdline = 32768
}
fixedArgs := usecase.BuildAgentFixedArgs(cfg, ws.WorkspaceDir, cursorModel, true)
fit := winlimit.FitPromptToWinCmdline(cfg.AgentBin, fixedArgs, prompt, maxCmdline, ws.WorkspaceDir)
if l.svcCtx.Config.Verbose {
logger.LogDebug("cmd=%s args=%v", cfg.AgentBin, fit.Args)
}
if !fit.OK {
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"message": fit.Error, "code": "windows_cmdline_limit"},
}, nil)
return nil
}
if fit.Truncated {
logger.LogTruncation(fit.OriginalLength, fit.FinalPromptLength)
}
cmdArgs := fit.Args
id := "chatcmpl_" + uuid.New().String()
created := time.Now().Unix()
var truncatedHeaders map[string]string
if fit.Truncated {
truncatedHeaders = map[string]string{"X-Cursor-Proxy-Prompt-Truncated": "true"}
}
hasTools := len(tools) > 0 || len(functions) > 0
var toolNames map[string]bool
if hasTools {
toolNames = usecase.CollectToolNames(tools)
for _, f := range functions {
if fm, ok := f.(map[string]interface{}); ok {
if name, ok := fm["name"].(string); ok {
toolNames[name] = true
}
}
}
}
httputil.WriteSSEHeaders(w, truncatedHeaders)
flusher, _ := w.(http.Flusher)
var accumulated string
var chunkNum int
var p parser.Parser
toolCallMarkerRe := regexp.MustCompile(`\x1e|<function_calls>`)
if hasTools {
var toolCallMode bool
p = parser.CreateStreamParserWithThinking(
func(text string) {
accumulated += text
chunkNum++
logger.LogStreamChunk(model, text, chunkNum)
if toolCallMode {
return
}
if toolCallMarkerRe.MatchString(text) {
toolCallMode = true
return
}
chunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]string{"content": text}, "finish_reason": nil},
},
}
data, _ := json.Marshal(chunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
},
func(thinking string) {
chunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{"reasoning_content": thinking}, "finish_reason": nil},
},
}
data, _ := json.Marshal(chunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
},
func() {
logger.LogTrafficResponse(cfg.Verbose, model, accumulated, true)
parsed := usecase.ExtractToolCalls(accumulated, toolNames)
if parsed.HasToolCalls() {
if parsed.TextContent != "" && toolCallMode {
chunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{"role": "assistant", "content": parsed.TextContent}, "finish_reason": nil},
},
}
data, _ := json.Marshal(chunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
}
for i, tc := range parsed.ToolCalls {
callID := "call_" + uuid.New().String()[:8]
chunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{
"tool_calls": []map[string]interface{}{
{
"index": i,
"id": callID,
"type": "function",
"function": map[string]interface{}{
"name": tc.Name,
"arguments": tc.Arguments,
},
},
},
}, "finish_reason": nil},
},
}
data, _ := json.Marshal(chunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
}
stopChunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{}, "finish_reason": "tool_calls"},
},
}
data, _ := json.Marshal(stopChunk)
fmt.Fprintf(w, "data: %s\n\n", data)
fmt.Fprintf(w, "data: [DONE]\n\n")
if flusher != nil {
flusher.Flush()
}
} else {
stopChunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{}, "finish_reason": "stop"},
},
}
data, _ := json.Marshal(stopChunk)
fmt.Fprintf(w, "data: %s\n\n", data)
fmt.Fprintf(w, "data: [DONE]\n\n")
if flusher != nil {
flusher.Flush()
}
}
},
)
} else {
p = parser.CreateStreamParserWithThinking(
func(text string) {
accumulated += text
chunkNum++
logger.LogStreamChunk(model, text, chunkNum)
chunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]string{"content": text}, "finish_reason": nil},
},
}
data, _ := json.Marshal(chunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
},
func(thinking string) {
chunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{"reasoning_content": thinking}, "finish_reason": nil},
},
}
data, _ := json.Marshal(chunk)
fmt.Fprintf(w, "data: %s\n\n", data)
if flusher != nil {
flusher.Flush()
}
},
func() {
logger.LogTrafficResponse(cfg.Verbose, model, accumulated, true)
stopChunk := map[string]interface{}{
"id": id, "object": "chat.completion.chunk", "created": created, "model": model,
"choices": []map[string]interface{}{
{"index": 0, "delta": map[string]interface{}{}, "finish_reason": "stop"},
},
}
data, _ := json.Marshal(stopChunk)
fmt.Fprintf(w, "data: %s\n\n", data)
fmt.Fprintf(w, "data: [DONE]\n\n")
if flusher != nil {
flusher.Flush()
}
},
)
}
configDir := l.svcCtx.AccountPool.GetNextConfigDir()
logger.LogAccountAssigned(configDir)
l.svcCtx.AccountPool.ReportRequestStart(configDir)
logger.LogRequestStart(method, pathname, model, cfg.TimeoutMs, true)
streamStart := time.Now().UnixMilli()
wrappedParser := func(line string) {
logger.LogRawLine(line)
p.Parse(line)
}
result, err := usecase.RunAgentStreamWithContext(cfg, ws.WorkspaceDir, cmdArgs, wrappedParser, ws.TempDir, configDir, l.ctx)
if l.ctx.Err() == nil {
p.Flush()
}
latencyMs := time.Now().UnixMilli() - streamStart
l.svcCtx.AccountPool.ReportRequestEnd(configDir)
if l.ctx.Err() == context.DeadlineExceeded {
logger.LogRequestTimeout(method, pathname, model, cfg.TimeoutMs)
} else if l.ctx.Err() == context.Canceled {
logger.LogClientDisconnect(method, pathname, model, latencyMs)
} else if err == nil && isRateLimited(result.Stderr) {
l.svcCtx.AccountPool.ReportRateLimit(configDir, extractRetryAfterMs(result.Stderr))
}
if err != nil || (result.Code != 0 && l.ctx.Err() == nil) {
l.svcCtx.AccountPool.ReportRequestError(configDir, latencyMs)
if err != nil {
logger.LogAgentError(cfg.SessionsLogPath, method, pathname, "", -1, err.Error())
} else {
logger.LogAgentError(cfg.SessionsLogPath, method, pathname, "", result.Code, result.Stderr)
}
logger.LogRequestDone(method, pathname, model, latencyMs, result.Code)
} else if l.ctx.Err() == nil {
l.svcCtx.AccountPool.ReportRequestSuccess(configDir, latencyMs)
logger.LogRequestDone(method, pathname, model, latencyMs, 0)
}
logger.LogAccountStats(cfg.Verbose, l.svcCtx.AccountPool.GetStats())
return nil
}
func convertMessages(msgs []apitypes.Message) []interface{} {
result := make([]interface{}, len(msgs))
for i, m := range msgs {
result[i] = map[string]interface{}{
"role": m.Role,
"content": m.Content,
}
}
return result
}
func convertTools(tools []apitypes.Tool) []interface{} {
if tools == nil {
return nil
}
result := make([]interface{}, len(tools))
for i, t := range tools {
result[i] = map[string]interface{}{
"type": t.Type,
"function": map[string]interface{}{
"name": t.Function.Name,
"description": t.Function.Description,
"parameters": t.Function.Parameters,
},
}
}
return result
}
func convertFunctions(funcs []apitypes.Function) []interface{} {
if funcs == nil {
return nil
}
result := make([]interface{}, len(funcs))
for i, f := range funcs {
result[i] = map[string]interface{}{
"name": f.Name,
"description": f.Description,
"parameters": f.Parameters,
}
}
return result
}
func configToBridge(c config.Config) config.BridgeConfig {
host := c.Host
if host == "" {
host = "0.0.0.0"
}
return config.BridgeConfig{
AgentBin: c.AgentBin,
Host: host,
Port: c.Port,
RequiredKey: c.RequiredKey,
DefaultModel: c.DefaultModel,
Mode: "ask",
Provider: c.Provider,
Force: c.Force,
ApproveMcps: c.ApproveMcps,
StrictModel: c.StrictModel,
Workspace: c.Workspace,
TimeoutMs: c.TimeoutMs,
TLSCertPath: c.TLSCertPath,
TLSKeyPath: c.TLSKeyPath,
SessionsLogPath: c.SessionsLogPath,
ChatOnlyWorkspace: c.ChatOnlyWorkspace,
Verbose: c.Verbose,
MaxMode: c.MaxMode,
ConfigDirs: c.ConfigDirs,
MultiPort: c.MultiPort,
WinCmdlineMax: c.WinCmdlineMax,
GeminiAccountDir: c.GeminiAccountDir,
GeminiBrowserVisible: c.GeminiBrowserVisible,
GeminiMaxSessions: c.GeminiMaxSessions,
}
}
// StringsToMapSlice converts string slice for compatibility
func StringsToMapSlice(ss []string) []map[string]string {
result := make([]map[string]string, len(ss))
for i, s := range ss {
result[i] = map[string]string{"content": s}
}
return result
}
// JoinStrings joins strings with newline
func JoinStrings(ss []string) string {
return strings.Join(ss, "\n")
}

@@ -0,0 +1,34 @@
// Code scaffolded by goctl. Safe to edit.
// goctl 1.10.1
package chat
import (
"context"
"cursor-api-proxy/internal/svc"
"cursor-api-proxy/internal/types"
"github.com/zeromicro/go-zero/core/logx"
)
type HealthLogic struct {
logx.Logger
ctx context.Context
svcCtx *svc.ServiceContext
}
func NewHealthLogic(ctx context.Context, svcCtx *svc.ServiceContext) *HealthLogic {
return &HealthLogic{
Logger: logx.WithContext(ctx),
ctx: ctx,
svcCtx: svcCtx,
}
}
func (l *HealthLogic) Health() (resp *types.HealthResponse, err error) {
return &types.HealthResponse{
Status: "ok",
Version: "1.0.0",
}, nil
}

@@ -0,0 +1,118 @@
package chat
import (
"context"
"sync"
"time"
"cursor-api-proxy/internal/svc"
apitypes "cursor-api-proxy/internal/types"
"cursor-api-proxy/pkg/domain/types"
"github.com/zeromicro/go-zero/core/logx"
)
const modelCacheTTLMs = 5 * 60 * 1000
type ModelCache struct {
At int64
Models []types.CursorCliModel
}
type ModelCacheRef struct {
mu sync.Mutex
cache *ModelCache
inflight bool
waiters []chan struct{}
}
var globalModelCache = &ModelCacheRef{}
type ModelsLogic struct {
logx.Logger
ctx context.Context
svcCtx *svc.ServiceContext
}
func NewModelsLogic(ctx context.Context, svcCtx *svc.ServiceContext) *ModelsLogic {
return &ModelsLogic{
Logger: logx.WithContext(ctx),
ctx: ctx,
svcCtx: svcCtx,
}
}
func (l *ModelsLogic) Models() (resp *apitypes.ModelsResponse, err error) {
now := time.Now().UnixMilli()
globalModelCache.mu.Lock()
if globalModelCache.cache != nil && now-globalModelCache.cache.At <= modelCacheTTLMs {
cache := globalModelCache.cache
globalModelCache.mu.Unlock()
return buildModelsResponse(cache.Models), nil
}
if globalModelCache.inflight {
ch := make(chan struct{}, 1)
globalModelCache.waiters = append(globalModelCache.waiters, ch)
globalModelCache.mu.Unlock()
<-ch
globalModelCache.mu.Lock()
cache := globalModelCache.cache
globalModelCache.mu.Unlock()
if cache == nil {
// The in-flight fetch failed; return an empty list instead of dereferencing nil.
return buildModelsResponse(nil), nil
}
return buildModelsResponse(cache.Models), nil
}
globalModelCache.inflight = true
globalModelCache.mu.Unlock()
fetched, err := types.ListCursorCliModels(l.svcCtx.Config.AgentBin, l.svcCtx.Config.TimeoutMs)
globalModelCache.mu.Lock()
globalModelCache.inflight = false
if err == nil {
globalModelCache.cache = &ModelCache{At: time.Now().UnixMilli(), Models: fetched}
}
waiters := globalModelCache.waiters
globalModelCache.waiters = nil
globalModelCache.mu.Unlock()
for _, ch := range waiters {
ch <- struct{}{}
}
if err != nil {
return nil, err
}
return buildModelsResponse(fetched), nil
}
func buildModelsResponse(mods []types.CursorCliModel) *apitypes.ModelsResponse {
models := make([]apitypes.ModelData, len(mods))
for i, m := range mods {
models[i] = apitypes.ModelData{
Id: m.ID,
Object: "model",
OwnedBy: "cursor",
}
}
ids := make([]string, len(mods))
for i, m := range mods {
ids[i] = m.ID
}
aliases := types.GetAnthropicModelAliases(ids)
for _, a := range aliases {
models = append(models, apitypes.ModelData{
Id: a.ID,
Object: "model",
OwnedBy: "cursor",
})
}
return &apitypes.ModelsResponse{
Object: "list",
Data: models,
}
}

@@ -1,62 +0,0 @@
package models
import (
"cursor-api-proxy/internal/process"
"fmt"
"os"
"regexp"
"strings"
)
type CursorCliModel struct {
ID string
Name string
}
var modelLineRe = regexp.MustCompile(`^([A-Za-z0-9][A-Za-z0-9._:/-]*)\s+-\s+(.*)$`)
var trailingParenRe = regexp.MustCompile(`\s*\([^)]*\)\s*$`)
func ParseCursorCliModels(output string) []CursorCliModel {
lines := strings.Split(output, "\n")
seen := make(map[string]CursorCliModel)
var order []string
for _, line := range lines {
line = strings.TrimSpace(line)
m := modelLineRe.FindStringSubmatch(line)
if m == nil {
continue
}
id := m[1]
rawName := m[2]
name := strings.TrimSpace(trailingParenRe.ReplaceAllString(rawName, ""))
if name == "" {
name = id
}
if _, exists := seen[id]; !exists {
seen[id] = CursorCliModel{ID: id, Name: name}
order = append(order, id)
}
}
result := make([]CursorCliModel, 0, len(order))
for _, id := range order {
result = append(result, seen[id])
}
return result
}
func ListCursorCliModels(agentBin string, timeoutMs int) ([]CursorCliModel, error) {
tmpDir := os.TempDir()
result, err := process.Run(agentBin, []string{"--list-models"}, process.RunOptions{
Cwd: tmpDir,
TimeoutMs: timeoutMs,
})
if err != nil {
return nil, err
}
if result.Code != 0 {
return nil, fmt.Errorf("agent --list-models failed: %s", strings.TrimSpace(result.Stderr))
}
return ParseCursorCliModels(result.Stdout), nil
}

@@ -1,33 +0,0 @@
package models
import "testing"
func TestParseCursorCliModels(t *testing.T) {
output := `
gpt-4o - GPT-4o (some info)
claude-3-5-sonnet - Claude 3.5 Sonnet
gpt-4o - GPT-4o duplicate
invalid line without dash
`
result := ParseCursorCliModels(output)
if len(result) != 2 {
t.Fatalf("expected 2 unique models, got %d: %v", len(result), result)
}
if result[0].ID != "gpt-4o" {
t.Errorf("expected gpt-4o, got %s", result[0].ID)
}
if result[0].Name != "GPT-4o" {
t.Errorf("expected 'GPT-4o', got %s", result[0].Name)
}
if result[1].ID != "claude-3-5-sonnet" {
t.Errorf("expected claude-3-5-sonnet, got %s", result[1].ID)
}
}
func TestParseCursorCliModelsEmpty(t *testing.T) {
result := ParseCursorCliModels("")
if len(result) != 0 {
t.Fatalf("expected empty, got %v", result)
}
}

@@ -1,163 +0,0 @@
package models
import "testing"
func TestGetAnthropicModelAliases_StaticOnly(t *testing.T) {
aliases := GetAnthropicModelAliases([]string{"sonnet-4.6", "opus-4.5"})
if len(aliases) != 2 {
t.Fatalf("expected 2 aliases, got %d: %v", len(aliases), aliases)
}
ids := map[string]string{}
for _, a := range aliases {
ids[a.ID] = a.Name
}
if ids["claude-sonnet-4-6"] != "Claude 4.6 Sonnet" {
t.Errorf("unexpected name for claude-sonnet-4-6: %s", ids["claude-sonnet-4-6"])
}
if ids["claude-opus-4-5"] != "Claude 4.5 Opus" {
t.Errorf("unexpected name for claude-opus-4-5: %s", ids["claude-opus-4-5"])
}
}
func TestGetAnthropicModelAliases_DynamicFallback(t *testing.T) {
aliases := GetAnthropicModelAliases([]string{"sonnet-4.7", "opus-5.0-thinking", "gpt-4o"})
ids := map[string]string{}
for _, a := range aliases {
ids[a.ID] = a.Name
}
if ids["claude-sonnet-4-7"] != "Sonnet 4.7" {
t.Errorf("unexpected name for claude-sonnet-4-7: %s", ids["claude-sonnet-4-7"])
}
if ids["claude-opus-5-0-thinking"] != "Opus 5.0 (Thinking)" {
t.Errorf("unexpected name for claude-opus-5-0-thinking: %s", ids["claude-opus-5-0-thinking"])
}
if _, ok := ids["claude-gpt-4o"]; ok {
t.Errorf("gpt-4o should not generate a claude alias")
}
}
func TestGetAnthropicModelAliases_Mixed(t *testing.T) {
aliases := GetAnthropicModelAliases([]string{"sonnet-4.6", "opus-4.7", "gpt-4o"})
ids := map[string]string{}
for _, a := range aliases {
ids[a.ID] = a.Name
}
// static entry keeps its custom name
if ids["claude-sonnet-4-6"] != "Claude 4.6 Sonnet" {
t.Errorf("static alias should keep original name, got: %s", ids["claude-sonnet-4-6"])
}
// dynamic entry uses auto-generated name
if ids["claude-opus-4-7"] != "Opus 4.7" {
t.Errorf("dynamic alias name mismatch: %s", ids["claude-opus-4-7"])
}
}
func TestGetAnthropicModelAliases_UnknownPattern(t *testing.T) {
aliases := GetAnthropicModelAliases([]string{"some-unknown-model"})
if len(aliases) != 0 {
t.Fatalf("expected 0 aliases for unknown pattern, got %d: %v", len(aliases), aliases)
}
}
func TestResolveToCursorModel_Static(t *testing.T) {
tests := []struct {
input string
want string
}{
{"claude-opus-4-6", "opus-4.6"},
{"claude-opus-4.6", "opus-4.6"},
{"claude-sonnet-4-5", "sonnet-4.5"},
{"claude-opus-4-6-thinking", "opus-4.6-thinking"},
}
for _, tc := range tests {
got := ResolveToCursorModel(tc.input)
if got != tc.want {
t.Errorf("ResolveToCursorModel(%q) = %q, want %q", tc.input, got, tc.want)
}
}
}
func TestResolveToCursorModel_DynamicFallback(t *testing.T) {
tests := []struct {
input string
want string
}{
{"claude-opus-4-7", "opus-4.7"},
{"claude-sonnet-5-0", "sonnet-5.0"},
{"claude-opus-4-7-thinking", "opus-4.7-thinking"},
{"claude-sonnet-5-0-thinking", "sonnet-5.0-thinking"},
}
for _, tc := range tests {
got := ResolveToCursorModel(tc.input)
if got != tc.want {
t.Errorf("ResolveToCursorModel(%q) = %q, want %q", tc.input, got, tc.want)
}
}
}
func TestResolveToCursorModel_Passthrough(t *testing.T) {
tests := []string{"sonnet-4.6", "gpt-4o", "custom-model"}
for _, input := range tests {
got := ResolveToCursorModel(input)
if got != input {
t.Errorf("ResolveToCursorModel(%q) = %q, want passthrough %q", input, got, input)
}
}
}
func TestResolveToCursorModel_Empty(t *testing.T) {
if got := ResolveToCursorModel(""); got != "" {
t.Errorf("ResolveToCursorModel(\"\") = %q, want empty", got)
}
if got := ResolveToCursorModel(" "); got != "" {
t.Errorf("ResolveToCursorModel(\" \") = %q, want empty", got)
}
}
func TestGenerateDynamicAlias(t *testing.T) {
tests := []struct {
input string
want AnthropicAlias
ok bool
}{
{"opus-4.7", AnthropicAlias{"claude-opus-4-7", "Opus 4.7"}, true},
{"sonnet-5.0-thinking", AnthropicAlias{"claude-sonnet-5-0-thinking", "Sonnet 5.0 (Thinking)"}, true},
{"gpt-4o", AnthropicAlias{}, false},
{"invalid", AnthropicAlias{}, false},
}
for _, tc := range tests {
got, ok := generateDynamicAlias(tc.input)
if ok != tc.ok {
t.Errorf("generateDynamicAlias(%q) ok = %v, want %v", tc.input, ok, tc.ok)
continue
}
if ok && (got.ID != tc.want.ID || got.Name != tc.want.Name) {
t.Errorf("generateDynamicAlias(%q) = {%q, %q}, want {%q, %q}", tc.input, got.ID, got.Name, tc.want.ID, tc.want.Name)
}
}
}
func TestReverseDynamicAlias(t *testing.T) {
tests := []struct {
input string
want string
ok bool
}{
{"claude-opus-4-7", "opus-4.7", true},
{"claude-sonnet-5-0-thinking", "sonnet-5.0-thinking", true},
{"claude-opus-4-6", "opus-4.6", true},
{"claude-opus-4.6", "", false},
{"claude-haiku-4-5-20251001", "", false},
{"some-model", "", false},
}
for _, tc := range tests {
got, ok := reverseDynamicAlias(tc.input)
if ok != tc.ok {
t.Errorf("reverseDynamicAlias(%q) ok = %v, want %v", tc.input, ok, tc.ok)
continue
}
if ok && got != tc.want {
t.Errorf("reverseDynamicAlias(%q) = %q, want %q", tc.input, got, tc.want)
}
}
}

@@ -1,32 +0,0 @@
package providers
import (
"context"
"cursor-api-proxy/internal/apitypes"
"cursor-api-proxy/internal/config"
"cursor-api-proxy/internal/providers/cursor"
"cursor-api-proxy/internal/providers/geminiweb"
"fmt"
)
type Provider interface {
Name() string
Close() error
Generate(ctx context.Context, model string, messages []apitypes.Message, tools []apitypes.Tool, cb func(apitypes.StreamChunk)) error
}
func NewProvider(cfg config.BridgeConfig) (Provider, error) {
providerType := cfg.Provider
if providerType == "" {
providerType = "cursor"
}
switch providerType {
case "cursor":
return cursor.NewProvider(cfg), nil
case "gemini-web":
return geminiweb.NewPlaywrightProvider(cfg)
default:
return nil, fmt.Errorf("unknown provider: %s", providerType)
}
}

@@ -1,147 +0,0 @@
package router
import (
"cursor-api-proxy/internal/config"
"cursor-api-proxy/internal/handlers"
"cursor-api-proxy/internal/httputil"
"cursor-api-proxy/internal/logger"
"cursor-api-proxy/internal/pool"
"fmt"
"net/http"
"os"
"time"
)
type RouterOptions struct {
Version string
Config config.BridgeConfig
ModelCache *handlers.ModelCacheRef
LastModel *string
Pool pool.PoolHandle
}
func NewRouter(opts RouterOptions) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
cfg := opts.Config
pathname := r.URL.Path
method := r.Method
remoteAddress := r.RemoteAddr
if r.Header.Get("X-Real-IP") != "" {
remoteAddress = r.Header.Get("X-Real-IP")
}
logger.LogIncoming(method, pathname, remoteAddress)
defer func() {
logger.AppendSessionLine(cfg.SessionsLogPath, method, pathname, remoteAddress, 200)
}()
if cfg.RequiredKey != "" {
token := httputil.ExtractBearerToken(r)
if token != cfg.RequiredKey {
httputil.WriteJSON(w, 401, map[string]interface{}{
"error": map[string]string{"message": "Invalid API key", "code": "unauthorized"},
}, nil)
return
}
}
switch {
case method == "GET" && pathname == "/health":
handlers.HandleHealth(w, r, opts.Version, cfg)
case method == "GET" && pathname == "/v1/models":
opts.ModelCache.HandleModels(w, r, cfg)
case method == "POST" && pathname == "/v1/chat/completions":
raw, err := httputil.ReadBody(r)
if err != nil {
httputil.WriteJSON(w, 400, map[string]interface{}{
"error": map[string]string{"message": "failed to read body", "code": "bad_request"},
}, nil)
return
}
// Select the handler based on the configured provider
provider := cfg.Provider
if provider == "" {
provider = "cursor"
}
if provider == "gemini-web" {
handlers.HandleGeminiChatCompletions(w, r, cfg, raw, method, pathname, remoteAddress)
} else {
handlers.HandleChatCompletions(w, r, cfg, opts.Pool, opts.LastModel, raw, method, pathname, remoteAddress)
}
case method == "POST" && pathname == "/v1/messages":
raw, err := httputil.ReadBody(r)
if err != nil {
httputil.WriteJSON(w, 400, map[string]interface{}{
"error": map[string]string{"message": "failed to read body", "code": "bad_request"},
}, nil)
return
}
handlers.HandleAnthropicMessages(w, r, cfg, opts.Pool, opts.LastModel, raw, method, pathname, remoteAddress)
case (method == "POST" || method == "GET") && pathname == "/v1/completions":
httputil.WriteJSON(w, 404, map[string]interface{}{
"error": map[string]string{
"message": "Legacy completions endpoint is not supported. Use POST /v1/chat/completions instead.",
"code": "not_found",
},
}, nil)
case pathname == "/v1/embeddings":
httputil.WriteJSON(w, 404, map[string]interface{}{
"error": map[string]string{
"message": "Embeddings are not supported by this proxy.",
"code": "not_found",
},
}, nil)
default:
httputil.WriteJSON(w, 404, map[string]interface{}{
"error": map[string]string{"message": "Not found", "code": "not_found"},
}, nil)
}
}
}
func recoveryMiddleware(logPath string, next http.HandlerFunc) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
defer func() {
if rec := recover(); rec != nil {
msg := fmt.Sprintf("%v", rec)
fmt.Fprintf(os.Stderr, "[%s] Proxy panic: %s\n", time.Now().UTC().Format(time.RFC3339), msg)
line := fmt.Sprintf("%s ERROR %s %s %s %s\n",
time.Now().UTC().Format(time.RFC3339), r.Method, r.URL.Path, r.RemoteAddr,
msg[:min(200, len(msg))])
if f, err := os.OpenFile(logPath, os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0644); err == nil {
_, _ = f.WriteString(line)
f.Close()
}
if !isHeaderWritten(w) {
httputil.WriteJSON(w, 500, map[string]interface{}{
"error": map[string]string{"message": msg, "code": "internal_error"},
}, nil)
}
}
}()
next(w, r)
}
}
func isHeaderWritten(w http.ResponseWriter) bool {
// Can't reliably detect without wrapping; always try to write
return false
}
func min(a, b int) int {
if a < b {
return a
}
return b
}
func WrapWithRecovery(logPath string, handler http.HandlerFunc) http.HandlerFunc {
return recoveryMiddleware(logPath, handler)
}

View File

@ -1,159 +0,0 @@
package server
import (
"context"
"crypto/tls"
"cursor-api-proxy/internal/config"
"cursor-api-proxy/internal/handlers"
"cursor-api-proxy/internal/pool"
"cursor-api-proxy/internal/process"
"cursor-api-proxy/internal/logger"
"cursor-api-proxy/internal/router"
"fmt"
"net/http"
"os"
"os/signal"
"syscall"
"time"
)
type ServerOptions struct {
Version string
Config config.BridgeConfig
Pool pool.PoolHandle
}
func StartBridgeServer(opts ServerOptions) []*http.Server {
cfg := opts.Config
var servers []*http.Server
if len(cfg.ConfigDirs) > 0 {
if cfg.MultiPort {
for i, dir := range cfg.ConfigDirs {
port := cfg.Port + i
subCfg := cfg
subCfg.Port = port
subCfg.ConfigDirs = []string{dir}
subCfg.MultiPort = false
subPool := pool.NewAccountPool([]string{dir})
srv := startSingleServer(ServerOptions{Version: opts.Version, Config: subCfg, Pool: subPool})
servers = append(servers, srv)
}
return servers
}
pool.InitAccountPool(cfg.ConfigDirs)
}
servers = append(servers, startSingleServer(opts))
return servers
}
func startSingleServer(opts ServerOptions) *http.Server {
cfg := opts.Config
modelCache := &handlers.ModelCacheRef{}
lastModel := cfg.DefaultModel
ph := opts.Pool
if ph == nil {
ph = pool.GlobalPoolHandle{}
}
handler := router.NewRouter(router.RouterOptions{
Version: opts.Version,
Config: cfg,
ModelCache: modelCache,
LastModel: &lastModel,
Pool: ph,
})
handler = router.WrapWithRecovery(cfg.SessionsLogPath, handler)
useTLS := cfg.TLSCertPath != "" && cfg.TLSKeyPath != ""
srv := &http.Server{
Addr: fmt.Sprintf("%s:%d", cfg.Host, cfg.Port),
Handler: handler,
}
if useTLS {
cert, err := tls.LoadX509KeyPair(cfg.TLSCertPath, cfg.TLSKeyPath)
if err != nil {
fmt.Fprintf(os.Stderr, "TLS error: %v\n", err)
os.Exit(1)
}
srv.TLSConfig = &tls.Config{Certificates: []tls.Certificate{cert}}
}
scheme := "http"
if useTLS {
scheme = "https"
}
go func() {
var err error
if useTLS {
err = srv.ListenAndServeTLS("", "")
} else {
err = srv.ListenAndServe()
}
if err != nil && err != http.ErrServerClosed {
if isAddrInUse(err) {
fmt.Fprintf(os.Stderr, "❌ Port %d is already in use. Set CURSOR_BRIDGE_PORT to use a different port.\n", cfg.Port)
} else {
fmt.Fprintf(os.Stderr, "❌ Server error: %v\n", err)
}
os.Exit(1)
}
}()
logger.LogServerStart(opts.Version, scheme, cfg.Host, cfg.Port, cfg)
return srv
}
func SetupGracefulShutdown(servers []*http.Server, timeoutMs int) {
sigCh := make(chan os.Signal, 1)
signal.Notify(sigCh, syscall.SIGTERM, syscall.SIGINT)
go func() {
sig := <-sigCh
logger.LogShutdown(sig.String())
process.KillAllChildProcesses()
ctx, cancel := context.WithTimeout(context.Background(), time.Duration(timeoutMs)*time.Millisecond)
defer cancel()
done := make(chan struct{})
go func() {
for _, srv := range servers {
_ = srv.Shutdown(ctx)
}
close(done)
}()
select {
case <-done:
os.Exit(0)
case <-ctx.Done():
fmt.Fprintln(os.Stderr, "[shutdown] Timed out waiting for connections to drain — forcing exit.")
os.Exit(1)
}
}()
}
func isAddrInUse(err error) bool {
return err != nil && (contains(err.Error(), "address already in use") || contains(err.Error(), "bind: address already in use"))
}
func contains(s, sub string) bool {
return len(s) >= len(sub) && (s == sub || len(s) > 0 && containsHelper(s, sub))
}
func containsHelper(s, sub string) bool {
for i := 0; i <= len(s)-len(sub); i++ {
if s[i:i+len(sub)] == sub {
return true
}
}
return false
}

View File

@ -1,331 +0,0 @@
package server_test
import (
"cursor-api-proxy/internal/config"
"cursor-api-proxy/internal/server"
"encoding/json"
"fmt"
"io"
"net"
"context"
"net/http"
"os"
"strings"
"testing"
"time"
)
// freePort returns a temporarily available random port
func freePort(t *testing.T) int {
t.Helper()
l, err := net.Listen("tcp", "127.0.0.1:0")
if err != nil {
t.Fatal(err)
}
port := l.Addr().(*net.TCPAddr).Port
l.Close()
return port
}
// makeFakeAgentBin creates a shell script that simulates an agent with fixed output
// sync mode: prints a single line of text
// stream mode: prints JSON stream lines
func makeFakeAgentBin(t *testing.T, syncOutput string) string {
t.Helper()
dir := t.TempDir()
script := dir + "/agent"
content := fmt.Sprintf(`#!/bin/sh
# If --stream-json is present, emit the stream format
for arg; do
if [ "$arg" = "--stream-json" ]; then
printf '%%s\n' '{"type":"assistant","message":{"content":[{"type":"text","text":"%s"}]}}'
printf '%%s\n' '{"type":"result","subtype":"success"}'
exit 0
fi
done
# Otherwise emit the sync format
printf '%%s' '%s'
`, syncOutput, syncOutput)
if err := os.WriteFile(script, []byte(content), 0755); err != nil {
t.Fatal(err)
}
return script
}
// makeFakeAgentBinWithModels additionally supports --list-models output
func makeFakeAgentBinWithModels(t *testing.T) string {
t.Helper()
dir := t.TempDir()
script := dir + "/agent"
content := `#!/bin/sh
for arg; do
if [ "$arg" = "--list-models" ]; then
printf 'claude-3-opus - Claude 3 Opus\n'
printf 'claude-3-sonnet - Claude 3 Sonnet\n'
exit 0
fi
if [ "$arg" = "--stream-json" ]; then
printf '%s\n' '{"type":"assistant","message":{"content":[{"type":"text","text":"Hello"}]}}'
printf '%s\n' '{"type":"result","subtype":"success"}'
exit 0
fi
done
printf 'Hello from agent'
`
if err := os.WriteFile(script, []byte(content), 0755); err != nil {
t.Fatal(err)
}
return script
}
func makeTestConfig(agentBin string, port int, overrides ...func(*config.BridgeConfig)) config.BridgeConfig {
cfg := config.BridgeConfig{
AgentBin: agentBin,
Host: "127.0.0.1",
Port: port,
DefaultModel: "auto",
Mode: "ask",
Force: false,
ApproveMcps: false,
StrictModel: true,
Workspace: os.TempDir(),
TimeoutMs: 30000,
SessionsLogPath: os.TempDir() + "/test-sessions.log",
ChatOnlyWorkspace: true,
Verbose: false,
MaxMode: false,
ConfigDirs: []string{},
MultiPort: false,
WinCmdlineMax: 30000,
}
for _, fn := range overrides {
fn(&cfg)
}
return cfg
}
func waitListening(t *testing.T, host string, port int, timeout time.Duration) {
t.Helper()
deadline := time.Now().Add(timeout)
for time.Now().Before(deadline) {
conn, err := net.DialTimeout("tcp", fmt.Sprintf("%s:%d", host, port), 50*time.Millisecond)
if err == nil {
conn.Close()
return
}
time.Sleep(20 * time.Millisecond)
}
t.Fatalf("server on port %d did not start within %v", port, timeout)
}
func doRequest(t *testing.T, method, url, body string, headers map[string]string) (int, string) {
t.Helper()
var reqBody io.Reader
if body != "" {
reqBody = strings.NewReader(body)
}
req, err := http.NewRequest(method, url, reqBody)
if err != nil {
t.Fatal(err)
}
if body != "" {
req.Header.Set("Content-Type", "application/json")
}
for k, v := range headers {
req.Header.Set(k, v)
}
resp, err := http.DefaultClient.Do(req)
if err != nil {
t.Fatal(err)
}
defer resp.Body.Close()
data, _ := io.ReadAll(resp.Body)
return resp.StatusCode, string(data)
}
func TestBridgeServer_Health(t *testing.T) {
port := freePort(t)
agentBin := makeFakeAgentBinWithModels(t)
cfg := makeTestConfig(agentBin, port)
srvs := server.StartBridgeServer(server.ServerOptions{Version: "1.0.0", Config: cfg})
waitListening(t, "127.0.0.1", port, 3*time.Second)
defer func() {
for _, s := range srvs {
s.Shutdown(context.Background())
}
}()
status, body := doRequest(t, "GET", fmt.Sprintf("http://127.0.0.1:%d/health", port), "", nil)
if status != 200 {
t.Fatalf("status = %d, want 200; body: %s", status, body)
}
var result map[string]interface{}
json.Unmarshal([]byte(body), &result)
if result["ok"] != true {
t.Errorf("ok = %v, want true", result["ok"])
}
if result["version"] != "1.0.0" {
t.Errorf("version = %v, want 1.0.0", result["version"])
}
}
func TestBridgeServer_Models(t *testing.T) {
port := freePort(t)
agentBin := makeFakeAgentBinWithModels(t)
cfg := makeTestConfig(agentBin, port)
srvs := server.StartBridgeServer(server.ServerOptions{Version: "1.0.0", Config: cfg})
waitListening(t, "127.0.0.1", port, 3*time.Second)
defer func() {
for _, s := range srvs {
s.Shutdown(context.Background())
}
}()
status, body := doRequest(t, "GET", fmt.Sprintf("http://127.0.0.1:%d/v1/models", port), "", nil)
if status != 200 {
t.Fatalf("status = %d, want 200; body: %s", status, body)
}
var result map[string]interface{}
json.Unmarshal([]byte(body), &result)
if result["object"] != "list" {
t.Errorf("object = %v, want list", result["object"])
}
data := result["data"].([]interface{})
if len(data) < 2 {
t.Errorf("data len = %d, want >= 2", len(data))
}
}
func TestBridgeServer_Unauthorized(t *testing.T) {
port := freePort(t)
agentBin := makeFakeAgentBinWithModels(t)
cfg := makeTestConfig(agentBin, port, func(c *config.BridgeConfig) {
c.RequiredKey = "secret123"
})
srvs := server.StartBridgeServer(server.ServerOptions{Version: "1.0.0", Config: cfg})
waitListening(t, "127.0.0.1", port, 3*time.Second)
defer func() {
for _, s := range srvs {
s.Shutdown(context.Background())
}
}()
status, body := doRequest(t, "GET", fmt.Sprintf("http://127.0.0.1:%d/health", port), "", nil)
if status != 401 {
t.Fatalf("status = %d, want 401; body: %s", status, body)
}
var result map[string]interface{}
json.Unmarshal([]byte(body), &result)
errObj := result["error"].(map[string]interface{})
if errObj["message"] != "Invalid API key" {
t.Errorf("message = %v, want 'Invalid API key'", errObj["message"])
}
}
func TestBridgeServer_AuthorizedKey(t *testing.T) {
port := freePort(t)
agentBin := makeFakeAgentBinWithModels(t)
cfg := makeTestConfig(agentBin, port, func(c *config.BridgeConfig) {
c.RequiredKey = "secret123"
})
srvs := server.StartBridgeServer(server.ServerOptions{Version: "1.0.0", Config: cfg})
waitListening(t, "127.0.0.1", port, 3*time.Second)
defer func() {
for _, s := range srvs {
s.Shutdown(context.Background())
}
}()
status, _ := doRequest(t, "GET", fmt.Sprintf("http://127.0.0.1:%d/health", port), "", map[string]string{
"Authorization": "Bearer secret123",
})
if status != 200 {
t.Errorf("status = %d, want 200", status)
}
}
func TestBridgeServer_NotFound(t *testing.T) {
port := freePort(t)
agentBin := makeFakeAgentBinWithModels(t)
cfg := makeTestConfig(agentBin, port)
srvs := server.StartBridgeServer(server.ServerOptions{Version: "1.0.0", Config: cfg})
waitListening(t, "127.0.0.1", port, 3*time.Second)
defer func() {
for _, s := range srvs {
s.Shutdown(context.Background())
}
}()
status, body := doRequest(t, "GET", fmt.Sprintf("http://127.0.0.1:%d/unknown", port), "", nil)
if status != 404 {
t.Fatalf("status = %d, want 404; body: %s", status, body)
}
var result map[string]interface{}
json.Unmarshal([]byte(body), &result)
errObj := result["error"].(map[string]interface{})
if errObj["code"] != "not_found" {
t.Errorf("code = %v, want not_found", errObj["code"])
}
}
func TestBridgeServer_ChatCompletions_Sync(t *testing.T) {
port := freePort(t)
agentBin := makeFakeAgentBin(t, "Hello from agent")
cfg := makeTestConfig(agentBin, port)
srvs := server.StartBridgeServer(server.ServerOptions{Version: "1.0.0", Config: cfg})
waitListening(t, "127.0.0.1", port, 3*time.Second)
defer func() {
for _, s := range srvs {
s.Shutdown(context.Background())
}
}()
reqBody := `{"model":"claude-3-opus","messages":[{"role":"user","content":"Hi"}]}`
status, body := doRequest(t, "POST", fmt.Sprintf("http://127.0.0.1:%d/v1/chat/completions", port), reqBody, nil)
if status != 200 {
t.Fatalf("status = %d, want 200; body: %s", status, body)
}
var result map[string]interface{}
json.Unmarshal([]byte(body), &result)
if result["object"] != "chat.completion" {
t.Errorf("object = %v, want chat.completion", result["object"])
}
choices := result["choices"].([]interface{})
msg := choices[0].(map[string]interface{})["message"].(map[string]interface{})
if msg["content"] != "Hello from agent" {
t.Errorf("content = %v, want 'Hello from agent'", msg["content"])
}
}
func TestBridgeServer_MultiPort(t *testing.T) {
basePort := freePort(t)
agentBin := makeFakeAgentBinWithModels(t)
dir1 := t.TempDir()
dir2 := t.TempDir()
cfg := makeTestConfig(agentBin, basePort, func(c *config.BridgeConfig) {
c.ConfigDirs = []string{dir1, dir2}
c.MultiPort = true
})
srvs := server.StartBridgeServer(server.ServerOptions{Version: "1.0.0", Config: cfg})
if len(srvs) != 2 {
t.Fatalf("got %d servers, want 2", len(srvs))
}
// Wait for both servers to start (ports may conflict; port allocation is not strictly tested here)
time.Sleep(200 * time.Millisecond)
defer func() {
for _, s := range srvs {
s.Shutdown(context.Background())
}
}()
}

View File

@ -0,0 +1,28 @@
package svc
import (
"cursor-api-proxy/internal/config"
domainrepo "cursor-api-proxy/pkg/domain/repository"
"cursor-api-proxy/pkg/repository"
)
type ServiceContext struct {
Config config.Config
// Domain services
AccountPool domainrepo.AccountPool
// Last model for sticky model mode
LastModel *string
}
func NewServiceContext(c config.Config) *ServiceContext {
accountPool := repository.NewAccountPool(c.ConfigDirs)
lastModel := c.DefaultModel
return &ServiceContext{
Config: c,
AccountPool: accountPool,
LastModel: &lastModel,
}
}

internal/types/types.go Normal file
View File

@ -0,0 +1,133 @@
// Code generated by goctl. DO NOT EDIT.
// goctl 1.10.1
package types
type AnthropicRequest struct {
Model string `json:"model"`
Messages []Message `json:"messages"`
MaxTokens int `json:"max_tokens"`
Stream bool `json:"stream,optional"`
System string `json:"system,optional"`
Tools []Tool `json:"tools,optional"`
}
type AnthropicResponse struct {
Id string `json:"id"`
Type string `json:"type"`
Role string `json:"role"`
Content []ContentBlock `json:"content"`
Model string `json:"model"`
Usage AnthropicUsage `json:"usage"`
}
type AnthropicUsage struct {
InputTokens int `json:"input_tokens"`
OutputTokens int `json:"output_tokens"`
}
type ChatCompletionRequest struct {
Model string `json:"model"`
Messages []Message `json:"messages"`
Stream bool `json:"stream,optional"`
Tools []Tool `json:"tools,optional"`
Functions []Function `json:"functions,optional"`
MaxTokens int `json:"max_tokens,optional"`
Temperature float64 `json:"temperature,optional"`
}
type ChatCompletionResponse struct {
Id string `json:"id"`
Object string `json:"object"`
Created int64 `json:"created"`
Model string `json:"model"`
Choices []Choice `json:"choices"`
Usage Usage `json:"usage"`
}
type Choice struct {
Index int `json:"index"`
Message RespMessage `json:"message,optional"`
Delta Delta `json:"delta,optional"`
FinishReason string `json:"finish_reason"`
}
type ContentBlock struct {
Type string `json:"type"`
Text string `json:"text,optional"`
}
type Delta struct {
Role string `json:"role,optional"`
Content string `json:"content,optional"`
ReasoningContent string `json:"reasoning_content,optional"`
ToolCalls []ToolCall `json:"tool_calls,optional"`
}
type Function struct {
Name string `json:"name"`
Description string `json:"description,optional"`
Parameters interface{} `json:"parameters,optional"`
}
type FunctionCall struct {
Name string `json:"name"`
Arguments string `json:"arguments"`
}
type HealthRequest struct {
}
type HealthResponse struct {
Status string `json:"status"`
Version string `json:"version"`
}
type Message struct {
Role string `json:"role"`
Content interface{} `json:"content"`
}
type ModelData struct {
Id string `json:"id"`
Object string `json:"object"`
OwnedBy string `json:"owned_by"`
}
type ModelsRequest struct {
}
type ModelsResponse struct {
Object string `json:"object"`
Data []ModelData `json:"data"`
}
type RespMessage struct {
Role string `json:"role"`
Content string `json:"content,optional"`
ToolCalls []ToolCall `json:"tool_calls,optional"`
}
type Tool struct {
Type string `json:"type"`
Function ToolFunction `json:"function"`
}
type ToolCall struct {
Index int `json:"index"`
Id string `json:"id"`
Type string `json:"type"`
Function FunctionCall `json:"function"`
}
type ToolFunction struct {
Name string `json:"name"`
Description string `json:"description"`
Parameters interface{} `json:"parameters"`
}
type Usage struct {
PromptTokens int `json:"prompt_tokens"`
CompletionTokens int `json:"completion_tokens"`
TotalTokens int `json:"total_tokens"`
}

main.go
View File

@ -1,26 +1,56 @@
 package main
 import (
-	"cursor-api-proxy/cmd"
-	"cursor-api-proxy/internal/config"
-	"cursor-api-proxy/internal/env"
-	"cursor-api-proxy/internal/server"
+	"flag"
 	"fmt"
 	"os"
+	"cursor-api-proxy/internal/config"
+	"cursor-api-proxy/internal/handler"
+	"cursor-api-proxy/internal/svc"
+	cmd "cursor-api-proxy/cmd/cli"
+	"github.com/zeromicro/go-zero/core/conf"
+	"github.com/zeromicro/go-zero/rest"
 )
 const version = "1.0.0"
+var configFile = flag.String("f", "etc/chat-api.yaml", "the config file")
 func main() {
-	args, err := cmd.ParseArgs(os.Args[1:])
-	if err != nil {
-		fmt.Fprintf(os.Stderr, "Error: %v\n", err)
-		os.Exit(1)
+	// Check for CLI commands first (before flag.Parse)
+	args := os.Args[1:]
+	if len(args) > 0 {
+		parsed, err := cmd.ParseArgs(args)
+		if err != nil {
+			// Not a CLI command, proceed to HTTP server
+		} else if handleCLICommand(parsed) {
+			return
+		}
 	}
+	// HTTP server mode (go-zero)
+	flag.Parse()
+	var c config.Config
+	conf.MustLoad(*configFile, &c)
+	server := rest.MustNewServer(c.RestConf)
+	defer server.Stop()
+	ctx := svc.NewServiceContext(c)
+	handler.RegisterHandlers(server, ctx)
+	fmt.Printf("Starting server at %s:%d...\n", c.Host, c.Port)
+	server.Start()
+}
+func handleCLICommand(args cmd.ParsedArgs) bool {
 	if args.Help {
 		cmd.PrintHelp(version)
-		return
+		return true
 	}
 	if args.Login {
@ -28,7 +58,7 @@ func main() {
 			fmt.Fprintf(os.Stderr, "Error: %v\n", err)
 			os.Exit(1)
 		}
-		return
+		return true
 	}
 	if args.Logout {
@ -36,7 +66,7 @@ func main() {
 			fmt.Fprintf(os.Stderr, "Error: %v\n", err)
 			os.Exit(1)
 		}
-		return
+		return true
 	}
 	if args.AccountsList {
@ -44,7 +74,7 @@ func main() {
 			fmt.Fprintf(os.Stderr, "Error: %v\n", err)
 			os.Exit(1)
 		}
-		return
+		return true
 	}
 	if args.ResetHwid {
@ -52,22 +82,9 @@ func main() {
 			fmt.Fprintf(os.Stderr, "Error: %v\n", err)
 			os.Exit(1)
 		}
-		return
+		return true
 	}
-	e := env.OsEnvToMap()
-	if args.Tailscale {
-		e["CURSOR_BRIDGE_HOST"] = "0.0.0.0"
-	}
-	cwd, _ := os.Getwd()
-	cfg := config.LoadBridgeConfig(e, cwd)
-	servers := server.StartBridgeServer(server.ServerOptions{
-		Version: version,
-		Config:  cfg,
-	})
-	server.SetupGracefulShutdown(servers, 10000)
-	select {}
+	// Not a CLI command
+	return false
 }

View File

@ -1,7 +1,7 @@
 package anthropic
 import (
-	"cursor-api-proxy/internal/openai"
+	"cursor-api-proxy/pkg/adapter/openai"
 	"encoding/json"
 	"fmt"
 	"strings"

View File

@ -1,7 +1,7 @@
 package anthropic_test
 import (
-	"cursor-api-proxy/internal/anthropic"
+	"cursor-api-proxy/pkg/adapter/anthropic"
 	"strings"
 	"testing"
 )

View File

@ -0,0 +1,22 @@
package entity
// Account represents an account in the pool
type Account struct {
ConfigDir string
ActiveRequests int
LastUsed int64
RateLimitUntil int64
}
// AccountStat represents account statistics
type AccountStat struct {
ConfigDir string
ActiveRequests int
TotalRequests int
TotalSuccess int
TotalErrors int
TotalRateLimits int
TotalLatencyMs int64
IsRateLimited bool
RateLimitUntil int64
}

View File

@ -0,0 +1,20 @@
package entity
// ChunkType represents the type of stream chunk
type ChunkType int
const (
ChunkText ChunkType = iota
ChunkThinking
ChunkToolCall
ChunkDone
)
// StreamChunk represents a chunk in SSE streaming
type StreamChunk struct {
Type ChunkType
Text string
Thinking string
ToolCall *ToolCall
Done bool
}
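A `StreamChunk` consumer typically accumulates `ChunkText` payloads from the callback until the `ChunkDone` marker arrives. A self-contained sketch with local stand-in types (not the `entity` package itself):

```go
package main

import "fmt"

// Local mirrors of the entity.StreamChunk shape: a tagged union
// delivered through a callback.
type chunkType int

const (
	chunkText chunkType = iota
	chunkThinking
	chunkDone
)

type streamChunk struct {
	Type chunkType
	Text string
	Done bool
}

// collect accumulates text chunks emitted through the callback.
func collect(emit func(cb func(streamChunk))) string {
	var out string
	emit(func(c streamChunk) {
		if c.Type == chunkText {
			out += c.Text
		}
	})
	return out
}

func main() {
	got := collect(func(cb func(streamChunk)) {
		cb(streamChunk{Type: chunkText, Text: "Hel"})
		cb(streamChunk{Type: chunkText, Text: "lo"})
		cb(streamChunk{Type: chunkDone, Done: true})
	})
	fmt.Println(got) // Hello
}
```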

View File

@ -0,0 +1,33 @@
package entity
// Message represents a chat message
type Message struct {
Role string
Content interface{}
}
// Tool represents a tool definition
type Tool struct {
Type string
Function ToolFunction
}
// ToolFunction represents a tool function definition
type ToolFunction struct {
Name string
Description string
Parameters interface{}
}
// ToolCall represents a tool call result
type ToolCall struct {
ID string
Name string
Arguments string
}
// FunctionCall represents a function call
type FunctionCall struct {
Name string
Arguments string
}

View File

@ -0,0 +1,27 @@
package repository
import (
"context"
"cursor-api-proxy/pkg/domain/entity"
)
// AccountPool defines the interface for account pool management
type AccountPool interface {
GetNextConfigDir() string
ReportRequestStart(configDir string)
ReportRequestEnd(configDir string)
ReportRequestSuccess(configDir string, latencyMs int64)
ReportRequestError(configDir string, latencyMs int64)
ReportRateLimit(configDir string, penaltyMs int64)
GetStats() []entity.AccountStat
Count() int
}
// Provider defines the interface for AI providers
type Provider interface {
Name() string
Generate(ctx context.Context, model string, messages []entity.Message,
tools []entity.Tool, callback func(entity.StreamChunk)) error
Close() error
}
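For the rotation part of the `AccountPool` interface, a minimal sketch of one plausible implementation: round-robin over config dirs, skipping rate-limited accounts. The type and the method subset here are illustrative, not the real `repository` package:

```go
package main

import (
	"fmt"
	"sync"
)

// roundRobinPool hands out config dirs in rotation, skipping
// accounts that have reported a rate limit.
type roundRobinPool struct {
	mu      sync.Mutex
	dirs    []string
	next    int
	limited map[string]bool
}

func newPool(dirs []string) *roundRobinPool {
	return &roundRobinPool{dirs: dirs, limited: map[string]bool{}}
}

func (p *roundRobinPool) GetNextConfigDir() string {
	p.mu.Lock()
	defer p.mu.Unlock()
	for i := 0; i < len(p.dirs); i++ {
		d := p.dirs[p.next%len(p.dirs)]
		p.next++
		if !p.limited[d] {
			return d
		}
	}
	return "" // all accounts exhausted
}

func (p *roundRobinPool) ReportRateLimit(dir string) {
	p.mu.Lock()
	defer p.mu.Unlock()
	p.limited[dir] = true
}

func main() {
	p := newPool([]string{"a", "b"})
	fmt.Println(p.GetNextConfigDir(), p.GetNextConfigDir(), p.GetNextConfigDir()) // a b a
	p.ReportRateLimit("a")
	fmt.Println(p.GetNextConfigDir()) // b
}
```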

View File

@ -0,0 +1,13 @@
package types
import "errors"
var (
ErrInvalidRequest = errors.New("invalid request")
ErrProviderNotFound = errors.New("provider not found")
ErrAccountExhausted = errors.New("all accounts exhausted")
ErrRateLimited = errors.New("rate limited")
ErrTimeout = errors.New("request timeout")
ErrClientDisconnect = errors.New("client disconnected")
ErrAgentError = errors.New("agent execution error")
)

View File

@ -1,10 +1,25 @@
-package models
+package types
 import (
+	"fmt"
+	"os"
 	"regexp"
 	"strings"
+	"cursor-api-proxy/pkg/infrastructure/process"
 )
+type CursorCliModel struct {
+	ID   string
+	Name string
+}
+type ModelAlias struct {
+	CursorID    string
+	AnthropicID string
+	Name        string
+}
 var anthropicToCursor = map[string]string{
 	"claude-opus-4-6": "opus-4.6",
 	"claude-opus-4.6": "opus-4.6",
@ -24,12 +39,12 @@ var anthropicToCursor = map[string]string{
 	"claude-sonnet-4-6-thinking": "sonnet-4.6-thinking",
 	"claude-opus-4-5-thinking":   "opus-4.5-thinking",
 	"claude-sonnet-4-5-thinking": "sonnet-4.5-thinking",
-}
-type ModelAlias struct {
-	CursorID    string
-	AnthropicID string
-	Name        string
+	"claude-3-5-sonnet":          "claude-3.5-sonnet",
+	"claude-3-5-sonnet-20241022": "claude-3.5-sonnet",
+	"claude-3-5-haiku":           "claude-3.5-haiku",
+	"claude-3-opus":              "claude-3-opus",
+	"claude-3-sonnet":            "claude-3-sonnet",
+	"claude-3-haiku":             "claude-3-haiku",
 }
 var cursorToAnthropicAlias = []ModelAlias{
@ -43,13 +58,61 @@ var cursorToAnthropicAlias = []ModelAlias{
 	{"sonnet-4.5-thinking", "claude-sonnet-4-5-thinking", "Claude 4.5 Sonnet (Thinking)"},
 }
-// cursorModelPattern matches cursor model IDs like "opus-4.6", "sonnet-4.7-thinking".
+var modelLineRe = regexp.MustCompile(`^([A-Za-z0-9][A-Za-z0-9._:/-]*)\s+-\s+(.*)$`)
+var trailingParenRe = regexp.MustCompile(`\s*\([^)]*\)\s*$`)
 var cursorModelPattern = regexp.MustCompile(`^([a-zA-Z]+)-(\d+)\.(\d+)(-thinking)?$`)
-// reverseDynamicPattern matches dynamically generated anthropic aliases
-// like "claude-opus-4-7", "claude-sonnet-4-7-thinking".
 var reverseDynamicPattern = regexp.MustCompile(`^claude-([a-zA-Z]+)-(\d+)-(\d+)(-thinking)?$`)
+type AnthropicAlias struct {
+	ID   string
+	Name string
+}
+func ParseCursorCliModels(output string) []CursorCliModel {
+	lines := strings.Split(output, "\n")
+	seen := make(map[string]CursorCliModel)
+	var order []string
+	for _, line := range lines {
+		line = strings.TrimSpace(line)
+		m := modelLineRe.FindStringSubmatch(line)
+		if m == nil {
+			continue
+		}
+		id := m[1]
+		rawName := m[2]
+		name := strings.TrimSpace(trailingParenRe.ReplaceAllString(rawName, ""))
+		if name == "" {
+			name = id
+		}
+		if _, exists := seen[id]; !exists {
+			seen[id] = CursorCliModel{ID: id, Name: name}
+			order = append(order, id)
+		}
+	}
+	result := make([]CursorCliModel, 0, len(order))
+	for _, id := range order {
+		result = append(result, seen[id])
+	}
+	return result
+}
+func ListCursorCliModels(agentBin string, timeoutMs int) ([]CursorCliModel, error) {
+	tmpDir := os.TempDir()
+	result, err := process.Run(agentBin, []string{"--print-models_oneline"}, process.RunOptions{
+		Cwd:       tmpDir,
+		TimeoutMs: timeoutMs,
+	})
+	if err != nil {
+		return nil, err
+	}
+	if result.Code != 0 {
+		return nil, fmt.Errorf("cursor cli failed: %s", result.Stderr)
+	}
+	return ParseCursorCliModels(result.Stdout), nil
+}
 func generateDynamicAlias(cursorID string) (AnthropicAlias, bool) {
 	m := cursorModelPattern.FindStringSubmatch(cursorID)
 	if m == nil {
@ -78,41 +141,29 @@ func reverseDynamicAlias(anthropicID string) (string, bool) {
 }
 func ResolveToCursorModel(requested string) string {
-	if strings.TrimSpace(requested) == "" {
-		return ""
+	if mapped, ok := anthropicToCursor[requested]; ok {
+		return mapped
 	}
-	key := strings.ToLower(strings.TrimSpace(requested))
-	if v, ok := anthropicToCursor[key]; ok {
-		return v
+	if cursorID, ok := reverseDynamicAlias(requested); ok {
+		return cursorID
 	}
-	if v, ok := reverseDynamicAlias(key); ok {
-		return v
-	}
-	return strings.TrimSpace(requested)
+	return requested
 }
-type AnthropicAlias struct {
-	ID   string
-	Name string
-}
-func GetAnthropicModelAliases(availableCursorIDs []string) []AnthropicAlias {
-	set := make(map[string]bool, len(availableCursorIDs))
-	for _, id := range availableCursorIDs {
-		set[id] = true
-	}
-	staticSet := make(map[string]bool, len(cursorToAnthropicAlias))
-	var result []AnthropicAlias
+func GetAnthropicModelAliases(cursorIDs []string) []AnthropicAlias {
+	result := make([]AnthropicAlias, 0, len(cursorToAnthropicAlias)+len(cursorIDs))
+	seen := make(map[string]bool)
 	for _, a := range cursorToAnthropicAlias {
-		if set[a.CursorID] {
-			staticSet[a.CursorID] = true
-			result = append(result, AnthropicAlias{ID: a.AnthropicID, Name: a.Name})
-		}
+		result = append(result, AnthropicAlias{
+			ID:   a.AnthropicID,
+			Name: a.Name,
+		})
+		seen[a.CursorID] = true
 	}
-	for _, id := range availableCursorIDs {
-		if staticSet[id] {
+	for _, id := range cursorIDs {
+		if seen[id] {
 			continue
 		}
 		if alias, ok := generateDynamicAlias(id); ok {

View File

@ -0,0 +1,47 @@
package usecase
import (
"context"
"cursor-api-proxy/pkg/domain/entity"
)
// ChatUsecase defines the interface for chat operations
type ChatUsecase interface {
Execute(ctx context.Context, input ChatInput) (ChatOutput, error)
Stream(ctx context.Context, input ChatInput, callback func(entity.StreamChunk)) error
}
// ChatInput represents the input for chat operations
type ChatInput struct {
Model string
Messages []entity.Message
Tools []entity.Tool
Stream bool
}
// ChatOutput represents the output from chat operations
type ChatOutput struct {
Content string
Thinking string
ToolCalls []entity.ToolCall
}
// AgentRunner defines the interface for running AI agents
type AgentRunner interface {
RunSync(ctx context.Context, config interface{}, args []string) (RunResult, error)
RunStream(ctx context.Context, config interface{}, args []string, onLine func(string)) (StreamResult, error)
}
// RunResult represents the result of a synchronous agent run
type RunResult struct {
Code int
Stdout string
Stderr string
}
// StreamResult represents the result of a streaming agent run
type StreamResult struct {
Code int
Stderr string
}

View File

@ -1,13 +1,14 @@
 package logger
 import (
-	"cursor-api-proxy/internal/config"
-	"cursor-api-proxy/internal/pool"
 	"fmt"
 	"os"
 	"path/filepath"
 	"strings"
 	"time"
+	"cursor-api-proxy/internal/config"
+	"cursor-api-proxy/pkg/domain/entity"
 )
 const (
@ -192,7 +193,7 @@ func LogAccountAssigned(configDir string) {
 	fmt.Printf("%s %s→%s account %s%s%s\n", ts(), cBCyan, cReset, cBold, name, cReset)
 }
-func LogAccountStats(verbose bool, stats []pool.AccountStat) {
+func LogAccountStats(verbose bool, stats []entity.AccountStat) {
 	if !verbose || len(stats) == 0 {
 		return
 	}

View File

@ -2,7 +2,7 @@ package process_test
 import (
 	"context"
-	"cursor-api-proxy/internal/process"
+	"cursor-api-proxy/pkg/infrastructure/process"
 	"os"
 	"testing"
 	"time"

View File

@ -3,7 +3,7 @@ package process
 import (
 	"bufio"
 	"context"
-	"cursor-api-proxy/internal/env"
+	"cursor-api-proxy/pkg/infrastructure/env"
 	"fmt"
 	"os/exec"
 	"strings"

View File

@ -1,7 +1,7 @@
 package winlimit
 import (
-	"cursor-api-proxy/internal/env"
+	"cursor-api-proxy/pkg/infrastructure/env"
 	"runtime"
 )

View File

@ -2,7 +2,7 @@ package cursor
 import (
 	"context"
-	"cursor-api-proxy/internal/apitypes"
+	"cursor-api-proxy/pkg/domain/entity"
 	"cursor-api-proxy/internal/config"
 )
@ -22,6 +22,6 @@ func (p *Provider) Close() error {
 	return nil
 }
-func (p *Provider) Generate(ctx context.Context, model string, messages []apitypes.Message, tools []apitypes.Tool, cb func(apitypes.StreamChunk)) error {
+func (p *Provider) Generate(ctx context.Context, model string, messages []entity.Message, tools []entity.Tool, cb func(entity.StreamChunk)) error {
 	return nil
 }

View File

@ -2,8 +2,8 @@ package geminiweb
 import (
 	"context"
-	"cursor-api-proxy/internal/apitypes"
 	"cursor-api-proxy/internal/config"
+	"cursor-api-proxy/pkg/domain/entity"
 	"fmt"
 	"os"
 	"path/filepath"
@ -114,7 +114,7 @@ func (p *PlaywrightProvider) launchIfNeeded() error {
 }
 // Generate generates a response
-func (p *PlaywrightProvider) Generate(ctx context.Context, model string, messages []apitypes.Message, tools []apitypes.Tool, cb func(apitypes.StreamChunk)) (err error) {
+func (p *PlaywrightProvider) Generate(ctx context.Context, model string, messages []entity.Message, tools []entity.Tool, cb func(entity.StreamChunk)) (err error) {
 	// Ensure diagnostics are saved when an error is returned
 	defer func() {
 		if err != nil {
@ -182,7 +182,7 @@ func (p *PlaywrightProvider) Generate(ctx context.Context, model string, message
 			fmt.Println("Browser is open. You can:")
 			fmt.Println("1. Log in to Gemini now")
 			fmt.Println("2. Continue without login")
-			fmt.Println("========================================\n")
+			fmt.Println("========================================")
 		}
 	} else {
 		fmt.Println("[GeminiWeb] ✓ Logged in")
@ -216,8 +216,8 @@ func (p *PlaywrightProvider) Generate(ctx context.Context, model string, message
 	}
 	// 9. Callback
-	cb(apitypes.StreamChunk{Type: apitypes.ChunkText, Text: response})
-	cb(apitypes.StreamChunk{Type: apitypes.ChunkDone, Done: true})
+	cb(entity.StreamChunk{Type: entity.ChunkText, Text: response})
+	cb(entity.StreamChunk{Type: entity.ChunkDone, Done: true})
 	fmt.Printf("[GeminiWeb] Response complete (%d chars)\n", len(response))
 	return nil
@ -624,18 +624,39 @@ func (p *PlaywrightProvider) selectModel(model string) error {
 	return nil
 }
-// buildPromptFromMessages builds a prompt from the message list
-func buildPromptFromMessagesPlaywright(messages []apitypes.Message) string {
+// buildPromptFromMessagesPlaywright builds a prompt from the message list
+func buildPromptFromMessagesPlaywright(messages []entity.Message) string {
 	var prompt string
 	for _, m := range messages {
+		content := messageContentToStringPlaywright(m.Content)
 		switch m.Role {
 		case "system":
-			prompt += "System: " + m.Content + "\n\n"
+			prompt += "System: " + content + "\n\n"
 		case "user":
-			prompt += m.Content + "\n\n"
+			prompt += content + "\n\n"
 		case "assistant":
-			prompt += "Assistant: " + m.Content + "\n\n"
+			prompt += "Assistant: " + content + "\n\n"
 		}
 	}
 	return prompt
 }
+// messageContentToStringPlaywright converts Message.Content to string
+func messageContentToStringPlaywright(content interface{}) string {
+	switch v := content.(type) {
+	case string:
+		return v
+	case []interface{}:
+		var result string
+		for _, item := range v {
+			if m, ok := item.(map[string]interface{}); ok {
+				if text, ok := m["text"].(string); ok {
+					result += text
+				}
+			}
+		}
+		return result
+	default:
+		return ""
+	}
+}

View File

@@ -2,8 +2,8 @@ package geminiweb
 import (
 	"context"
-	"cursor-api-proxy/internal/apitypes"
 	"cursor-api-proxy/internal/config"
+	"cursor-api-proxy/pkg/domain/entity"
 	"fmt"
 	"os"
 	"path/filepath"
@@ -54,7 +54,7 @@ func (p *Provider) getSessionDir() string {
 }
 // Generate generates a response
-func (p *Provider) Generate(ctx context.Context, model string, messages []apitypes.Message, tools []apitypes.Tool, cb func(apitypes.StreamChunk)) error {
+func (p *Provider) Generate(ctx context.Context, model string, messages []entity.Message, tools []entity.Tool, cb func(entity.StreamChunk)) error {
 	fmt.Printf("[GeminiWeb] Starting generation with model: %s\n", model)
 	// 1. get the browser manager
@@ -97,7 +97,7 @@ func (p *Provider) Generate(ctx context.Context, model string, messages []apityp
 	fmt.Println("Browser is open. You can:")
 	fmt.Println("1. Log in to Gemini now")
 	fmt.Println("2. Continue without login")
-	fmt.Println("========================================\n")
+	fmt.Println("========================================")
 }
 } else {
 	fmt.Printf("[GeminiWeb] Logged in\n")
@@ -131,29 +131,51 @@ func (p *Provider) Generate(ctx context.Context, model string, messages []apityp
 }
 	// 11. streaming callback
-	cb(apitypes.StreamChunk{Type: apitypes.ChunkText, Text: response})
-	cb(apitypes.StreamChunk{Type: apitypes.ChunkDone, Done: true})
+	cb(entity.StreamChunk{Type: entity.ChunkText, Text: response})
+	cb(entity.StreamChunk{Type: entity.ChunkDone, Done: true})
 	fmt.Printf("[GeminiWeb] Response complete (%d chars)\n", len(response))
 	return nil
 }
 // buildPromptFromMessages builds a prompt from the message list
-func buildPromptFromMessages(messages []apitypes.Message) string {
+func buildPromptFromMessages(messages []entity.Message) string {
 	var prompt string
 	for _, m := range messages {
+		content := messageContentToString(m.Content)
 		switch m.Role {
 		case "system":
-			prompt += "System: " + m.Content + "\n\n"
+			prompt += "System: " + content + "\n\n"
 		case "user":
-			prompt += m.Content + "\n\n"
+			prompt += content + "\n\n"
 		case "assistant":
-			prompt += "Assistant: " + m.Content + "\n\n"
+			prompt += "Assistant: " + content + "\n\n"
 		}
 	}
 	return prompt
 }
+
+// messageContentToString converts Message.Content to string
+func messageContentToString(content interface{}) string {
+	switch v := content.(type) {
+	case string:
+		return v
+	case []interface{}:
+		// Handle array content (multimodal)
+		var result string
+		for _, item := range v {
+			if m, ok := item.(map[string]interface{}); ok {
+				if text, ok := m["text"].(string); ok {
+					result += text
+				}
+			}
+		}
+		return result
+	default:
+		return ""
+	}
+}
 // RunLogin runs the login flow (used by the gemini-login command)
 func RunLogin(cfg config.BridgeConfig, sessionName string) error {
 	if sessionName == "" {

View File
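The `messageContentToString` helper added in the hunk above flattens an OpenAI-style `Content` field, which may be either a plain string or an array of content parts, into plain text so it can be concatenated into a prompt. A minimal standalone sketch of the same technique (the function body mirrors the diff; the demo inputs are illustrative):

```go
package main

import "fmt"

// messageContentToString flattens a message content value: a plain string is
// returned as-is, an array of {"type": "text", "text": ...} parts is joined,
// and anything else yields the empty string.
func messageContentToString(content interface{}) string {
	switch v := content.(type) {
	case string:
		return v
	case []interface{}:
		var result string
		for _, item := range v {
			if m, ok := item.(map[string]interface{}); ok {
				if text, ok := m["text"].(string); ok {
					result += text
				}
			}
		}
		return result
	default:
		return ""
	}
}

func main() {
	fmt.Println(messageContentToString("plain text"))
	parts := []interface{}{
		map[string]interface{}{"type": "text", "text": "hello "},
		map[string]interface{}{"type": "image_url", "url": "ignored"},
		map[string]interface{}{"type": "text", "text": "world"},
	}
	fmt.Println(messageContentToString(parts)) // non-text parts are skipped
}
```

This is the shape you get when `Content` is declared as `interface{}` and filled by raw JSON parsing, as the "use raw JSON parsing for handlers" commit describes.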

@@ -1,32 +1,22 @@
-package pool
+package repository
 import (
 	"sync"
 	"time"
+	"cursor-api-proxy/pkg/domain/entity"
 )
 type accountStatus struct {
 	configDir       string
 	activeRequests  int
 	lastUsed        int64
 	rateLimitUntil  int64
 	totalRequests   int
 	totalSuccess    int
 	totalErrors     int
 	totalRateLimits int
 	totalLatencyMs  int64
-}
-type AccountStat struct {
-	ConfigDir       string
-	ActiveRequests  int
-	TotalRequests   int
-	TotalSuccess    int
-	TotalErrors     int
-	TotalRateLimits int
-	TotalLatencyMs  int64
-	IsRateLimited   bool
-	RateLimitUntil  int64
 }
 type AccountPool struct {
@@ -155,13 +145,13 @@ func (p *AccountPool) ReportRateLimit(configDir string, penaltyMs int64) {
 }
 }
-func (p *AccountPool) GetStats() []AccountStat {
+func (p *AccountPool) GetStats() []entity.AccountStat {
 	p.mu.Lock()
 	defer p.mu.Unlock()
 	now := time.Now().UnixMilli()
-	stats := make([]AccountStat, len(p.accounts))
+	stats := make([]entity.AccountStat, len(p.accounts))
 	for i, a := range p.accounts {
-		stats[i] = AccountStat{
+		stats[i] = entity.AccountStat{
 			ConfigDir:      a.configDir,
 			ActiveRequests: a.activeRequests,
 			TotalRequests:  a.totalRequests,
@@ -180,7 +170,6 @@ func (p *AccountPool) Count() int {
 	return len(p.accounts)
 }
 // ─── PoolHandle interface ──────────────────────────────────────────────────
 // PoolHandle lets a handler inject an independent pool instance, so multi-port mode does not share the global pool.
@@ -191,19 +180,19 @@ type PoolHandle interface {
 	ReportRequestSuccess(configDir string, latencyMs int64)
 	ReportRequestError(configDir string, latencyMs int64)
 	ReportRateLimit(configDir string, penaltyMs int64)
-	GetStats() []AccountStat
+	GetStats() []entity.AccountStat
 }
 // GlobalPoolHandle wraps the global functions to implement the PoolHandle interface (used in single-port mode)
 type GlobalPoolHandle struct{}
 func (GlobalPoolHandle) GetNextConfigDir() string               { return GetNextAccountConfigDir() }
 func (GlobalPoolHandle) ReportRequestStart(d string)            { ReportRequestStart(d) }
 func (GlobalPoolHandle) ReportRequestEnd(d string)              { ReportRequestEnd(d) }
 func (GlobalPoolHandle) ReportRequestSuccess(d string, l int64) { ReportRequestSuccess(d, l) }
 func (GlobalPoolHandle) ReportRequestError(d string, l int64)   { ReportRequestError(d, l) }
 func (GlobalPoolHandle) ReportRateLimit(d string, p int64)      { ReportRateLimit(d, p) }
-func (GlobalPoolHandle) GetStats() []AccountStat { return GetAccountStats() }
+func (GlobalPoolHandle) GetStats() []entity.AccountStat { return GetAccountStats() }
 // ─── Global pool ───────────────────────────────────────────────────────────
@@ -273,7 +262,7 @@ func ReportRateLimit(configDir string, penaltyMs int64) {
 }
 }
-func GetAccountStats() []AccountStat {
+func GetAccountStats() []entity.AccountStat {
 	globalMu.Lock()
 	p := globalPool
 	globalMu.Unlock()

View File
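The pool diff above moves `AccountStat` into the `entity` package and keeps `PoolHandle` as the seam between handlers and pools: handlers depend only on the interface, so each port in multi-port mode can get its own pool while single-port mode keeps using `GlobalPoolHandle`. A hedged sketch of that pattern, with a simplified interface and a toy round-robin implementation (the real `AccountPool` and its method set are richer than shown here):

```go
package main

import (
	"fmt"
	"sync"
)

// AccountStat is a simplified stand-in for entity.AccountStat.
type AccountStat struct {
	ConfigDir     string
	TotalRequests int
}

// PoolHandle is the injection point handlers depend on; a multi-port setup
// can hand each port its own implementation instead of the global pool.
type PoolHandle interface {
	GetNextConfigDir() string
	ReportRequestStart(configDir string)
	GetStats() []AccountStat
}

// localPool is a minimal, mutex-guarded round-robin PoolHandle.
type localPool struct {
	mu       sync.Mutex
	next     int
	accounts []*AccountStat
}

func (p *localPool) GetNextConfigDir() string {
	p.mu.Lock()
	defer p.mu.Unlock()
	a := p.accounts[p.next%len(p.accounts)]
	p.next++
	return a.ConfigDir
}

func (p *localPool) ReportRequestStart(configDir string) {
	p.mu.Lock()
	defer p.mu.Unlock()
	for _, a := range p.accounts {
		if a.ConfigDir == configDir {
			a.TotalRequests++
		}
	}
}

func (p *localPool) GetStats() []AccountStat {
	p.mu.Lock()
	defer p.mu.Unlock()
	out := make([]AccountStat, len(p.accounts))
	for i, a := range p.accounts {
		out[i] = *a // copy so callers cannot mutate pool state
	}
	return out
}

func main() {
	var h PoolHandle = &localPool{accounts: []*AccountStat{
		{ConfigDir: "/accounts/a"}, {ConfigDir: "/accounts/b"},
	}}
	for i := 0; i < 3; i++ {
		dir := h.GetNextConfigDir() // round-robin: a, b, a
		h.ReportRequestStart(dir)
		fmt.Println(dir)
	}
}
```

The design choice is the usual one for clean-architecture refactors like this: the interface lives next to its consumers, and the global-function wrapper (`GlobalPoolHandle` in the diff) preserves the old single-pool behaviour during the migration.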

@@ -1,4 +1,4 @@
-package pool
+package repository
 import (
 	"testing"

View File

@@ -1,4 +1,4 @@
-package agent
+package usecase
 import "cursor-api-proxy/internal/config"

View File

@@ -1,4 +1,4 @@
-package agent
+package usecase
 import (
 	"encoding/json"

View File

@@ -1,9 +1,9 @@
-package agent
+package usecase
 import (
 	"context"
 	"cursor-api-proxy/internal/config"
-	"cursor-api-proxy/internal/process"
+	"cursor-api-proxy/pkg/infrastructure/process"
 	"os"
 	"path/filepath"
 )

View File

@@ -1,4 +1,4 @@
-package sanitize
+package usecase
 import "regexp"

View File

@@ -1,4 +1,4 @@
-package sanitize
+package usecase
 import (
 	"strings"

View File

@@ -1,4 +1,4 @@
-package agent
+package usecase
 import (
 	"os"

View File

@@ -1,4 +1,4 @@
-package toolcall
+package usecase
 import (
 	"encoding/json"

View File

@@ -1,7 +1,7 @@
 package main
 import (
-	"cursor-api-proxy/internal/providers/geminiweb"
+	"cursor-api-proxy/pkg/provider/geminiweb"
 	"fmt"
 	"os"
@@ -92,7 +92,7 @@ func analyzeDOM(page *rod.Page) {
 	ariaLabel, _ := el.Attribute("aria-label")
 	placeholder, _ := el.Attribute("placeholder")
 	fmt.Printf("  [%d] tag=%s class=%s aria-label=%s placeholder=%s\n",
-		i, tag, class, ariaLabel, placeholder)
+		i, tag, ptrToStr(class), ptrToStr(ariaLabel), ptrToStr(placeholder))
 }
 }
 }
@@ -120,7 +120,7 @@ func analyzeDOM(page *rod.Page) {
 	text, _ := el.Text()
 	text = truncate(text, 30)
 	fmt.Printf("  [%d] tag=%s class=%s aria-label=%s text=%s\n",
-		i, tag, class, ariaLabel, text)
+		i, tag, ptrToStr(class), ptrToStr(ariaLabel), text)
 }
 }
 }
@@ -145,7 +145,7 @@ func analyzeDOM(page *rod.Page) {
 	ariaLabel, _ := el.Attribute("aria-label")
 	text, _ := el.Text()
 	fmt.Printf("  [%d] tag=%s class=%s aria-label=%s text=%s\n",
-		i, tag, class, ariaLabel, truncate(text, 30))
+		i, tag, ptrToStr(class), ptrToStr(ariaLabel), truncate(text, 30))
 }
 }
 }
@@ -157,3 +157,10 @@ func truncate(s string, max int) string {
 }
 	return s[:max] + "..."
 }
+
+func ptrToStr(s *string) string {
+	if s == nil {
+		return "<nil>"
+	}
+	return *s
+}
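The `ptrToStr` helper added above exists because go-rod's `Element.Attribute` returns a `*string` that is nil when the attribute is absent, and passing a nil pointer straight to `%s` prints an address rather than the value. A minimal standalone sketch of the nil-safe dereference (the go-rod call itself is omitted; only the helper is exercised):

```go
package main

import "fmt"

// ptrToStr dereferences a *string safely, mapping nil to a "<nil>" placeholder
// so optional DOM attributes print cleanly.
func ptrToStr(s *string) string {
	if s == nil {
		return "<nil>"
	}
	return *s
}

func main() {
	label := "chat-input"
	fmt.Println(ptrToStr(&label)) // chat-input
	fmt.Println(ptrToStr(nil))    // <nil>
}
```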