
cursor-adapter SSE Diagnostic Results

Status: RESOLVED 2026-04-18. This file is kept for history. The bugs listed below have been fixed. Current behavior is captured by regression tests in internal/server/messages_test.go and internal/converter/convert_test.go.


Originally reported (2026-04-15)

When used as an OpenAI-compatible endpoint for SDKs like @ai-sdk/openai-compatible (OpenCode), cursor-adapter had five issues:

  1. Non-streaming was completely broken — the server hung for 30 s and never wrote a response body.
  2. SSE content was double-JSON-encoded — each delta.content field held the entire Cursor CLI JSON line (including type:"system", type:"user", type:"result") serialized as a string, instead of plain assistant text.
  3. Missing role in first delta.
  4. Missing finish_reason in final chunk.
  5. Usage not at the top level — embedded inside a stringified JSON payload instead of chunk.usage.
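The double-encoding failure (issue 2) comes down to forwarding raw CLI lines instead of extracting assistant text. A minimal sketch of the correct extraction, assuming Cursor CLI emits one JSON object per line with a `type` field — the `cliEvent` struct and `Text` field below are illustrative, not the adapter's real types:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// cliEvent mirrors the rough shape of one Cursor CLI output line.
// Field names are assumptions for illustration only.
type cliEvent struct {
	Type string `json:"type"`
	Text string `json:"text"`
}

// assistantText returns the plain assistant text for one CLI line, or ""
// for non-assistant lines (type:"system", type:"user", type:"result"),
// which must never reach delta.content.
func assistantText(line []byte) string {
	var ev cliEvent
	if err := json.Unmarshal(line, &ev); err != nil {
		return ""
	}
	if ev.Type != "assistant" {
		return ""
	}
	return ev.Text
}

func main() {
	fmt.Printf("%q\n", assistantText([]byte(`{"type":"assistant","text":"hello"}`)))
	fmt.Printf("%q\n", assistantText([]byte(`{"type":"system","text":"boot"}`)))
}
```

The broken build skipped this step entirely and serialized the whole line (wrapper JSON included) back into `delta.content`.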

Root cause

Two separate bugs plus one latent one, all landed together:

  • Non-stream hang / exit status 1: the chat-only isolation ported from cursor-api-proxy was overriding HOME → temp dir. On macOS with keychain login, the agent CLI resolves its session token via ~/.cursor/ + the real keychain, so a fake HOME made agent exit immediately with "Authentication required. Please run 'agent login'". The adapter surfaced this as either a hang (when timeouts swallowed the exit) or as exit status 1 once the error bubbled up.
  • Content wrapping / leaked system-user chunks: older pre-parser code forwarded raw Cursor JSON lines as delta.content. The parser had already been rewritten by the time this diagnostic was taken, but the report caught an earlier build.
  • Duplicate final delta (discovered during this pass): the stream parser's accumulator was reassigned (p.accumulated = content) even when the new fragment did not start with the accumulated prefix. With Cursor CLI's incremental output mode (one fragment per message), that meant the "you said the full text" final message looked different from accumulated and was emitted as a second copy of the whole response.
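The accumulator bug and its fix can be sketched in Go. `parser` and `delta` are hypothetical names and this is a simplification of the behavior described above, not the real convert.go code — but it shows why appending (rather than reassigning) in the non-prefix branch makes the final duplicate detectable:

```go
package main

import (
	"fmt"
	"strings"
)

// parser accumulates streamed assistant text across fragments.
type parser struct {
	accumulated string
}

// delta returns the new text to emit for one fragment, handling both
// Cursor output modes:
//   - cumulative: each fragment repeats everything so far, so emit only
//     the suffix past the accumulated prefix;
//   - incremental: each fragment is new text, so append it. (The old bug
//     reassigned accumulated here, so the CLI's final "full text" recap
//     no longer matched accumulated and was emitted as a second copy.)
func (p *parser) delta(content string) string {
	switch {
	case content != "" && content == p.accumulated:
		return "" // final recap equals everything seen: skip it
	case strings.HasPrefix(content, p.accumulated):
		d := content[len(p.accumulated):] // cumulative mode
		p.accumulated = content
		return d
	default:
		p.accumulated += content // incremental mode: append, don't replace
		return content
	}
}

func main() {
	p := &parser{}
	fmt.Println(p.delta("1、2、"))                // incremental fragment
	fmt.Println(p.delta("3、4、5。"))              // incremental fragment
	fmt.Printf("%q\n", p.delta("1、2、3、4、5。")) // final recap, skipped
}
```

With reassignment instead of `+=`, the third call would not match `accumulated` and the whole response would be emitted twice.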

Fix summary

  • internal/workspace/workspace.go — only override CURSOR_CONFIG_DIR by default. HOME/XDG_CONFIG_HOME/APPDATA are only isolated when CURSOR_API_KEY is set (which bypasses keychain auth anyway).
  • internal/converter/convert.go — stream parser now handles both cumulative and incremental Cursor output modes. In the non-prefix branch it appends to accumulated instead of replacing it, so the final duplicate is correctly detected via content == accumulated and skipped.
  • internal/server/handlers.go + anthropic_handlers.go — already emit role:"assistant" in the first delta, finish_reason:"stop" in the final chunk, and usage at the top level. Regression tests added to messages_test.go lock this in.
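The isolation rule from the first fix can be sketched as follows. `cliEnv` is a hypothetical helper, not the real workspace.go API; it only illustrates the policy that `CURSOR_CONFIG_DIR` is always redirected while `HOME`/`XDG_CONFIG_HOME`/`APPDATA` are isolated only when `CURSOR_API_KEY` is set:

```go
package main

import "fmt"

// cliEnv builds the environment for spawning the Cursor agent CLI.
// CURSOR_CONFIG_DIR is always redirected to the workspace temp dir.
// HOME and friends are only isolated when an API key is provided,
// because keychain-backed auth on macOS needs the real HOME to
// resolve ~/.cursor/ and the login keychain.
func cliEnv(base []string, tmpDir, apiKey string) []string {
	env := append([]string(nil), base...)
	env = append(env, "CURSOR_CONFIG_DIR="+tmpDir)
	if apiKey != "" { // key auth bypasses the keychain, so isolation is safe
		env = append(env,
			"HOME="+tmpDir,
			"XDG_CONFIG_HOME="+tmpDir,
			"APPDATA="+tmpDir,
			"CURSOR_API_KEY="+apiKey,
		)
	}
	return env
}

func main() {
	fmt.Println(cliEnv(nil, "/tmp/ws", ""))          // keychain auth: HOME untouched
	fmt.Println(cliEnv(nil, "/tmp/ws", "placeholder")) // key auth: fully isolated
}
```

The pre-fix behavior unconditionally took the `apiKey != ""` branch, which is what broke keychain-authenticated sessions.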

Verified manually

$ curl -sN http://localhost:8765/v1/chat/completions \
    -H 'Content-Type: application/json' \
    -d '{"model":"auto","stream":true,"messages":[{"role":"user","content":"count 1 to 5"}]}'

data: {"id":"chatcmpl-…","choices":[{"index":0,"delta":{"role":"assistant"},"finish_reason":null}]}
data: {"id":"chatcmpl-…","choices":[{"index":0,"delta":{"content":"\n1、2、3、4、5。"},"finish_reason":null}]}
data: {"id":"chatcmpl-…","choices":[{"index":0,"delta":{},"finish_reason":"stop"}],"usage":{"prompt_tokens":6,"completion_tokens":5,"total_tokens":11}}
data: [DONE]

Non-streaming returns chat.completion JSON with finish_reason:"stop" and usage populated. Anthropic /v1/messages emits message_start → content_block_delta* → message_stop without duplicating the final cumulative fragment.