# OpenCode Runtime Integration — Summary

> Integration of OpenCode CLI as the agent runtime for Aetheel.
> Completed: 2026-02-13

---

## Overview

OpenCode CLI has been integrated as the AI "brain" for Aetheel, replacing the placeholder `smart_handler` with a full agent runtime. The architecture is directly inspired by OpenClaw's `cli-runner.ts` and `cli-backends.ts`, adapted for OpenCode's API and Python.

---

## Files Created & Modified

### New Files

| File | Purpose |
|------|---------|
| `agent/__init__.py` | Package init for the agent module |
| `agent/opencode_runtime.py` | Core runtime — ~750 lines covering both CLI and SDK modes |
| `docs/opencode-setup.md` | Comprehensive setup guide |
| `docs/opencode-integration-summary.md` | This summary document |

### Modified Files

| File | Change |
|------|--------|
| `main.py` | Rewired to use `ai_handler` backed by `OpenCodeRuntime` instead of the placeholder `smart_handler` |
| `.env.example` | Added all OpenCode config variables |
| `requirements.txt` | Added optional `opencode-ai` SDK dependency note |

---

## Architecture

```
Slack Message → ai_handler() → OpenCodeRuntime.chat() → OpenCode → LLM → Response
```
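
The call chain above can be sketched as a thin handler seam. This is a sketch only: in Aetheel the runtime is the `OpenCodeRuntime` from `agent/opencode_runtime.py`, and the handler signature here is illustrative, not the exact one in `main.py`.

```python
# ai_handler() is the seam between the Slack adapter and the agent runtime:
# one chat() call per incoming Slack message. The runtime is duck-typed so
# the flow can be exercised with any object exposing chat().
def make_ai_handler(runtime):
    def ai_handler(text: str, thread_ts: str) -> str:
        # The runtime resolves thread_ts to an OpenCode session internally.
        return runtime.chat(text, thread_ts=thread_ts)
    return ai_handler
```

Wiring it this way keeps the Slack adapter ignorant of which mode (CLI or SDK) actually serves the request.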

### Two Runtime Modes

1. **CLI Mode** (default) — Spawns `opencode run` as a subprocess per request.
   Direct port of OpenClaw's `runCliAgent()` → `runCommandWithTimeout()` pattern
   from `cli-runner.ts`.

2. **SDK Mode** — Connects to `opencode serve` via the official Python SDK
   (`opencode-ai`). Uses `client.session.create()` → `client.session.chat()`
   for lower latency and better session management.

### Component Diagram

```
┌─────────────────────┐
│        Slack        │
│      (messages)     │
└──────┬──────────────┘
       │ WebSocket
       │
┌──────▼──────────────┐
│    Slack Adapter    │
│ (slack_adapter.py)  │
│                     │
│ • Socket Mode       │
│ • Event handling    │
│ • Thread isolation  │
└──────┬──────────────┘
       │ ai_handler()
       │
┌──────▼──────────────┐
│  OpenCode Runtime   │
│ (opencode_runtime)  │
│                     │
│ • Session store     │
│ • System prompt     │
│ • Mode routing      │
└──────┬──────────────┘
       │
   ┌───┴────┐
   │        │
   ▼        ▼
CLI Mode  SDK Mode

┌──────────┐  ┌──────────────┐
│ opencode │  │   opencode   │
│   run    │  │  serve API   │
│ (subproc)│  │  (HTTP/SDK)  │
└──────────┘  └──────────────┘
     │               │
     └───────┬───────┘
             │
      ┌──────▼──────┐
      │     LLM     │
      │ (Anthropic, │
      │  OpenAI,    │
      │  Gemini)    │
      └─────────────┘
```

---

## Key Components (OpenClaw → Aetheel Mapping)

| OpenClaw (`cli-runner.ts`) | Aetheel (`opencode_runtime.py`) |
|---|---|
| `CliBackendConfig` | `OpenCodeConfig` dataclass |
| `runCliAgent()` | `OpenCodeRuntime.chat()` |
| `buildCliArgs()` | `_build_cli_args()` |
| `runCommandWithTimeout()` | `subprocess.run(timeout=...)` |
| `parseCliJson()` / `collectText()` | `_parse_cli_output()` / `_collect_text()` |
| `pickSessionId()` | `_extract_session_id()` |
| `buildSystemPrompt()` | `build_aetheel_system_prompt()` |
| Session per thread | `SessionStore` (thread_ts → session_id) |

---

## Key Design Decisions

### 1. Dual-Mode Runtime (CLI + SDK)

- **CLI mode** is the default because it requires no persistent server — just `opencode` in PATH.
- **SDK mode** is preferred for production because it avoids cold-start latency and provides better session management.
- The runtime gracefully falls back from SDK → CLI if the server is unreachable or the SDK is not installed.
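
The fallback in the last bullet boils down to a small mode-resolution step. This is a sketch under assumed probe inputs: the real runtime checks the `opencode-ai` import and the server's reachability itself rather than receiving booleans.

```python
# Sketch of the SDK → CLI degradation: SDK mode is only honored when both
# preconditions hold; everything else lands on the CLI path.
def resolve_mode(requested: str, sdk_installed: bool, server_reachable: bool) -> str:
    """Return the effective runtime mode for this request."""
    if requested == "sdk" and sdk_installed and server_reachable:
        return "sdk"
    return "cli"  # graceful fallback: missing SDK or unreachable server
```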

### 2. Session Isolation per Thread

- Each Slack thread (`thread_ts`) maps to a unique OpenCode session via the `SessionStore`.
- New threads get new sessions; replies within a thread reuse the same session.
- Stale sessions are cleaned up after `session_ttl_hours` (default 24h).

### 3. System Prompt Injection

- `build_aetheel_system_prompt()` constructs a per-message system prompt with the bot's identity, guidelines, and context (user name, channel, DM vs. mention).
- This mirrors OpenClaw's `buildAgentSystemPrompt()` from `cli-runner/helpers.ts`.
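
The builder's shape is roughly the following; the actual identity and guideline wording live in `agent/opencode_runtime.py`, so the prompt text here is illustrative.

```python
# Per-message system prompt: identity + conversational context.
def build_aetheel_system_prompt(user: str, channel: str, is_dm: bool) -> str:
    context = "a direct message" if is_dm else f"a mention in #{channel}"
    return (
        "You are Aetheel, a helpful Slack assistant.\n"
        f"You are replying to {user} in {context}.\n"
        "Keep replies concise and Slack-formatted."
    )
```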

### 4. Output Parsing (from OpenClaw)

- The `_parse_cli_output()` method tries JSON → JSONL → raw text, matching OpenClaw's `parseCliJson()` and `parseCliJsonl()`.
- The `_collect_text()` method recursively traverses JSON objects to find text content, a direct port of OpenClaw's `collectText()`.
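
The cascade can be sketched as follows. Note the key names (`text`, `content`, `parts`, `message`) are common conventions assumed for illustration, not a confirmed OpenCode output schema.

```python
# JSON → JSONL → raw-text cascade, plus the recursive text collector.
import json

def collect_text(node) -> str:
    """Recursively gather string values from text-like fields."""
    if isinstance(node, str):
        return node
    if isinstance(node, dict):
        return " ".join(collect_text(v) for k, v in node.items()
                        if k in ("text", "content", "parts", "message") and v)
    if isinstance(node, list):
        return " ".join(collect_text(item) for item in node)
    return ""

def parse_cli_output(raw: str) -> str:
    try:  # 1) whole output as a single JSON document
        return collect_text(json.loads(raw)).strip() or raw.strip()
    except json.JSONDecodeError:
        pass
    texts = []  # 2) JSONL: one JSON object per line
    for line in raw.splitlines():
        try:
            texts.append(collect_text(json.loads(line)))
        except json.JSONDecodeError:
            return raw.strip()  # 3) not JSON at all: return raw text
    return " ".join(t for t in texts if t).strip() or raw.strip()
```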

### 5. Built-in Commands Bypass AI

- Commands like `status`, `help`, `time`, and `sessions` are handled directly without calling the AI, for instant responses.
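
The fast path amounts to a set-membership check before dispatching to the runtime (a sketch; the real handlers live in `main.py`):

```python
# Built-in commands short-circuit before any LLM call.
BUILTIN_COMMANDS = {"status", "help", "time", "sessions"}

def is_builtin(message: str) -> bool:
    """True when the message is a built-in command handled without the AI."""
    return message.strip().lower() in BUILTIN_COMMANDS
```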

---

## Configuration Reference

All settings go in `.env`:

```env
# Runtime mode
OPENCODE_MODE=cli                  # "cli" or "sdk"

# Model (optional — uses OpenCode default if not set)
OPENCODE_MODEL=anthropic/claude-sonnet-4-20250514

# CLI mode settings
OPENCODE_COMMAND=opencode          # path to the opencode binary
OPENCODE_TIMEOUT=120               # seconds before timeout

# SDK mode settings (only needed when OPENCODE_MODE=sdk)
OPENCODE_SERVER_URL=http://localhost:4096
OPENCODE_SERVER_PASSWORD=          # optional HTTP basic auth
OPENCODE_SERVER_USERNAME=opencode  # default username

# Workspace directory for OpenCode
OPENCODE_WORKSPACE=/path/to/project

# Output format
OPENCODE_FORMAT=text               # "text" or "json"
```

CLI flags can override config:

```bash
python main.py --cli       # force CLI mode
python main.py --sdk       # force SDK mode
python main.py --model anthropic/claude-sonnet-4-20250514
python main.py --test      # echo-only (no AI)
```

---

## OpenCode Research Summary

### OpenCode CLI

- **What:** Go-based AI coding agent for the terminal
- **Install:** `curl -fsSL https://opencode.ai/install | bash` or `npm install -g opencode-ai`
- **Key commands:**
  - `opencode` — TUI mode
  - `opencode run "prompt"` — non-interactive, returns output
  - `opencode serve` — headless HTTP server (OpenAPI 3.1 spec)
  - `opencode auth login` — configure LLM providers
  - `opencode models` — list available models
  - `opencode init` — generate `AGENTS.md` for a project

### OpenCode Server API (via `opencode serve`)

- Default: `http://localhost:4096`
- Auth: HTTP basic auth via `OPENCODE_SERVER_PASSWORD`
- Key endpoints:
  - `GET /session` — list sessions
  - `POST /session` — create session
  - `POST /session/:id/message` — send message (returns `AssistantMessage`)
  - `POST /session/:id/abort` — abort an in-progress request
  - `GET /event` — SSE event stream
### OpenCode Python SDK (`opencode-ai`)

- Install: `pip install opencode-ai`
- Key methods:
  - `client.session.create()` → `Session`
  - `client.session.chat(id, parts=[...])` → `AssistantMessage`
  - `client.session.list()` → `Session[]`
  - `client.session.abort(id)` → abort
  - `client.app.get()` → app info
  - `client.app.providers()` → available providers
---

## Quick Start

1. Install OpenCode: `curl -fsSL https://opencode.ai/install | bash`
2. Configure a provider: `opencode auth login`
3. Test standalone: `opencode run "Hello, what are you?"`
4. Configure `.env` (copy from `.env.example`)
5. Run Aetheel: `python main.py`
6. In Slack: send a message to the bot and get an AI response

---

## Next Steps

1. **Memory System** — Add conversation persistence (SQLite) so sessions survive restarts
2. **Heartbeat** — Proactive messages via cron/scheduler
3. **Skills** — Loadable skill modules (like OpenClaw's `skills/` directory)
4. **Multi-Channel** — Discord, Telegram adapters
5. **Streaming** — Use SSE events from `opencode serve` for real-time streaming responses