Skills engine v0.1 + multi-channel infrastructure (#307)

* refactor: multi-channel infrastructure with explicit channel/is_group tracking

  - Add channels[] array and findChannel() routing in index.ts, replacing hardcoded whatsapp.* calls with channel-agnostic callbacks
  - Add channel TEXT and is_group INTEGER columns to chats table with COALESCE upsert to protect existing values from null overwrites
  - is_group defaults to 0 (safe: unknown chats excluded from groups)
  - WhatsApp passes explicit channel='whatsapp' and isGroup to onChatMetadata
  - getAvailableGroups filters on is_group instead of JID pattern matching
  - findChannel logs warnings instead of silently dropping unroutable JIDs
  - Migration backfills channel/is_group from JID patterns for existing DBs

* feat: skills engine v0.1 — deterministic skill packages with rerere resolution

  Three-way merge engine for applying skill packages on top of a core codebase. Skills declare which files they add/modify, and the engine uses git merge-file for conflict detection with git rerere for automatic resolution of previously-seen conflicts.

  Key components:
  - apply: three-way merge with backup/rollback safety net
  - replay: clean-slate replay for uninstall and rebase
  - update: core version updates with deletion detection
  - rebase: bake applied skills into base (one-way)
  - manifest: validation with path traversal protection
  - resolution-cache: pre-computed rerere resolutions
  - structured: npm deps, env vars, docker-compose merging
  - CI: per-skill test matrix with conflict detection

  151 unit tests covering merge, rerere, backup, replay, uninstall, update, rebase, structured ops, and edge cases.

* feat: add Discord and Telegram skill packages

  Skill packages for adding Discord and Telegram channels to NanoClaw. Each package includes:
  - Channel implementation (add/src/channels/)
  - Three-way merge targets for index.ts, config.ts, routing.test.ts
  - Intent docs explaining merge invariants
  - Standalone integration tests
  - manifest.yaml with dependency/conflict declarations

  Applied via: npx tsx scripts/apply-skill.ts .claude/skills/add-discord

  These are inert until applied — no runtime impact.

* remove unused docs (skills-system-status, implementation-guide)

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
.claude/skills/add-discord/SKILL.md (new file)
@@ -0,0 +1,213 @@
# Add Discord Channel

This skill adds Discord support to NanoClaw using the skills engine for deterministic code changes, then walks through interactive setup.

## Phase 1: Pre-flight

### Check if already applied

Read `.nanoclaw/state.yaml`. If `discord` is in `applied_skills`, skip to Phase 3 (Setup). The code changes are already in place.

### Ask the user

1. **Mode**: Replace WhatsApp or add alongside it?
   - Replace → will set `DISCORD_ONLY=true`
   - Alongside → both channels active (default)

2. **Do they already have a bot token?** If yes, collect it now. If no, we'll create one in Phase 3.
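The pre-flight check above can be sketched as follows. This is an illustration only, not code from the skill package; the `isSkillApplied` helper is hypothetical, and a real implementation would parse the YAML rather than string-match:

```typescript
// Hypothetical pre-flight check against the contents of .nanoclaw/state.yaml:
// is the skill listed under applied_skills? (String match only for brevity;
// a real check would use a YAML parser.)
function isSkillApplied(stateYaml: string, skill: string): boolean {
  const section = stateYaml.split('applied_skills:')[1] ?? '';
  return new RegExp(`(^|\\s)-\\s*${skill}(\\s|$)`).test(section);
}
```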
## Phase 2: Apply Code Changes

Run the skills engine to apply this skill's code package. The package files are in this directory alongside this SKILL.md.

### Initialize skills system (if needed)

If `.nanoclaw/` directory doesn't exist yet:

```bash
npx tsx scripts/apply-skill.ts --init
```

Or call `initSkillsSystem()` from `skills-engine/migrate.ts`.

### Apply the skill

```bash
npx tsx scripts/apply-skill.ts .claude/skills/add-discord
```

This deterministically:
- Adds `src/channels/discord.ts` (DiscordChannel class implementing Channel interface)
- Adds `src/channels/discord.test.ts` (unit tests with discord.js mock)
- Three-way merges Discord support into `src/index.ts` (multi-channel support, findChannel routing)
- Three-way merges Discord config into `src/config.ts` (DISCORD_BOT_TOKEN, DISCORD_ONLY exports)
- Three-way merges updated routing tests into `src/routing.test.ts`
- Installs the `discord.js` npm dependency
- Updates `.env.example` with `DISCORD_BOT_TOKEN` and `DISCORD_ONLY`
- Records the application in `.nanoclaw/state.yaml`

If the apply reports merge conflicts, read the intent files:
- `modify/src/index.ts.intent.md` — what changed and invariants for index.ts
- `modify/src/config.ts.intent.md` — what changed for config.ts

### Validate code changes

```bash
npm test
npm run build
```

All tests must pass (including the new Discord tests) and the build must be clean before proceeding.
## Phase 3: Setup

### Create Discord Bot (if needed)

If the user doesn't have a bot token, tell them:

> I need you to create a Discord bot:
>
> 1. Go to the [Discord Developer Portal](https://discord.com/developers/applications)
> 2. Click **New Application** and give it a name (e.g., "Andy Assistant")
> 3. Go to the **Bot** tab on the left sidebar
> 4. Click **Reset Token** to generate a new bot token — copy it immediately (you can only see it once)
> 5. Under **Privileged Gateway Intents**, enable:
>    - **Message Content Intent** (required to read message text)
>    - **Server Members Intent** (optional, for member display names)
> 6. Go to **OAuth2** > **URL Generator**:
>    - Scopes: select `bot`
>    - Bot Permissions: select `Send Messages`, `Read Message History`, `View Channels`
>    - Copy the generated URL and open it in your browser to invite the bot to your server

Wait for the user to provide the token.

### Configure environment

Add to `.env`:

```bash
DISCORD_BOT_TOKEN=<their-token>
```

If they chose to replace WhatsApp:

```bash
DISCORD_ONLY=true
```

Sync to container environment:

```bash
cp .env data/env/env
```

The container reads environment from `data/env/env`, not `.env` directly.

### Build and restart

```bash
npm run build
launchctl kickstart -k gui/$(id -u)/com.nanoclaw
```
## Phase 4: Registration

### Get Channel ID

Tell the user:

> To get the channel ID for registration:
>
> 1. In Discord, go to **User Settings** > **Advanced** > Enable **Developer Mode**
> 2. Right-click the text channel you want the bot to respond in
> 3. Click **Copy Channel ID**
>
> The channel ID will be a long number like `1234567890123456`.

Wait for the user to provide the channel ID. NanoClaw addresses it with the `dc:` prefix (e.g. `dc:1234567890123456`).

### Register the channel

Use the IPC register flow or register directly. The channel ID, name, and folder name are needed.

For a main channel (responds to all messages, uses the `main` folder):

```typescript
registerGroup("dc:<channel-id>", {
  name: "<server-name> #<channel-name>",
  folder: "main",
  trigger: `@${ASSISTANT_NAME}`,
  added_at: new Date().toISOString(),
  requiresTrigger: false,
});
```

For additional channels (trigger-only):

```typescript
registerGroup("dc:<channel-id>", {
  name: "<server-name> #<channel-name>",
  folder: "<folder-name>",
  trigger: `@${ASSISTANT_NAME}`,
  added_at: new Date().toISOString(),
  requiresTrigger: true,
});
```
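The `dc:` JID used in registration is a prefix convention over the raw Discord channel ID. A minimal sketch of the mapping (helper names are hypothetical, not part of the skill package):

```typescript
// Hypothetical helpers illustrating the JID convention: NanoClaw addresses
// Discord channels as "dc:<channel-id>", and the channel implementation
// strips the prefix back off before calling the Discord API.
const toJid = (channelId: string): string => `dc:${channelId}`;
const toChannelId = (jid: string): string =>
  jid.startsWith('dc:') ? jid.slice(3) : jid;
```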
## Phase 5: Verify

### Test the connection

Tell the user:

> Send a message in your registered Discord channel:
> - For main channel: Any message works
> - For non-main: @mention the bot in Discord
>
> The bot should respond within a few seconds.

### Check logs if needed

```bash
tail -f logs/nanoclaw.log
```
## Troubleshooting

### Bot not responding

1. Check `DISCORD_BOT_TOKEN` is set in `.env` AND synced to `data/env/env`
2. Check the channel is registered: `sqlite3 store/messages.db "SELECT * FROM registered_groups WHERE jid LIKE 'dc:%'"`
3. For non-main channels: the message must include the trigger pattern (@mention the bot)
4. Check the service is running: `launchctl list | grep nanoclaw`
5. Verify the bot has been invited to the server (check the OAuth2 URL was used)

### Bot only responds to @mentions

This is the default behavior for non-main channels (`requiresTrigger: true`). To change it:
- Update the registered group's `requiresTrigger` to `false`
- Or register the channel as the main channel

### Message Content Intent not enabled

If the bot connects but can't read messages:
1. Go to the [Discord Developer Portal](https://discord.com/developers/applications)
2. Select your application > **Bot** tab
3. Under **Privileged Gateway Intents**, enable **Message Content Intent**
4. Restart NanoClaw

### Getting Channel ID

If you can't copy the channel ID:
- Ensure **Developer Mode** is enabled: User Settings > Advanced > Developer Mode
- Right-click the channel name in the server sidebar > Copy Channel ID
## After Setup

The Discord bot supports:
- Text messages in registered channels
- Attachment descriptions (images, videos, files shown as placeholders)
- Reply context (shows who the user is replying to)
- @mention translation (Discord `<@botId>` → NanoClaw trigger format)
- Message splitting for responses over 2000 characters
- Typing indicators while the agent processes
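The message-splitting behavior listed above (Discord rejects messages over 2000 characters, and the unit tests expect a 3000-character reply to be sent as a 2000-character chunk followed by a 1000-character chunk) can be sketched like this. The `splitMessage` helper name is illustrative, not the actual DiscordChannel internals:

```typescript
// Illustrative sketch of Discord message splitting: break a long response
// into sequential chunks of at most 2000 characters each.
const DISCORD_MAX_LENGTH = 2000;

function splitMessage(text: string, max = DISCORD_MAX_LENGTH): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += max) {
    chunks.push(text.slice(i, i + max));
  }
  // An empty response still produces one (empty) chunk.
  return chunks.length > 0 ? chunks : [''];
}
```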
.claude/skills/add-discord/add/src/channels/discord.test.ts (new file)
@@ -0,0 +1,762 @@
import { describe, it, expect, beforeEach, vi, afterEach } from 'vitest';

// --- Mocks ---

// Mock config
vi.mock('../config.js', () => ({
  ASSISTANT_NAME: 'Andy',
  TRIGGER_PATTERN: /^@Andy\b/i,
}));

// Mock logger
vi.mock('../logger.js', () => ({
  logger: {
    debug: vi.fn(),
    info: vi.fn(),
    warn: vi.fn(),
    error: vi.fn(),
  },
}));

// --- discord.js mock ---

type Handler = (...args: any[]) => any;

const clientRef = vi.hoisted(() => ({ current: null as any }));

vi.mock('discord.js', () => {
  const Events = {
    MessageCreate: 'messageCreate',
    ClientReady: 'ready',
    Error: 'error',
  };

  const GatewayIntentBits = {
    Guilds: 1,
    GuildMessages: 2,
    MessageContent: 4,
    DirectMessages: 8,
  };

  class MockClient {
    eventHandlers = new Map<string, Handler[]>();
    user: any = { id: '999888777', tag: 'Andy#1234' };
    private _ready = false;

    constructor(_opts: any) {
      clientRef.current = this;
    }

    on(event: string, handler: Handler) {
      const existing = this.eventHandlers.get(event) || [];
      existing.push(handler);
      this.eventHandlers.set(event, existing);
      return this;
    }

    once(event: string, handler: Handler) {
      return this.on(event, handler);
    }

    async login(_token: string) {
      this._ready = true;
      // Fire the ready event
      const readyHandlers = this.eventHandlers.get('ready') || [];
      for (const h of readyHandlers) {
        h({ user: this.user });
      }
    }

    isReady() {
      return this._ready;
    }

    channels = {
      fetch: vi.fn().mockResolvedValue({
        send: vi.fn().mockResolvedValue(undefined),
        sendTyping: vi.fn().mockResolvedValue(undefined),
      }),
    };

    destroy() {
      this._ready = false;
    }
  }

  // Mock TextChannel type
  class TextChannel {}

  return {
    Client: MockClient,
    Events,
    GatewayIntentBits,
    TextChannel,
  };
});

import { DiscordChannel, DiscordChannelOpts } from './discord.js';

// --- Test helpers ---

function createTestOpts(
  overrides?: Partial<DiscordChannelOpts>,
): DiscordChannelOpts {
  return {
    onMessage: vi.fn(),
    onChatMetadata: vi.fn(),
    registeredGroups: vi.fn(() => ({
      'dc:1234567890123456': {
        name: 'Test Server #general',
        folder: 'test-server',
        trigger: '@Andy',
        added_at: '2024-01-01T00:00:00.000Z',
      },
    })),
    ...overrides,
  };
}

function createMessage(overrides: {
  channelId?: string;
  content?: string;
  authorId?: string;
  authorUsername?: string;
  authorDisplayName?: string;
  memberDisplayName?: string;
  isBot?: boolean;
  guildName?: string;
  channelName?: string;
  messageId?: string;
  createdAt?: Date;
  attachments?: Map<string, any>;
  reference?: { messageId?: string };
  mentionsBotId?: boolean;
}) {
  const channelId = overrides.channelId ?? '1234567890123456';
  const authorId = overrides.authorId ?? '55512345';
  const botId = '999888777'; // matches mock client user id

  const mentionsMap = new Map();
  if (overrides.mentionsBotId) {
    mentionsMap.set(botId, { id: botId });
  }

  return {
    channelId,
    id: overrides.messageId ?? 'msg_001',
    content: overrides.content ?? 'Hello everyone',
    createdAt: overrides.createdAt ?? new Date('2024-01-01T00:00:00.000Z'),
    author: {
      id: authorId,
      username: overrides.authorUsername ?? 'alice',
      displayName: overrides.authorDisplayName ?? 'Alice',
      bot: overrides.isBot ?? false,
    },
    member: overrides.memberDisplayName
      ? { displayName: overrides.memberDisplayName }
      : null,
    guild: overrides.guildName
      ? { name: overrides.guildName }
      : null,
    channel: {
      name: overrides.channelName ?? 'general',
      messages: {
        fetch: vi.fn().mockResolvedValue({
          author: { username: 'Bob', displayName: 'Bob' },
          member: { displayName: 'Bob' },
        }),
      },
    },
    mentions: {
      users: mentionsMap,
    },
    attachments: overrides.attachments ?? new Map(),
    reference: overrides.reference ?? null,
  };
}

function currentClient() {
  return clientRef.current;
}

async function triggerMessage(message: any) {
  const handlers = currentClient().eventHandlers.get('messageCreate') || [];
  for (const h of handlers) await h(message);
}

// --- Tests ---
describe('DiscordChannel', () => {
  beforeEach(() => {
    vi.clearAllMocks();
  });

  afterEach(() => {
    vi.restoreAllMocks();
  });

  // --- Connection lifecycle ---

  describe('connection lifecycle', () => {
    it('resolves connect() when client is ready', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);

      await channel.connect();

      expect(channel.isConnected()).toBe(true);
    });

    it('registers message handlers on connect', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);

      await channel.connect();

      expect(currentClient().eventHandlers.has('messageCreate')).toBe(true);
      expect(currentClient().eventHandlers.has('error')).toBe(true);
      expect(currentClient().eventHandlers.has('ready')).toBe(true);
    });

    it('disconnects cleanly', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);

      await channel.connect();
      expect(channel.isConnected()).toBe(true);

      await channel.disconnect();
      expect(channel.isConnected()).toBe(false);
    });

    it('isConnected() returns false before connect', () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);

      expect(channel.isConnected()).toBe(false);
    });
  });
  // --- Text message handling ---

  describe('text message handling', () => {
    it('delivers message for registered channel', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const msg = createMessage({
        content: 'Hello everyone',
        guildName: 'Test Server',
        channelName: 'general',
      });
      await triggerMessage(msg);

      expect(opts.onChatMetadata).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.any(String),
        'Test Server #general',
      );
      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({
          id: 'msg_001',
          chat_jid: 'dc:1234567890123456',
          sender: '55512345',
          sender_name: 'Alice',
          content: 'Hello everyone',
          is_from_me: false,
        }),
      );
    });

    it('only emits metadata for unregistered channels', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const msg = createMessage({
        channelId: '9999999999999999',
        content: 'Unknown channel',
        guildName: 'Other Server',
      });
      await triggerMessage(msg);

      expect(opts.onChatMetadata).toHaveBeenCalledWith(
        'dc:9999999999999999',
        expect.any(String),
        expect.any(String),
      );
      expect(opts.onMessage).not.toHaveBeenCalled();
    });

    it('ignores bot messages', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const msg = createMessage({ isBot: true, content: 'I am a bot' });
      await triggerMessage(msg);

      expect(opts.onMessage).not.toHaveBeenCalled();
      expect(opts.onChatMetadata).not.toHaveBeenCalled();
    });

    it('uses member displayName when available (server nickname)', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const msg = createMessage({
        content: 'Hi',
        memberDisplayName: 'Alice Nickname',
        authorDisplayName: 'Alice Global',
        guildName: 'Server',
      });
      await triggerMessage(msg);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({ sender_name: 'Alice Nickname' }),
      );
    });

    it('falls back to author displayName when no member', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const msg = createMessage({
        content: 'Hi',
        memberDisplayName: undefined,
        authorDisplayName: 'Alice Global',
        guildName: 'Server',
      });
      await triggerMessage(msg);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({ sender_name: 'Alice Global' }),
      );
    });

    it('uses sender name for DM chats (no guild)', async () => {
      const opts = createTestOpts({
        registeredGroups: vi.fn(() => ({
          'dc:1234567890123456': {
            name: 'DM',
            folder: 'dm',
            trigger: '@Andy',
            added_at: '2024-01-01T00:00:00.000Z',
          },
        })),
      });
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const msg = createMessage({
        content: 'Hello',
        guildName: undefined,
        authorDisplayName: 'Alice',
      });
      await triggerMessage(msg);

      expect(opts.onChatMetadata).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.any(String),
        'Alice',
      );
    });

    it('uses guild name + channel name for server messages', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const msg = createMessage({
        content: 'Hello',
        guildName: 'My Server',
        channelName: 'bot-chat',
      });
      await triggerMessage(msg);

      expect(opts.onChatMetadata).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.any(String),
        'My Server #bot-chat',
      );
    });
  });
  // --- @mention translation ---

  describe('@mention translation', () => {
    it('translates <@botId> mention to trigger format', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const msg = createMessage({
        content: '<@999888777> what time is it?',
        mentionsBotId: true,
        guildName: 'Server',
      });
      await triggerMessage(msg);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({
          content: '@Andy what time is it?',
        }),
      );
    });

    it('does not translate if message already matches trigger', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const msg = createMessage({
        content: '@Andy hello <@999888777>',
        mentionsBotId: true,
        guildName: 'Server',
      });
      await triggerMessage(msg);

      // Should NOT prepend @Andy — already starts with trigger
      // But the <@botId> should still be stripped
      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({
          content: '@Andy hello',
        }),
      );
    });

    it('does not translate when bot is not mentioned', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const msg = createMessage({
        content: 'hello everyone',
        guildName: 'Server',
      });
      await triggerMessage(msg);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({
          content: 'hello everyone',
        }),
      );
    });

    it('handles <@!botId> (nickname mention format)', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const msg = createMessage({
        content: '<@!999888777> check this',
        mentionsBotId: true,
        guildName: 'Server',
      });
      await triggerMessage(msg);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({
          content: '@Andy check this',
        }),
      );
    });
  });
  // --- Attachments ---

  describe('attachments', () => {
    it('stores image attachment with placeholder', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const attachments = new Map([
        ['att1', { name: 'photo.png', contentType: 'image/png' }],
      ]);
      const msg = createMessage({
        content: '',
        attachments,
        guildName: 'Server',
      });
      await triggerMessage(msg);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({
          content: '[Image: photo.png]',
        }),
      );
    });

    it('stores video attachment with placeholder', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const attachments = new Map([
        ['att1', { name: 'clip.mp4', contentType: 'video/mp4' }],
      ]);
      const msg = createMessage({
        content: '',
        attachments,
        guildName: 'Server',
      });
      await triggerMessage(msg);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({
          content: '[Video: clip.mp4]',
        }),
      );
    });

    it('stores file attachment with placeholder', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const attachments = new Map([
        ['att1', { name: 'report.pdf', contentType: 'application/pdf' }],
      ]);
      const msg = createMessage({
        content: '',
        attachments,
        guildName: 'Server',
      });
      await triggerMessage(msg);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({
          content: '[File: report.pdf]',
        }),
      );
    });

    it('includes text content with attachments', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const attachments = new Map([
        ['att1', { name: 'photo.jpg', contentType: 'image/jpeg' }],
      ]);
      const msg = createMessage({
        content: 'Check this out',
        attachments,
        guildName: 'Server',
      });
      await triggerMessage(msg);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({
          content: 'Check this out\n[Image: photo.jpg]',
        }),
      );
    });

    it('handles multiple attachments', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const attachments = new Map([
        ['att1', { name: 'a.png', contentType: 'image/png' }],
        ['att2', { name: 'b.txt', contentType: 'text/plain' }],
      ]);
      const msg = createMessage({
        content: '',
        attachments,
        guildName: 'Server',
      });
      await triggerMessage(msg);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({
          content: '[Image: a.png]\n[File: b.txt]',
        }),
      );
    });
  });
  // --- Reply context ---

  describe('reply context', () => {
    it('includes reply author in content', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const msg = createMessage({
        content: 'I agree with that',
        reference: { messageId: 'original_msg_id' },
        guildName: 'Server',
      });
      await triggerMessage(msg);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'dc:1234567890123456',
        expect.objectContaining({
          content: '[Reply to Bob] I agree with that',
        }),
      );
    });
  });
  // --- sendMessage ---

  describe('sendMessage', () => {
    it('sends message via channel', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      await channel.sendMessage('dc:1234567890123456', 'Hello');

      expect(currentClient().channels.fetch).toHaveBeenCalledWith('1234567890123456');
    });

    it('strips dc: prefix from JID', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      await channel.sendMessage('dc:9876543210', 'Test');

      expect(currentClient().channels.fetch).toHaveBeenCalledWith('9876543210');
    });

    it('handles send failure gracefully', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      currentClient().channels.fetch.mockRejectedValueOnce(
        new Error('Channel not found'),
      );

      // Should not throw
      await expect(
        channel.sendMessage('dc:1234567890123456', 'Will fail'),
      ).resolves.toBeUndefined();
    });

    it('does nothing when client is not initialized', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);

      // Don't connect — client is null
      await channel.sendMessage('dc:1234567890123456', 'No client');

      // No error, no API call
    });

    it('splits messages exceeding 2000 characters', async () => {
      const opts = createTestOpts();
      const channel = new DiscordChannel('test-token', opts);
      await channel.connect();

      const mockChannel = {
        send: vi.fn().mockResolvedValue(undefined),
        sendTyping: vi.fn(),
      };
      currentClient().channels.fetch.mockResolvedValue(mockChannel);

      const longText = 'x'.repeat(3000);
      await channel.sendMessage('dc:1234567890123456', longText);

      expect(mockChannel.send).toHaveBeenCalledTimes(2);
      expect(mockChannel.send).toHaveBeenNthCalledWith(1, 'x'.repeat(2000));
      expect(mockChannel.send).toHaveBeenNthCalledWith(2, 'x'.repeat(1000));
    });
  });
  // --- ownsJid ---

  describe('ownsJid', () => {
    it('owns dc: JIDs', () => {
      const channel = new DiscordChannel('test-token', createTestOpts());
      expect(channel.ownsJid('dc:1234567890123456')).toBe(true);
    });

    it('does not own WhatsApp group JIDs', () => {
      const channel = new DiscordChannel('test-token', createTestOpts());
      expect(channel.ownsJid('12345@g.us')).toBe(false);
    });

    it('does not own Telegram JIDs', () => {
      const channel = new DiscordChannel('test-token', createTestOpts());
      expect(channel.ownsJid('tg:123456789')).toBe(false);
    });

    it('does not own unknown JID formats', () => {
      const channel = new DiscordChannel('test-token', createTestOpts());
      expect(channel.ownsJid('random-string')).toBe(false);
    });
  });
// --- setTyping ---
|
||||
|
||||
describe('setTyping', () => {
|
||||
it('sends typing indicator when isTyping is true', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new DiscordChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
const mockChannel = {
|
||||
send: vi.fn(),
|
||||
sendTyping: vi.fn().mockResolvedValue(undefined),
|
||||
};
|
||||
currentClient().channels.fetch.mockResolvedValue(mockChannel);
|
||||
|
||||
await channel.setTyping('dc:1234567890123456', true);
|
||||
|
||||
expect(mockChannel.sendTyping).toHaveBeenCalled();
|
||||
});
|
||||
|
||||
it('does nothing when isTyping is false', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new DiscordChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
await channel.setTyping('dc:1234567890123456', false);
|
||||
|
||||
// channels.fetch should NOT be called
|
||||
expect(currentClient().channels.fetch).not.toHaveBeenCalled();
|
||||
});
|
||||
|
||||
it('does nothing when client is not initialized', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new DiscordChannel('test-token', opts);
|
||||
|
||||
// Don't connect
|
||||
await channel.setTyping('dc:1234567890123456', true);
|
||||
|
||||
// No error
|
||||
});
|
||||
});
|
||||
|
||||
// --- Channel properties ---
|
||||
|
||||
describe('channel properties', () => {
|
||||
it('has name "discord"', () => {
|
||||
const channel = new DiscordChannel('test-token', createTestOpts());
|
||||
expect(channel.name).toBe('discord');
|
||||
});
|
||||
});
|
||||
});
|
||||
236  .claude/skills/add-discord/add/src/channels/discord.ts  Normal file
@@ -0,0 +1,236 @@
import { Client, Events, GatewayIntentBits, Message, TextChannel } from 'discord.js';

import { ASSISTANT_NAME, TRIGGER_PATTERN } from '../config.js';
import { logger } from '../logger.js';
import {
  Channel,
  OnChatMetadata,
  OnInboundMessage,
  RegisteredGroup,
} from '../types.js';

export interface DiscordChannelOpts {
  onMessage: OnInboundMessage;
  onChatMetadata: OnChatMetadata;
  registeredGroups: () => Record<string, RegisteredGroup>;
}

export class DiscordChannel implements Channel {
  name = 'discord';

  private client: Client | null = null;
  private opts: DiscordChannelOpts;
  private botToken: string;

  constructor(botToken: string, opts: DiscordChannelOpts) {
    this.botToken = botToken;
    this.opts = opts;
  }

  async connect(): Promise<void> {
    this.client = new Client({
      intents: [
        GatewayIntentBits.Guilds,
        GatewayIntentBits.GuildMessages,
        GatewayIntentBits.MessageContent,
        GatewayIntentBits.DirectMessages,
      ],
    });

    this.client.on(Events.MessageCreate, async (message: Message) => {
      // Ignore bot messages (including own)
      if (message.author.bot) return;

      const channelId = message.channelId;
      const chatJid = `dc:${channelId}`;
      let content = message.content;
      const timestamp = message.createdAt.toISOString();
      const senderName =
        message.member?.displayName ||
        message.author.displayName ||
        message.author.username;
      const sender = message.author.id;
      const msgId = message.id;

      // Determine chat name
      let chatName: string;
      if (message.guild) {
        const textChannel = message.channel as TextChannel;
        chatName = `${message.guild.name} #${textChannel.name}`;
      } else {
        chatName = senderName;
      }

      // Translate Discord @bot mentions into TRIGGER_PATTERN format.
      // Discord mentions look like <@botUserId> — these won't match
      // TRIGGER_PATTERN (e.g., ^@Andy\b), so we prepend the trigger
      // when the bot is @mentioned.
      if (this.client?.user) {
        const botId = this.client.user.id;
        const isBotMentioned =
          message.mentions.users.has(botId) ||
          content.includes(`<@${botId}>`) ||
          content.includes(`<@!${botId}>`);

        if (isBotMentioned) {
          // Strip the <@botId> mention to avoid visual clutter
          content = content
            .replace(new RegExp(`<@!?${botId}>`, 'g'), '')
            .trim();
          // Prepend trigger if not already present
          if (!TRIGGER_PATTERN.test(content)) {
            content = `@${ASSISTANT_NAME} ${content}`;
          }
        }
      }

      // Handle attachments — store placeholders so the agent knows something was sent
      if (message.attachments.size > 0) {
        const attachmentDescriptions = [...message.attachments.values()].map((att) => {
          const contentType = att.contentType || '';
          if (contentType.startsWith('image/')) {
            return `[Image: ${att.name || 'image'}]`;
          } else if (contentType.startsWith('video/')) {
            return `[Video: ${att.name || 'video'}]`;
          } else if (contentType.startsWith('audio/')) {
            return `[Audio: ${att.name || 'audio'}]`;
          } else {
            return `[File: ${att.name || 'file'}]`;
          }
        });
        if (content) {
          content = `${content}\n${attachmentDescriptions.join('\n')}`;
        } else {
          content = attachmentDescriptions.join('\n');
        }
      }

      // Handle reply context — include who the user is replying to
      if (message.reference?.messageId) {
        try {
          const repliedTo = await message.channel.messages.fetch(
            message.reference.messageId,
          );
          const replyAuthor =
            repliedTo.member?.displayName ||
            repliedTo.author.displayName ||
            repliedTo.author.username;
          content = `[Reply to ${replyAuthor}] ${content}`;
        } catch {
          // Referenced message may have been deleted
        }
      }

      // Store chat metadata for discovery
      this.opts.onChatMetadata(chatJid, timestamp, chatName);

      // Only deliver full message for registered groups
      const group = this.opts.registeredGroups()[chatJid];
      if (!group) {
        logger.debug(
          { chatJid, chatName },
          'Message from unregistered Discord channel',
        );
        return;
      }

      // Deliver message — startMessageLoop() will pick it up
      this.opts.onMessage(chatJid, {
        id: msgId,
        chat_jid: chatJid,
        sender,
        sender_name: senderName,
        content,
        timestamp,
        is_from_me: false,
      });

      logger.info(
        { chatJid, chatName, sender: senderName },
        'Discord message stored',
      );
    });

    // Handle errors gracefully
    this.client.on(Events.Error, (err) => {
      logger.error({ err: err.message }, 'Discord client error');
    });

    return new Promise<void>((resolve) => {
      this.client!.once(Events.ClientReady, (readyClient) => {
        logger.info(
          { username: readyClient.user.tag, id: readyClient.user.id },
          'Discord bot connected',
        );
        console.log(`\n Discord bot: ${readyClient.user.tag}`);
        console.log(
          ` Use /chatid command or check channel IDs in Discord settings\n`,
        );
        resolve();
      });

      this.client!.login(this.botToken);
    });
  }

  async sendMessage(jid: string, text: string): Promise<void> {
    if (!this.client) {
      logger.warn('Discord client not initialized');
      return;
    }

    try {
      const channelId = jid.replace(/^dc:/, '');
      const channel = await this.client.channels.fetch(channelId);

      if (!channel || !('send' in channel)) {
        logger.warn({ jid }, 'Discord channel not found or not text-based');
        return;
      }

      const textChannel = channel as TextChannel;

      // Discord has a 2000 character limit per message — split if needed
      const MAX_LENGTH = 2000;
      if (text.length <= MAX_LENGTH) {
        await textChannel.send(text);
      } else {
        for (let i = 0; i < text.length; i += MAX_LENGTH) {
          await textChannel.send(text.slice(i, i + MAX_LENGTH));
        }
      }
      logger.info({ jid, length: text.length }, 'Discord message sent');
    } catch (err) {
      logger.error({ jid, err }, 'Failed to send Discord message');
    }
  }

  isConnected(): boolean {
    return this.client !== null && this.client.isReady();
  }

  ownsJid(jid: string): boolean {
    return jid.startsWith('dc:');
  }

  async disconnect(): Promise<void> {
    if (this.client) {
      this.client.destroy();
      this.client = null;
      logger.info('Discord bot stopped');
    }
  }

  async setTyping(jid: string, isTyping: boolean): Promise<void> {
    if (!this.client || !isTyping) return;
    try {
      const channelId = jid.replace(/^dc:/, '');
      const channel = await this.client.channels.fetch(channelId);
      if (channel && 'sendTyping' in channel) {
        await (channel as TextChannel).sendTyping();
      }
    } catch (err) {
      logger.debug({ jid, err }, 'Failed to send Discord typing indicator');
    }
  }
}
20  .claude/skills/add-discord/manifest.yaml  Normal file
@@ -0,0 +1,20 @@
skill: discord
version: 1.0.0
description: "Discord Bot integration via discord.js"
core_version: 0.1.0
adds:
  - src/channels/discord.ts
  - src/channels/discord.test.ts
modifies:
  - src/index.ts
  - src/config.ts
  - src/routing.test.ts
structured:
  npm_dependencies:
    discord.js: "^14.18.0"
  env_additions:
    - DISCORD_BOT_TOKEN
    - DISCORD_ONLY
conflicts: []
depends: []
test: "npx vitest run src/channels/discord.test.ts"
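The `structured` block above declares changes that are not plain three-way text merges: npm dependencies and env additions. A minimal sketch of how an `npm_dependencies` entry could be folded into a `package.json` object; `mergeNpmDependencies` and `existing-dep` are illustrative names, not the engine's actual API.

```typescript
// Illustrative sketch: fold a skill's declared npm_dependencies into
// package.json, keeping existing entries unless the skill re-declares them.
type Pkg = { dependencies: Record<string, string> };

function mergeNpmDependencies(pkg: Pkg, declared: Record<string, string>): Pkg {
  return {
    ...pkg,
    dependencies: { ...pkg.dependencies, ...declared },
  };
}

// 'existing-dep' is a placeholder for whatever the core already depends on.
const merged = mergeNpmDependencies(
  { dependencies: { 'existing-dep': '^1.0.0' } },
  { 'discord.js': '^14.18.0' },
);
console.log(merged.dependencies);
```

Because the declared map is spread last, a skill that re-declares an existing dependency wins, which is why `conflicts:` declarations matter when two skills pin the same package.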
76  .claude/skills/add-discord/modify/src/config.ts  Normal file
@@ -0,0 +1,76 @@
import path from 'path';

import { readEnvFile } from './env.js';

// Read config values from .env (process.env takes precedence).
// Secrets are NOT read here — they stay on disk and are loaded only
// where needed (container-runner.ts) to avoid leaking to child processes.
const envConfig = readEnvFile([
  'ASSISTANT_NAME',
  'ASSISTANT_HAS_OWN_NUMBER',
  'DISCORD_BOT_TOKEN',
  'DISCORD_ONLY',
]);

export const ASSISTANT_NAME =
  process.env.ASSISTANT_NAME || envConfig.ASSISTANT_NAME || 'Andy';
export const ASSISTANT_HAS_OWN_NUMBER =
  (process.env.ASSISTANT_HAS_OWN_NUMBER || envConfig.ASSISTANT_HAS_OWN_NUMBER) === 'true';
export const POLL_INTERVAL = 2000;
export const SCHEDULER_POLL_INTERVAL = 60000;

// Absolute paths needed for container mounts
const PROJECT_ROOT = process.cwd();
const HOME_DIR = process.env.HOME || '/Users/user';

// Mount security: allowlist stored OUTSIDE project root, never mounted into containers
export const MOUNT_ALLOWLIST_PATH = path.join(
  HOME_DIR,
  '.config',
  'nanoclaw',
  'mount-allowlist.json',
);
export const STORE_DIR = path.resolve(PROJECT_ROOT, 'store');
export const GROUPS_DIR = path.resolve(PROJECT_ROOT, 'groups');
export const DATA_DIR = path.resolve(PROJECT_ROOT, 'data');
export const MAIN_GROUP_FOLDER = 'main';

export const CONTAINER_IMAGE =
  process.env.CONTAINER_IMAGE || 'nanoclaw-agent:latest';
export const CONTAINER_TIMEOUT = parseInt(
  process.env.CONTAINER_TIMEOUT || '1800000',
  10,
);
export const CONTAINER_MAX_OUTPUT_SIZE = parseInt(
  process.env.CONTAINER_MAX_OUTPUT_SIZE || '10485760',
  10,
); // 10MB default
export const IPC_POLL_INTERVAL = 1000;
export const IDLE_TIMEOUT = parseInt(
  process.env.IDLE_TIMEOUT || '1800000',
  10,
); // 30min default — how long to keep container alive after last result
export const MAX_CONCURRENT_CONTAINERS = Math.max(
  1,
  parseInt(process.env.MAX_CONCURRENT_CONTAINERS || '5', 10) || 5,
);

function escapeRegex(str: string): string {
  return str.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

export const TRIGGER_PATTERN = new RegExp(
  `^@${escapeRegex(ASSISTANT_NAME)}\\b`,
  'i',
);

// Timezone for scheduled tasks (cron expressions, etc.)
// Uses system timezone by default
export const TIMEZONE =
  process.env.TZ || Intl.DateTimeFormat().resolvedOptions().timeZone;

// Discord configuration
export const DISCORD_BOT_TOKEN =
  process.env.DISCORD_BOT_TOKEN || envConfig.DISCORD_BOT_TOKEN || '';
export const DISCORD_ONLY =
  (process.env.DISCORD_ONLY || envConfig.DISCORD_ONLY) === 'true';
21  .claude/skills/add-discord/modify/src/config.ts.intent.md  Normal file
@@ -0,0 +1,21 @@
# Intent: src/config.ts modifications

## What changed
Added two new configuration exports for Discord channel support.

## Key sections
- **readEnvFile call**: Must include `DISCORD_BOT_TOKEN` and `DISCORD_ONLY` in the keys array. NanoClaw does NOT load `.env` into `process.env` — all `.env` values must be explicitly requested via `readEnvFile()`.
- **DISCORD_BOT_TOKEN**: Read from `process.env` first, then `envConfig` fallback, defaults to empty string (channel disabled when empty)
- **DISCORD_ONLY**: Boolean flag from `process.env` or `envConfig`, when `true` disables WhatsApp channel creation

## Invariants
- All existing config exports remain unchanged
- New Discord keys are added to the `readEnvFile` call alongside existing keys
- New exports are appended at the end of the file
- No existing behavior is modified — Discord config is additive only
- Both `process.env` and `envConfig` are checked (same pattern as `ASSISTANT_NAME`)

## Must-keep
- All existing exports (`ASSISTANT_NAME`, `POLL_INTERVAL`, `TRIGGER_PATTERN`, etc.)
- The `readEnvFile` pattern — ALL config read from `.env` must go through this function
- The `escapeRegex` helper and `TRIGGER_PATTERN` construction
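The precedence the intent doc describes (`process.env` first, then `readEnvFile` values, then a default) can be sketched in isolation. `readValue` and the inline `overrides`/`envConfig` objects are illustrative stand-ins, not NanoClaw's actual helpers.

```typescript
// Illustrative sketch of the config precedence: an explicit override source
// (process.env in the real code) wins, then values read from .env, then a default.
const overrides: Record<string, string | undefined> = {}; // stands in for process.env
const envConfig: Record<string, string | undefined> = {
  DISCORD_BOT_TOKEN: 'token-from-dot-env',
};

function readValue(key: string, fallback = ''): string {
  return overrides[key] || envConfig[key] || fallback;
}

const DISCORD_BOT_TOKEN = readValue('DISCORD_BOT_TOKEN'); // empty string disables the channel
const DISCORD_ONLY = readValue('DISCORD_ONLY') === 'true'; // unset flag reads as false
```

Using `||` rather than `??` means an empty-string override also falls through to the `.env` value, matching the pattern used for `ASSISTANT_NAME`.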
537  .claude/skills/add-discord/modify/src/index.ts  Normal file
@@ -0,0 +1,537 @@
|
||||
import { execSync } from 'child_process';
|
||||
import fs from 'fs';
|
||||
import path from 'path';
|
||||
|
||||
import {
|
||||
ASSISTANT_NAME,
|
||||
DATA_DIR,
|
||||
DISCORD_BOT_TOKEN,
|
||||
DISCORD_ONLY,
|
||||
IDLE_TIMEOUT,
|
||||
MAIN_GROUP_FOLDER,
|
||||
POLL_INTERVAL,
|
||||
TRIGGER_PATTERN,
|
||||
} from './config.js';
|
||||
import { DiscordChannel } from './channels/discord.js';
|
||||
import { WhatsAppChannel } from './channels/whatsapp.js';
|
||||
import {
|
||||
ContainerOutput,
|
||||
runContainerAgent,
|
||||
writeGroupsSnapshot,
|
||||
writeTasksSnapshot,
|
||||
} from './container-runner.js';
|
||||
import {
|
||||
getAllChats,
|
||||
getAllRegisteredGroups,
|
||||
getAllSessions,
|
||||
getAllTasks,
|
||||
getMessagesSince,
|
||||
getNewMessages,
|
||||
getRouterState,
|
||||
initDatabase,
|
||||
setRegisteredGroup,
|
||||
setRouterState,
|
||||
setSession,
|
||||
storeChatMetadata,
|
||||
storeMessage,
|
||||
} from './db.js';
|
||||
import { GroupQueue } from './group-queue.js';
|
||||
import { startIpcWatcher } from './ipc.js';
|
||||
import { findChannel, formatMessages, formatOutbound } from './router.js';
|
||||
import { startSchedulerLoop } from './task-scheduler.js';
|
||||
import { Channel, NewMessage, RegisteredGroup } from './types.js';
|
||||
import { logger } from './logger.js';
|
||||
|
||||
// Re-export for backwards compatibility during refactor
|
||||
export { escapeXml, formatMessages } from './router.js';
|
||||
|
||||
let lastTimestamp = '';
|
||||
let sessions: Record<string, string> = {};
|
||||
let registeredGroups: Record<string, RegisteredGroup> = {};
|
||||
let lastAgentTimestamp: Record<string, string> = {};
|
||||
let messageLoopRunning = false;
|
||||
|
||||
let whatsapp: WhatsAppChannel;
|
||||
const channels: Channel[] = [];
|
||||
const queue = new GroupQueue();
|
||||
|
||||
function loadState(): void {
|
||||
lastTimestamp = getRouterState('last_timestamp') || '';
|
||||
const agentTs = getRouterState('last_agent_timestamp');
|
||||
try {
|
||||
lastAgentTimestamp = agentTs ? JSON.parse(agentTs) : {};
|
||||
} catch {
|
||||
logger.warn('Corrupted last_agent_timestamp in DB, resetting');
|
||||
lastAgentTimestamp = {};
|
||||
}
|
||||
sessions = getAllSessions();
|
||||
registeredGroups = getAllRegisteredGroups();
|
||||
logger.info(
|
||||
{ groupCount: Object.keys(registeredGroups).length },
|
||||
'State loaded',
|
||||
);
|
||||
}
|
||||
|
||||
function saveState(): void {
|
||||
setRouterState('last_timestamp', lastTimestamp);
|
||||
setRouterState(
|
||||
'last_agent_timestamp',
|
||||
JSON.stringify(lastAgentTimestamp),
|
||||
);
|
||||
}
|
||||
|
||||
function registerGroup(jid: string, group: RegisteredGroup): void {
|
||||
registeredGroups[jid] = group;
|
||||
setRegisteredGroup(jid, group);
|
||||
|
||||
// Create group folder
|
||||
const groupDir = path.join(DATA_DIR, '..', 'groups', group.folder);
|
||||
fs.mkdirSync(path.join(groupDir, 'logs'), { recursive: true });
|
||||
|
||||
logger.info(
|
||||
{ jid, name: group.name, folder: group.folder },
|
||||
'Group registered',
|
||||
);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get available groups list for the agent.
|
||||
* Returns groups ordered by most recent activity.
|
||||
*/
|
||||
export function getAvailableGroups(): import('./container-runner.js').AvailableGroup[] {
|
||||
const chats = getAllChats();
|
||||
const registeredJids = new Set(Object.keys(registeredGroups));
|
||||
|
||||
return chats
|
||||
.filter((c) => c.jid !== '__group_sync__' && c.is_group)
|
||||
.map((c) => ({
|
||||
jid: c.jid,
|
||||
name: c.name,
|
||||
lastActivity: c.last_message_time,
|
||||
isRegistered: registeredJids.has(c.jid),
|
||||
}));
|
||||
}
|
||||
|
||||
/** @internal - exported for testing */
|
||||
export function _setRegisteredGroups(groups: Record<string, RegisteredGroup>): void {
|
||||
registeredGroups = groups;
|
||||
}
|
||||
|
||||
/**
|
||||
* Process all pending messages for a group.
|
||||
* Called by the GroupQueue when it's this group's turn.
|
||||
*/
|
||||
async function processGroupMessages(chatJid: string): Promise<boolean> {
|
||||
const group = registeredGroups[chatJid];
|
||||
if (!group) return true;
|
||||
|
||||
const channel = findChannel(channels, chatJid);
|
||||
if (!channel) return true;
|
||||
|
||||
const isMainGroup = group.folder === MAIN_GROUP_FOLDER;
|
||||
|
||||
const sinceTimestamp = lastAgentTimestamp[chatJid] || '';
|
||||
const missedMessages = getMessagesSince(chatJid, sinceTimestamp, ASSISTANT_NAME);
|
||||
|
||||
if (missedMessages.length === 0) return true;
|
||||
|
||||
// For non-main groups, check if trigger is required and present
|
||||
if (!isMainGroup && group.requiresTrigger !== false) {
|
||||
const hasTrigger = missedMessages.some((m) =>
|
||||
TRIGGER_PATTERN.test(m.content.trim()),
|
||||
);
|
||||
if (!hasTrigger) return true;
|
||||
}
|
||||
|
||||
const prompt = formatMessages(missedMessages);
|
||||
|
||||
// Advance cursor so the piping path in startMessageLoop won't re-fetch
|
||||
// these messages. Save the old cursor so we can roll back on error.
|
||||
const previousCursor = lastAgentTimestamp[chatJid] || '';
|
||||
lastAgentTimestamp[chatJid] =
|
||||
missedMessages[missedMessages.length - 1].timestamp;
|
||||
saveState();
|
||||
|
||||
logger.info(
|
||||
{ group: group.name, messageCount: missedMessages.length },
|
||||
'Processing messages',
|
||||
);
|
||||
|
||||
// Track idle timer for closing stdin when agent is idle
|
||||
let idleTimer: ReturnType<typeof setTimeout> | null = null;
|
||||
|
||||
const resetIdleTimer = () => {
|
||||
if (idleTimer) clearTimeout(idleTimer);
|
||||
idleTimer = setTimeout(() => {
|
||||
logger.debug({ group: group.name }, 'Idle timeout, closing container stdin');
|
||||
queue.closeStdin(chatJid);
|
||||
}, IDLE_TIMEOUT);
|
||||
};
|
||||
|
||||
await channel.setTyping?.(chatJid, true);
|
||||
let hadError = false;
|
||||
let outputSentToUser = false;
|
||||
|
||||
const output = await runAgent(group, prompt, chatJid, async (result) => {
|
||||
// Streaming output callback — called for each agent result
|
||||
if (result.result) {
|
||||
const raw = typeof result.result === 'string' ? result.result : JSON.stringify(result.result);
|
||||
// Strip <internal>...</internal> blocks — agent uses these for internal reasoning
|
||||
const text = raw.replace(/<internal>[\s\S]*?<\/internal>/g, '').trim();
|
||||
logger.info({ group: group.name }, `Agent output: ${raw.slice(0, 200)}`);
|
||||
if (text) {
|
||||
await channel.sendMessage(chatJid, text);
|
||||
outputSentToUser = true;
|
||||
}
|
||||
// Only reset idle timer on actual results, not session-update markers (result: null)
|
||||
resetIdleTimer();
|
||||
}
|
||||
|
||||
if (result.status === 'error') {
|
||||
hadError = true;
|
||||
}
|
||||
});
|
||||
|
||||
await channel.setTyping?.(chatJid, false);
|
||||
if (idleTimer) clearTimeout(idleTimer);
|
||||
|
||||
if (output === 'error' || hadError) {
|
||||
// If we already sent output to the user, don't roll back the cursor —
|
||||
// the user got their response and re-processing would send duplicates.
|
||||
if (outputSentToUser) {
|
||||
logger.warn({ group: group.name }, 'Agent error after output was sent, skipping cursor rollback to prevent duplicates');
|
||||
return true;
|
||||
}
|
||||
// Roll back cursor so retries can re-process these messages
|
||||
lastAgentTimestamp[chatJid] = previousCursor;
|
||||
saveState();
|
||||
logger.warn({ group: group.name }, 'Agent error, rolled back message cursor for retry');
|
||||
return false;
|
||||
}
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
async function runAgent(
|
||||
group: RegisteredGroup,
|
||||
prompt: string,
|
||||
chatJid: string,
|
||||
onOutput?: (output: ContainerOutput) => Promise<void>,
|
||||
): Promise<'success' | 'error'> {
|
||||
const isMain = group.folder === MAIN_GROUP_FOLDER;
|
||||
const sessionId = sessions[group.folder];
|
||||
|
||||
// Update tasks snapshot for container to read (filtered by group)
|
||||
const tasks = getAllTasks();
|
||||
writeTasksSnapshot(
|
||||
group.folder,
|
||||
isMain,
|
||||
tasks.map((t) => ({
|
||||
id: t.id,
|
||||
groupFolder: t.group_folder,
|
||||
prompt: t.prompt,
|
||||
schedule_type: t.schedule_type,
|
||||
schedule_value: t.schedule_value,
|
||||
status: t.status,
|
||||
next_run: t.next_run,
|
||||
})),
|
||||
);
|
||||
|
||||
// Update available groups snapshot (main group only can see all groups)
|
||||
const availableGroups = getAvailableGroups();
|
||||
writeGroupsSnapshot(
|
||||
group.folder,
|
||||
isMain,
|
||||
availableGroups,
|
||||
new Set(Object.keys(registeredGroups)),
|
||||
);
|
||||
|
||||
// Wrap onOutput to track session ID from streamed results
|
||||
const wrappedOnOutput = onOutput
|
||||
? async (output: ContainerOutput) => {
|
||||
if (output.newSessionId) {
|
||||
sessions[group.folder] = output.newSessionId;
|
||||
setSession(group.folder, output.newSessionId);
|
||||
}
|
||||
await onOutput(output);
|
||||
}
|
||||
: undefined;
|
||||
|
||||
try {
|
||||
const output = await runContainerAgent(
|
||||
group,
|
||||
{
|
||||
prompt,
|
||||
sessionId,
|
||||
groupFolder: group.folder,
|
||||
chatJid,
|
||||
isMain,
|
||||
},
|
||||
(proc, containerName) => queue.registerProcess(chatJid, proc, containerName, group.folder),
|
||||
wrappedOnOutput,
|
||||
);
|
||||
|
||||
if (output.newSessionId) {
|
||||
sessions[group.folder] = output.newSessionId;
|
||||
setSession(group.folder, output.newSessionId);
|
||||
}
|
||||
|
||||
if (output.status === 'error') {
|
||||
logger.error(
|
||||
{ group: group.name, error: output.error },
|
||||
'Container agent error',
|
||||
);
|
||||
return 'error';
|
||||
}
|
||||
|
||||
return 'success';
|
||||
} catch (err) {
|
||||
logger.error({ group: group.name, err }, 'Agent error');
|
||||
return 'error';
|
||||
}
|
||||
}
|
||||
|
||||
async function startMessageLoop(): Promise<void> {
|
||||
if (messageLoopRunning) {
|
||||
logger.debug('Message loop already running, skipping duplicate start');
|
||||
return;
|
||||
}
|
||||
messageLoopRunning = true;
|
||||
|
||||
logger.info(`NanoClaw running (trigger: @${ASSISTANT_NAME})`);
|
||||
|
||||
while (true) {
|
||||
try {
|
||||
const jids = Object.keys(registeredGroups);
|
||||
const { messages, newTimestamp } = getNewMessages(jids, lastTimestamp, ASSISTANT_NAME);
|
||||
|
||||
if (messages.length > 0) {
|
||||
logger.info({ count: messages.length }, 'New messages');
|
||||
|
||||
// Advance the "seen" cursor for all messages immediately
|
||||
lastTimestamp = newTimestamp;
|
||||
saveState();
|
||||
|
||||
// Deduplicate by group
|
||||
const messagesByGroup = new Map<string, NewMessage[]>();
|
||||
for (const msg of messages) {
|
||||
const existing = messagesByGroup.get(msg.chat_jid);
|
||||
if (existing) {
|
||||
existing.push(msg);
|
||||
} else {
|
||||
messagesByGroup.set(msg.chat_jid, [msg]);
|
||||
}
|
||||
}
|
||||
|
||||
for (const [chatJid, groupMessages] of messagesByGroup) {
|
||||
const group = registeredGroups[chatJid];
|
||||
if (!group) continue;
|
||||
|
||||
const channel = findChannel(channels, chatJid);
|
||||
if (!channel) continue;
|
||||
|
||||
const isMainGroup = group.folder === MAIN_GROUP_FOLDER;
|
||||
const needsTrigger = !isMainGroup && group.requiresTrigger !== false;
|
||||
|
||||
// For non-main groups, only act on trigger messages.
|
||||
// Non-trigger messages accumulate in DB and get pulled as
|
||||
// context when a trigger eventually arrives.
|
||||
if (needsTrigger) {
|
||||
const hasTrigger = groupMessages.some((m) =>
|
||||
TRIGGER_PATTERN.test(m.content.trim()),
|
||||
);
|
||||
if (!hasTrigger) continue;
|
||||
}
|
||||
|
||||
// Pull all messages since lastAgentTimestamp so non-trigger
|
||||
// context that accumulated between triggers is included.
|
||||
const allPending = getMessagesSince(
|
||||
chatJid,
|
||||
lastAgentTimestamp[chatJid] || '',
|
||||
ASSISTANT_NAME,
|
||||
);
|
||||
const messagesToSend =
|
||||
allPending.length > 0 ? allPending : groupMessages;
|
||||
const formatted = formatMessages(messagesToSend);
|
||||
|
||||
if (queue.sendMessage(chatJid, formatted)) {
|
||||
logger.debug(
|
||||
{ chatJid, count: messagesToSend.length },
|
||||
'Piped messages to active container',
|
||||
);
|
||||
lastAgentTimestamp[chatJid] =
|
||||
messagesToSend[messagesToSend.length - 1].timestamp;
|
||||
saveState();
|
||||
// Show typing indicator while the container processes the piped message
|
||||
channel.setTyping?.(chatJid, true);
|
||||
} else {
|
||||
// No active container — enqueue for a new one
|
||||
queue.enqueueMessageCheck(chatJid);
|
||||
}
|
||||
}
|
||||
}
|
||||
} catch (err) {
|
||||
logger.error({ err }, 'Error in message loop');
|
||||
}
|
||||
await new Promise((resolve) => setTimeout(resolve, POLL_INTERVAL));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Startup recovery: check for unprocessed messages in registered groups.
|
||||
* Handles crash between advancing lastTimestamp and processing messages.
|
||||
*/
|
||||
function recoverPendingMessages(): void {
|
||||
for (const [chatJid, group] of Object.entries(registeredGroups)) {
|
||||
const sinceTimestamp = lastAgentTimestamp[chatJid] || '';
|
||||
const pending = getMessagesSince(chatJid, sinceTimestamp, ASSISTANT_NAME);
|
||||
if (pending.length > 0) {
|
||||
logger.info(
|
||||
{ group: group.name, pendingCount: pending.length },
|
||||
'Recovery: found unprocessed messages',
|
||||
);
|
||||
queue.enqueueMessageCheck(chatJid);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
function ensureContainerSystemRunning(): void {
|
||||
try {
|
||||
execSync('container system status', { stdio: 'pipe' });
|
||||
logger.debug('Apple Container system already running');
|
||||
} catch {
|
||||
logger.info('Starting Apple Container system...');
|
||||
try {
|
||||
execSync('container system start', { stdio: 'pipe', timeout: 30000 });
|
||||
logger.info('Apple Container system started');
|
||||
} catch (err) {
|
||||
logger.error({ err }, 'Failed to start Apple Container system');
|
||||
console.error(
|
||||
'\n╔════════════════════════════════════════════════════════════════╗',
|
||||
);
|
||||
console.error(
|
||||
'║ FATAL: Apple Container system failed to start ║',
|
||||
);
|
||||
console.error(
|
||||
'║ ║',
|
||||
);
|
||||
console.error(
|
||||
'║ Agents cannot run without Apple Container. To fix: ║',
|
||||
);
|
||||
console.error(
|
||||
'║ 1. Install from: https://github.com/apple/container/releases ║',
|
||||
);
|
||||
console.error(
|
||||
'║ 2. Run: container system start ║',
|
||||
);
|
||||
console.error(
|
||||
'║ 3. Restart NanoClaw ║',
|
||||
);
|
||||
console.error(
|
||||
'╚════════════════════════════════════════════════════════════════╝\n',
|
||||
);
|
||||
throw new Error('Apple Container system is required but failed to start');
|
||||
}
|
||||
}
|
||||
|
||||
// Kill and clean up orphaned NanoClaw containers from previous runs
|
||||
try {
|
||||
    const output = execSync('container ls --format json', {
      stdio: ['pipe', 'pipe', 'pipe'],
      encoding: 'utf-8',
    });
    const containers: { status: string; configuration: { id: string } }[] = JSON.parse(output || '[]');
    const orphans = containers
      .filter((c) => c.status === 'running' && c.configuration.id.startsWith('nanoclaw-'))
      .map((c) => c.configuration.id);
    for (const name of orphans) {
      try {
        execSync(`container stop ${name}`, { stdio: 'pipe' });
      } catch { /* already stopped */ }
    }
    if (orphans.length > 0) {
      logger.info({ count: orphans.length, names: orphans }, 'Stopped orphaned containers');
    }
  } catch (err) {
    logger.warn({ err }, 'Failed to clean up orphaned containers');
  }
}

async function main(): Promise<void> {
  ensureContainerSystemRunning();
  initDatabase();
  logger.info('Database initialized');
  loadState();

  // Graceful shutdown handlers
  const shutdown = async (signal: string) => {
    logger.info({ signal }, 'Shutdown signal received');
    await queue.shutdown(10000);
    for (const ch of channels) await ch.disconnect();
    process.exit(0);
  };
  process.on('SIGTERM', () => shutdown('SIGTERM'));
  process.on('SIGINT', () => shutdown('SIGINT'));

  // Channel callbacks (shared by all channels)
  const channelOpts = {
    onMessage: (_chatJid: string, msg: NewMessage) => storeMessage(msg),
    onChatMetadata: (chatJid: string, timestamp: string, name?: string, channel?: string, isGroup?: boolean) =>
      storeChatMetadata(chatJid, timestamp, name, channel, isGroup),
    registeredGroups: () => registeredGroups,
  };

  // Create and connect channels
  if (DISCORD_BOT_TOKEN) {
    const discord = new DiscordChannel(DISCORD_BOT_TOKEN, channelOpts);
    channels.push(discord);
    await discord.connect();
  }

  if (!DISCORD_ONLY) {
    whatsapp = new WhatsAppChannel(channelOpts);
    channels.push(whatsapp);
    await whatsapp.connect();
  }

  // Start subsystems (independently of connection handler)
  startSchedulerLoop({
    registeredGroups: () => registeredGroups,
    getSessions: () => sessions,
    queue,
    onProcess: (groupJid, proc, containerName, groupFolder) => queue.registerProcess(groupJid, proc, containerName, groupFolder),
    sendMessage: async (jid, rawText) => {
      const channel = findChannel(channels, jid);
      if (!channel) return;
      const text = formatOutbound(rawText);
      if (text) await channel.sendMessage(jid, text);
    },
  });
  startIpcWatcher({
    sendMessage: (jid, text) => {
      const channel = findChannel(channels, jid);
      if (!channel) throw new Error(`No channel for JID: ${jid}`);
      return channel.sendMessage(jid, text);
    },
    registeredGroups: () => registeredGroups,
    registerGroup,
    syncGroupMetadata: (force) => whatsapp?.syncGroupMetadata(force) ?? Promise.resolve(),
    getAvailableGroups,
    writeGroupsSnapshot: (gf, im, ag, rj) => writeGroupsSnapshot(gf, im, ag, rj),
  });
  queue.setProcessMessagesFn(processGroupMessages);
  recoverPendingMessages();
  startMessageLoop();
}

// Guard: only run when executed directly, not when imported by tests
const isDirectRun =
  process.argv[1] &&
  new URL(import.meta.url).pathname === new URL(`file://${process.argv[1]}`).pathname;

if (isDirectRun) {
  main().catch((err) => {
    logger.error({ err }, 'Failed to start NanoClaw');
    process.exit(1);
  });
}

.claude/skills/add-discord/modify/src/index.ts.intent.md (new file, 43 lines)
@@ -0,0 +1,43 @@
# Intent: src/index.ts modifications

## What changed
Added Discord as a channel option alongside WhatsApp, introducing multi-channel infrastructure.

## Key sections

### Imports (top of file)
- Added: `DiscordChannel` from `./channels/discord.js`
- Added: `DISCORD_BOT_TOKEN`, `DISCORD_ONLY` from `./config.js`
- Added: `findChannel` from `./router.js`
- Added: `Channel` from `./types.js`

### Multi-channel infrastructure
- Added: `const channels: Channel[] = []` array to hold all active channels
- Changed: `processGroupMessages` uses `findChannel(channels, chatJid)` instead of `whatsapp` directly
- Changed: `startMessageLoop` uses `findChannel(channels, chatJid)` instead of `whatsapp` directly
- Changed: `channel.setTyping?.()` instead of `whatsapp.setTyping()`
- Changed: `channel.sendMessage()` instead of `whatsapp.sendMessage()`

### getAvailableGroups()
- Unchanged: uses `c.is_group` filter from base (Discord channels pass `isGroup=true` via `onChatMetadata`)

### main()
- Added: `channelOpts` shared callback object for all channels
- Changed: WhatsApp conditional to `if (!DISCORD_ONLY)`
- Added: conditional Discord creation (`if (DISCORD_BOT_TOKEN)`)
- Changed: shutdown iterates `channels` array instead of just `whatsapp`
- Changed: subsystems use `findChannel(channels, jid)` for message routing

## Invariants
- All existing message processing logic (triggers, cursors, idle timers) is preserved
- The `runAgent` function is completely unchanged
- State management (loadState/saveState) is unchanged
- Recovery logic is unchanged
- Apple Container check is unchanged (ensureContainerSystemRunning)

## Must-keep
- The `escapeXml` and `formatMessages` re-exports
- The `_setRegisteredGroups` test helper
- The `isDirectRun` guard at bottom
- All error handling and cursor rollback logic in processGroupMessages
- The outgoing queue flush and reconnection logic (in WhatsAppChannel, not here)
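
The routing helper these notes revolve around can be pictured as follows. This is a minimal illustration, assuming only that `Channel` exposes `ownsJid()`; the real definition lives in `src/router.ts` and `src/types.ts` and may differ:

```typescript
// Minimal slice of the Channel contract needed for routing; the full
// interface in src/types.ts has more members (connect, sendMessage, ...).
interface RoutableChannel {
  name: string;
  ownsJid(jid: string): boolean;
}

// First channel that claims the JID wins; undefined means unroutable,
// which the orchestrator logs as a warning instead of silently dropping.
function findChannel(channels: RoutableChannel[], jid: string): RoutableChannel | undefined {
  return channels.find((ch) => ch.ownsJid(jid));
}

const whatsapp: RoutableChannel = {
  name: 'whatsapp',
  ownsJid: (j) => j.endsWith('@g.us') || j.endsWith('@s.whatsapp.net'),
};
const discord: RoutableChannel = {
  name: 'discord',
  ownsJid: (j) => j.startsWith('dc:'),
};

console.log(findChannel([whatsapp, discord], 'dc:555')?.name); // discord
console.log(findChannel([whatsapp, discord], '123@g.us')?.name); // whatsapp
```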

.claude/skills/add-discord/modify/src/routing.test.ts (new file, 147 lines)
@@ -0,0 +1,147 @@
import { describe, it, expect, beforeEach } from 'vitest';

import { _initTestDatabase, getAllChats, storeChatMetadata } from './db.js';
import { getAvailableGroups, _setRegisteredGroups } from './index.js';

beforeEach(() => {
  _initTestDatabase();
  _setRegisteredGroups({});
});

// --- JID ownership patterns ---

describe('JID ownership patterns', () => {
  // These test the patterns that will become ownsJid() on the Channel interface

  it('WhatsApp group JID: ends with @g.us', () => {
    const jid = '12345678@g.us';
    expect(jid.endsWith('@g.us')).toBe(true);
  });

  it('Discord JID: starts with dc:', () => {
    const jid = 'dc:1234567890123456';
    expect(jid.startsWith('dc:')).toBe(true);
  });

  it('WhatsApp DM JID: ends with @s.whatsapp.net', () => {
    const jid = '12345678@s.whatsapp.net';
    expect(jid.endsWith('@s.whatsapp.net')).toBe(true);
  });
});

// --- getAvailableGroups ---

describe('getAvailableGroups', () => {
  it('returns only groups, excludes DMs', () => {
    storeChatMetadata('group1@g.us', '2024-01-01T00:00:01.000Z', 'Group 1', 'whatsapp', true);
    storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM', 'whatsapp', false);
    storeChatMetadata('group2@g.us', '2024-01-01T00:00:03.000Z', 'Group 2', 'whatsapp', true);

    const groups = getAvailableGroups();
    expect(groups).toHaveLength(2);
    expect(groups.map((g) => g.jid)).toContain('group1@g.us');
    expect(groups.map((g) => g.jid)).toContain('group2@g.us');
    expect(groups.map((g) => g.jid)).not.toContain('user@s.whatsapp.net');
  });

  it('includes Discord channel JIDs', () => {
    storeChatMetadata('dc:1234567890123456', '2024-01-01T00:00:01.000Z', 'Discord Channel', 'discord', true);
    storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM', 'whatsapp', false);

    const groups = getAvailableGroups();
    expect(groups).toHaveLength(1);
    expect(groups[0].jid).toBe('dc:1234567890123456');
  });

  it('marks registered Discord channels correctly', () => {
    storeChatMetadata('dc:1234567890123456', '2024-01-01T00:00:01.000Z', 'DC Registered', 'discord', true);
    storeChatMetadata('dc:9999999999999999', '2024-01-01T00:00:02.000Z', 'DC Unregistered', 'discord', true);

    _setRegisteredGroups({
      'dc:1234567890123456': {
        name: 'DC Registered',
        folder: 'dc-registered',
        trigger: '@Andy',
        added_at: '2024-01-01T00:00:00.000Z',
      },
    });

    const groups = getAvailableGroups();
    const dcReg = groups.find((g) => g.jid === 'dc:1234567890123456');
    const dcUnreg = groups.find((g) => g.jid === 'dc:9999999999999999');

    expect(dcReg?.isRegistered).toBe(true);
    expect(dcUnreg?.isRegistered).toBe(false);
  });

  it('excludes __group_sync__ sentinel', () => {
    storeChatMetadata('__group_sync__', '2024-01-01T00:00:00.000Z');
    storeChatMetadata('group@g.us', '2024-01-01T00:00:01.000Z', 'Group', 'whatsapp', true);

    const groups = getAvailableGroups();
    expect(groups).toHaveLength(1);
    expect(groups[0].jid).toBe('group@g.us');
  });

  it('marks registered groups correctly', () => {
    storeChatMetadata('reg@g.us', '2024-01-01T00:00:01.000Z', 'Registered', 'whatsapp', true);
    storeChatMetadata('unreg@g.us', '2024-01-01T00:00:02.000Z', 'Unregistered', 'whatsapp', true);

    _setRegisteredGroups({
      'reg@g.us': {
        name: 'Registered',
        folder: 'registered',
        trigger: '@Andy',
        added_at: '2024-01-01T00:00:00.000Z',
      },
    });

    const groups = getAvailableGroups();
    const reg = groups.find((g) => g.jid === 'reg@g.us');
    const unreg = groups.find((g) => g.jid === 'unreg@g.us');

    expect(reg?.isRegistered).toBe(true);
    expect(unreg?.isRegistered).toBe(false);
  });

  it('returns groups ordered by most recent activity', () => {
    storeChatMetadata('old@g.us', '2024-01-01T00:00:01.000Z', 'Old', 'whatsapp', true);
    storeChatMetadata('new@g.us', '2024-01-01T00:00:05.000Z', 'New', 'whatsapp', true);
    storeChatMetadata('mid@g.us', '2024-01-01T00:00:03.000Z', 'Mid', 'whatsapp', true);

    const groups = getAvailableGroups();
    expect(groups[0].jid).toBe('new@g.us');
    expect(groups[1].jid).toBe('mid@g.us');
    expect(groups[2].jid).toBe('old@g.us');
  });

  it('excludes non-group chats regardless of JID format', () => {
    // Unknown JID format stored without is_group should not appear
    storeChatMetadata('unknown-format-123', '2024-01-01T00:00:01.000Z', 'Unknown');
    // Explicitly non-group with unusual JID
    storeChatMetadata('custom:abc', '2024-01-01T00:00:02.000Z', 'Custom DM', 'custom', false);
    // A real group for contrast
    storeChatMetadata('group@g.us', '2024-01-01T00:00:03.000Z', 'Group', 'whatsapp', true);

    const groups = getAvailableGroups();
    expect(groups).toHaveLength(1);
    expect(groups[0].jid).toBe('group@g.us');
  });

  it('returns empty array when no chats exist', () => {
    const groups = getAvailableGroups();
    expect(groups).toHaveLength(0);
  });

  it('mixes WhatsApp and Discord chats ordered by activity', () => {
    storeChatMetadata('wa@g.us', '2024-01-01T00:00:01.000Z', 'WhatsApp', 'whatsapp', true);
    storeChatMetadata('dc:555', '2024-01-01T00:00:03.000Z', 'Discord', 'discord', true);
    storeChatMetadata('wa2@g.us', '2024-01-01T00:00:02.000Z', 'WhatsApp 2', 'whatsapp', true);

    const groups = getAvailableGroups();
    expect(groups).toHaveLength(3);
    expect(groups[0].jid).toBe('dc:555');
    expect(groups[1].jid).toBe('wa2@g.us');
    expect(groups[2].jid).toBe('wa@g.us');
  });
});

.claude/skills/add-discord/tests/discord.test.ts (new file, 133 lines)
@@ -0,0 +1,133 @@
import { describe, expect, it } from 'vitest';
import fs from 'fs';
import path from 'path';

describe('discord skill package', () => {
  const skillDir = path.resolve(__dirname, '..');

  it('has a valid manifest', () => {
    const manifestPath = path.join(skillDir, 'manifest.yaml');
    expect(fs.existsSync(manifestPath)).toBe(true);

    const content = fs.readFileSync(manifestPath, 'utf-8');
    expect(content).toContain('skill: discord');
    expect(content).toContain('version: 1.0.0');
    expect(content).toContain('discord.js');
  });

  it('has all files declared in adds', () => {
    const addFile = path.join(skillDir, 'add', 'src', 'channels', 'discord.ts');
    expect(fs.existsSync(addFile)).toBe(true);

    const content = fs.readFileSync(addFile, 'utf-8');
    expect(content).toContain('class DiscordChannel');
    expect(content).toContain('implements Channel');

    // Test file for the channel
    const testFile = path.join(skillDir, 'add', 'src', 'channels', 'discord.test.ts');
    expect(fs.existsSync(testFile)).toBe(true);

    const testContent = fs.readFileSync(testFile, 'utf-8');
    expect(testContent).toContain("describe('DiscordChannel'");
  });

  it('has all files declared in modifies', () => {
    const indexFile = path.join(skillDir, 'modify', 'src', 'index.ts');
    const configFile = path.join(skillDir, 'modify', 'src', 'config.ts');
    const routingTestFile = path.join(skillDir, 'modify', 'src', 'routing.test.ts');

    expect(fs.existsSync(indexFile)).toBe(true);
    expect(fs.existsSync(configFile)).toBe(true);
    expect(fs.existsSync(routingTestFile)).toBe(true);

    const indexContent = fs.readFileSync(indexFile, 'utf-8');
    expect(indexContent).toContain('DiscordChannel');
    expect(indexContent).toContain('DISCORD_BOT_TOKEN');
    expect(indexContent).toContain('DISCORD_ONLY');
    expect(indexContent).toContain('findChannel');
    expect(indexContent).toContain('channels: Channel[]');

    const configContent = fs.readFileSync(configFile, 'utf-8');
    expect(configContent).toContain('DISCORD_BOT_TOKEN');
    expect(configContent).toContain('DISCORD_ONLY');
  });

  it('has intent files for modified files', () => {
    expect(fs.existsSync(path.join(skillDir, 'modify', 'src', 'index.ts.intent.md'))).toBe(true);
    expect(fs.existsSync(path.join(skillDir, 'modify', 'src', 'config.ts.intent.md'))).toBe(true);
  });

  it('modified index.ts preserves core structure', () => {
    const content = fs.readFileSync(
      path.join(skillDir, 'modify', 'src', 'index.ts'),
      'utf-8',
    );

    // Core functions still present
    expect(content).toContain('function loadState()');
    expect(content).toContain('function saveState()');
    expect(content).toContain('function registerGroup(');
    expect(content).toContain('function getAvailableGroups()');
    expect(content).toContain('function processGroupMessages(');
    expect(content).toContain('function runAgent(');
    expect(content).toContain('function startMessageLoop()');
    expect(content).toContain('function recoverPendingMessages()');
    expect(content).toContain('function ensureContainerSystemRunning()');
    expect(content).toContain('async function main()');

    // Test helper preserved
    expect(content).toContain('_setRegisteredGroups');

    // Direct-run guard preserved
    expect(content).toContain('isDirectRun');
  });

  it('modified index.ts includes Discord channel creation', () => {
    const content = fs.readFileSync(
      path.join(skillDir, 'modify', 'src', 'index.ts'),
      'utf-8',
    );

    // Multi-channel architecture
    expect(content).toContain('const channels: Channel[] = []');
    expect(content).toContain('channels.push(whatsapp)');
    expect(content).toContain('channels.push(discord)');

    // Conditional channel creation
    expect(content).toContain('if (!DISCORD_ONLY)');
    expect(content).toContain('if (DISCORD_BOT_TOKEN)');

    // Shutdown disconnects all channels
    expect(content).toContain('for (const ch of channels) await ch.disconnect()');
  });

  it('modified config.ts preserves all existing exports', () => {
    const content = fs.readFileSync(
      path.join(skillDir, 'modify', 'src', 'config.ts'),
      'utf-8',
    );

    // All original exports preserved
    expect(content).toContain('export const ASSISTANT_NAME');
    expect(content).toContain('export const POLL_INTERVAL');
    expect(content).toContain('export const TRIGGER_PATTERN');
    expect(content).toContain('export const CONTAINER_IMAGE');
    expect(content).toContain('export const DATA_DIR');
    expect(content).toContain('export const TIMEZONE');

    // Discord exports added
    expect(content).toContain('export const DISCORD_BOT_TOKEN');
    expect(content).toContain('export const DISCORD_ONLY');
  });

  it('modified routing.test.ts includes Discord JID tests', () => {
    const content = fs.readFileSync(
      path.join(skillDir, 'modify', 'src', 'routing.test.ts'),
      'utf-8',
    );

    expect(content).toContain("Discord JID: starts with dc:");
    expect(content).toContain("dc:1234567890123456");
    expect(content).toContain("dc:");
  });
});

.claude/skills/add-telegram/SKILL.md
@@ -5,26 +5,70 @@ description: Add Telegram as a channel. Can replace WhatsApp entirely or run alo
# Add Telegram Channel

This skill adds Telegram support to NanoClaw using the skills engine for deterministic code changes, then walks through interactive setup.

## Phase 1: Pre-flight

### Check if already applied

Read `.nanoclaw/state.yaml`. If `telegram` is in `applied_skills`, skip to Phase 3 (Setup). The code changes are already in place.

### Ask the user

1. **Mode**: Replace WhatsApp or add alongside it?
   - Replace → will set `TELEGRAM_ONLY=true`
   - Alongside → both channels active (default)

2. **Do they already have a bot token?** If yes, collect it now. If no, we'll create one in Phase 3.

## Phase 2: Apply Code Changes

Run the skills engine to apply this skill's code package. The package files are in this directory alongside this SKILL.md.

### Initialize skills system (if needed)

If the `.nanoclaw/` directory doesn't exist yet:

```bash
npx tsx scripts/apply-skill.ts --init
```

Or call `initSkillsSystem()` from `skills-engine/migrate.ts`.

### Apply the skill

```bash
npx tsx scripts/apply-skill.ts .claude/skills/add-telegram
```

This deterministically:
- Adds `src/channels/telegram.ts` (TelegramChannel class implementing the Channel interface)
- Adds `src/channels/telegram.test.ts` (46 unit tests)
- Three-way merges Telegram support into `src/index.ts` (multi-channel support, findChannel routing)
- Three-way merges Telegram config into `src/config.ts` (TELEGRAM_BOT_TOKEN, TELEGRAM_ONLY exports)
- Three-way merges updated routing tests into `src/routing.test.ts`
- Installs the `grammy` npm dependency (a modern, TypeScript-first Telegram bot framework)
- Updates `.env.example` with `TELEGRAM_BOT_TOKEN` and `TELEGRAM_ONLY`
- Records the application in `.nanoclaw/state.yaml`
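
The exact schema of `.nanoclaw/state.yaml` isn't reproduced in this diff; a plausible minimal shape, consistent with the `applied_skills` check in Phase 1 (only that key is attested here, anything else would be a guess):

```yaml
# Hypothetical illustration of .nanoclaw/state.yaml; only applied_skills
# is attested by this document.
applied_skills:
  - telegram
```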

If the apply reports merge conflicts, read the intent files:
- `modify/src/index.ts.intent.md` — what changed and invariants for index.ts
- `modify/src/config.ts.intent.md` — what changed for config.ts

### Validate code changes

```bash
npm test
npm run build
```

All tests must pass (including the new Telegram tests) and the build must be clean before proceeding.

## Phase 3: Setup

### Create Telegram Bot (if needed)

If the user doesn't have a bot token, tell them:

> I need you to create a Telegram bot:
>
@@ -34,531 +78,92 @@ Tell the user:
> - Bot username: Must end with "bot" (e.g., "andy_ai_bot")
> 3. Copy the bot token (looks like `123456:ABC-DEF1234ghIkl-zyx57W2v1u123ew11`)

Wait for the user to provide the token.

### Get Chat ID

Tell the user:

> To register a chat, you need its Chat ID. Here's how:
>
> **For Private Chat (DM with bot):**
> 1. Search for your bot in Telegram
> 2. Start a chat and send any message
> 3. I'll add a `/chatid` command to help you get the ID
>
> **For Group Chat:**
> 1. Add your bot to the group
> 2. Send any message
> 3. Use the `/chatid` command in the group

### Disable Group Privacy (for group chats)

Tell the user:

> **Important for group chats**: By default, Telegram bots in groups only receive messages that @mention the bot or are commands. To let the bot see all messages (needed for `requiresTrigger: false` or trigger-word detection):
>
> 1. Open Telegram and search for `@BotFather`
> 2. Send `/mybots` and select your bot
> 3. Go to **Bot Settings** > **Group Privacy**
> 4. Select **Turn off**
>
> Without this, the bot will only see messages that directly @mention it.

This step is optional if the user only wants trigger-based responses via @mentioning the bot.

## Questions to Ask

Before making changes, ask:

1. **Mode**: Replace WhatsApp or add alongside it?
   - If replace: Set `TELEGRAM_ONLY=true`
   - If alongside: Both will run

2. **Chat behavior**: Should this chat respond to all messages or only when @mentioned?
   - Main chat: Responds to all (set `requiresTrigger: false`)
   - Other chats: Default requires trigger (`requiresTrigger: true`)
## Architecture

NanoClaw uses a **Channel abstraction** (`Channel` interface in `src/types.ts`). Each messaging platform implements this interface. Key files:

| File | Purpose |
|------|---------|
| `src/types.ts` | `Channel` interface definition |
| `src/channels/whatsapp.ts` | `WhatsAppChannel` class (reference implementation) |
| `src/router.ts` | `findChannel()`, `routeOutbound()`, `formatOutbound()` |
| `src/index.ts` | Orchestrator: creates channels, wires callbacks, starts subsystems |
| `src/ipc.ts` | IPC watcher (uses `sendMessage` dep for outbound) |

The Telegram channel follows the same pattern as WhatsApp:
- Implements `Channel` interface (`connect`, `sendMessage`, `ownsJid`, `disconnect`, `setTyping`)
- Delivers inbound messages via `onMessage` / `onChatMetadata` callbacks
- The existing message loop in `src/index.ts` picks up stored messages automatically
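
Pieced together from the method names above, the contract looks roughly like this. This is a sketch, not the authoritative definition in `src/types.ts`:

```typescript
// Approximate Channel contract as described in this document; exact
// signatures in src/types.ts may differ.
interface Channel {
  name: string;
  // Whether formatOutbound() should prepend the assistant name
  // (Telegram sets this to false since bots already display a name).
  prefixAssistantName: boolean;
  connect(): Promise<void>;
  disconnect(): Promise<void>;
  sendMessage(jid: string, text: string): Promise<void>;
  // JID ownership test used by findChannel() routing.
  ownsJid(jid: string): boolean;
  // Optional: not every platform has a typing indicator.
  setTyping?(jid: string, isTyping: boolean): Promise<void>;
}

// A do-nothing stub showing the minimal shape a new platform must fill in.
const stub: Channel = {
  name: 'stub',
  prefixAssistantName: true,
  async connect() {},
  async disconnect() {},
  async sendMessage(_jid, _text) {},
  ownsJid: (jid) => jid.startsWith('stub:'),
};

console.log(stub.ownsJid('stub:42')); // true
```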

## Implementation

### Step 1: Update Configuration

Read `src/config.ts` and add Telegram config exports:

```typescript
export const TELEGRAM_BOT_TOKEN = process.env.TELEGRAM_BOT_TOKEN || "";
export const TELEGRAM_ONLY = process.env.TELEGRAM_ONLY === "true";
```

These should be added near the top with other configuration exports.

### Step 2: Create Telegram Channel

Create `src/channels/telegram.ts` implementing the `Channel` interface. Use `src/channels/whatsapp.ts` as a reference for the pattern.

```typescript
import { Bot } from "grammy";

import {
  ASSISTANT_NAME,
  TRIGGER_PATTERN,
} from "../config.js";
import { logger } from "../logger.js";
import { Channel, OnInboundMessage, OnChatMetadata, RegisteredGroup } from "../types.js";

export interface TelegramChannelOpts {
  onMessage: OnInboundMessage;
  onChatMetadata: OnChatMetadata;
  registeredGroups: () => Record<string, RegisteredGroup>;
}

export class TelegramChannel implements Channel {
  name = "telegram";
  prefixAssistantName = false; // Telegram bots already display their name

  private bot: Bot | null = null;
  private opts: TelegramChannelOpts;
  private botToken: string;

  constructor(botToken: string, opts: TelegramChannelOpts) {
    this.botToken = botToken;
    this.opts = opts;
  }

  async connect(): Promise<void> {
    this.bot = new Bot(this.botToken);

    // Command to get chat ID (useful for registration)
    this.bot.command("chatid", (ctx) => {
      const chatId = ctx.chat.id;
      const chatType = ctx.chat.type;
      const chatName =
        chatType === "private"
          ? ctx.from?.first_name || "Private"
          : (ctx.chat as any).title || "Unknown";

      ctx.reply(
        `Chat ID: \`tg:${chatId}\`\nName: ${chatName}\nType: ${chatType}`,
        { parse_mode: "Markdown" },
      );
    });

    // Command to check bot status
    this.bot.command("ping", (ctx) => {
      ctx.reply(`${ASSISTANT_NAME} is online.`);
    });

    this.bot.on("message:text", async (ctx) => {
      // Skip commands
      if (ctx.message.text.startsWith("/")) return;

      const chatJid = `tg:${ctx.chat.id}`;
      let content = ctx.message.text;
      const timestamp = new Date(ctx.message.date * 1000).toISOString();
      const senderName =
        ctx.from?.first_name ||
        ctx.from?.username ||
        ctx.from?.id.toString() ||
        "Unknown";
      const sender = ctx.from?.id.toString() || "";
      const msgId = ctx.message.message_id.toString();

      // Determine chat name
      const chatName =
        ctx.chat.type === "private"
          ? senderName
          : (ctx.chat as any).title || chatJid;

      // Translate Telegram @bot_username mentions into TRIGGER_PATTERN format.
      // Telegram @mentions (e.g., @andy_ai_bot) won't match TRIGGER_PATTERN
      // (e.g., ^@Andy\b), so we prepend the trigger when the bot is @mentioned.
      const botUsername = ctx.me?.username?.toLowerCase();
      if (botUsername) {
        const entities = ctx.message.entities || [];
        const isBotMentioned = entities.some((entity) => {
          if (entity.type === "mention") {
            const mentionText = content
              .substring(entity.offset, entity.offset + entity.length)
              .toLowerCase();
            return mentionText === `@${botUsername}`;
          }
          return false;
        });
        if (isBotMentioned && !TRIGGER_PATTERN.test(content)) {
          content = `@${ASSISTANT_NAME} ${content}`;
        }
      }

      // Store chat metadata for discovery
      this.opts.onChatMetadata(chatJid, timestamp, chatName);

      // Only deliver full message for registered groups
      const group = this.opts.registeredGroups()[chatJid];
      if (!group) {
        logger.debug(
          { chatJid, chatName },
          "Message from unregistered Telegram chat",
        );
        return;
      }

      // Deliver message — startMessageLoop() will pick it up
      this.opts.onMessage(chatJid, {
        id: msgId,
        chat_jid: chatJid,
        sender,
        sender_name: senderName,
        content,
        timestamp,
        is_from_me: false,
      });

      logger.info(
        { chatJid, chatName, sender: senderName },
        "Telegram message stored",
      );
    });

    // Handle non-text messages with placeholders so the agent knows something was sent
    const storeNonText = (ctx: any, placeholder: string) => {
      const chatJid = `tg:${ctx.chat.id}`;
      const group = this.opts.registeredGroups()[chatJid];
      if (!group) return;

      const timestamp = new Date(ctx.message.date * 1000).toISOString();
      const senderName =
        ctx.from?.first_name || ctx.from?.username || ctx.from?.id?.toString() || "Unknown";
      const caption = ctx.message.caption ? ` ${ctx.message.caption}` : "";

      this.opts.onChatMetadata(chatJid, timestamp);
      this.opts.onMessage(chatJid, {
        id: ctx.message.message_id.toString(),
        chat_jid: chatJid,
        sender: ctx.from?.id?.toString() || "",
        sender_name: senderName,
        content: `${placeholder}${caption}`,
        timestamp,
        is_from_me: false,
      });
    };

    this.bot.on("message:photo", (ctx) => storeNonText(ctx, "[Photo]"));
    this.bot.on("message:video", (ctx) => storeNonText(ctx, "[Video]"));
    this.bot.on("message:voice", (ctx) => storeNonText(ctx, "[Voice message]"));
    this.bot.on("message:audio", (ctx) => storeNonText(ctx, "[Audio]"));
    this.bot.on("message:document", (ctx) => {
      const name = ctx.message.document?.file_name || "file";
      storeNonText(ctx, `[Document: ${name}]`);
    });
    this.bot.on("message:sticker", (ctx) => {
      const emoji = ctx.message.sticker?.emoji || "";
      storeNonText(ctx, `[Sticker ${emoji}]`);
    });
    this.bot.on("message:location", (ctx) => storeNonText(ctx, "[Location]"));
    this.bot.on("message:contact", (ctx) => storeNonText(ctx, "[Contact]"));

    // Handle errors gracefully
    this.bot.catch((err) => {
      logger.error({ err: err.message }, "Telegram bot error");
    });

    // Start polling — returns a Promise that resolves when started
    return new Promise<void>((resolve) => {
      this.bot!.start({
        onStart: (botInfo) => {
          logger.info(
            { username: botInfo.username, id: botInfo.id },
            "Telegram bot connected",
          );
          console.log(`\n  Telegram bot: @${botInfo.username}`);
          console.log(
            `  Send /chatid to the bot to get a chat's registration ID\n`,
          );
          resolve();
        },
      });
    });
  }

  async sendMessage(jid: string, text: string): Promise<void> {
    if (!this.bot) {
      logger.warn("Telegram bot not initialized");
      return;
    }

    try {
      const numericId = jid.replace(/^tg:/, "");

      // Telegram has a 4096 character limit per message — split if needed
      const MAX_LENGTH = 4096;
      if (text.length <= MAX_LENGTH) {
        await this.bot.api.sendMessage(numericId, text);
      } else {
        for (let i = 0; i < text.length; i += MAX_LENGTH) {
          await this.bot.api.sendMessage(numericId, text.slice(i, i + MAX_LENGTH));
        }
      }
      logger.info({ jid, length: text.length }, "Telegram message sent");
    } catch (err) {
      logger.error({ jid, err }, "Failed to send Telegram message");
    }
  }

  isConnected(): boolean {
    return this.bot !== null;
  }

  ownsJid(jid: string): boolean {
    return jid.startsWith("tg:");
  }

  async disconnect(): Promise<void> {
    if (this.bot) {
      this.bot.stop();
      this.bot = null;
      logger.info("Telegram bot stopped");
    }
  }

  async setTyping(jid: string, isTyping: boolean): Promise<void> {
    if (!this.bot || !isTyping) return;
    try {
      const numericId = jid.replace(/^tg:/, "");
      await this.bot.api.sendChatAction(numericId, "typing");
    } catch (err) {
      logger.debug({ jid, err }, "Failed to send Telegram typing indicator");
    }
  }
}
```

Key differences from the old standalone `src/telegram.ts`:
- Implements `Channel` interface — same pattern as `WhatsAppChannel`
- Uses `onMessage` / `onChatMetadata` callbacks instead of importing DB functions directly
- Registration check via `registeredGroups()` callback, not `getAllRegisteredGroups()`
- `prefixAssistantName = false` — Telegram bots already show their name, so `formatOutbound()` skips the prefix
- No `storeMessageDirect` needed — `storeMessage()` in db.ts already accepts `NewMessage` directly
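
One way `formatOutbound()` could honor that flag is sketched below, under the assumption that it only prepends the assistant name when the channel opts in; the real helper lives in `src/router.ts` and may differ:

```typescript
const ASSISTANT_NAME = 'Andy'; // placeholder; the real value comes from src/config.ts

interface OutboundChannel {
  prefixAssistantName: boolean;
}

// Returns null for empty output so callers can skip the send entirely.
function formatOutbound(channel: OutboundChannel, rawText: string): string | null {
  const text = rawText.trim();
  if (!text) return null;
  return channel.prefixAssistantName ? `${ASSISTANT_NAME}: ${text}` : text;
}

console.log(formatOutbound({ prefixAssistantName: true }, 'hello')); // Andy: hello
console.log(formatOutbound({ prefixAssistantName: false }, 'hello')); // hello
```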
|
||||
|
||||
### Step 3: Update Main Application

Modify `src/index.ts` to support multiple channels. Read the file first to understand the current structure.

1. **Add imports** at the top:

```typescript
import { TelegramChannel } from "./channels/telegram.js";
import { TELEGRAM_BOT_TOKEN, TELEGRAM_ONLY } from "./config.js";
import { findChannel } from "./router.js";
```

2. **Add a channels array** alongside the existing `whatsapp` variable:

```typescript
let whatsapp: WhatsAppChannel;
const channels: Channel[] = [];
```

Import `Channel` from `./types.js` if not already imported.

3. **Update `processGroupMessages`** to find the correct channel for the JID instead of using `whatsapp` directly. Replace the direct `whatsapp.setTyping()` and `whatsapp.sendMessage()` calls:

```typescript
// Find the channel that owns this JID
const channel = findChannel(channels, chatJid);
if (!channel) return true; // No channel for this JID

// ... (existing code for message fetching, trigger check, formatting)

await channel.setTyping?.(chatJid, true);
// ... (existing agent invocation, replacing whatsapp.sendMessage with channel.sendMessage)
await channel.setTyping?.(chatJid, false);
```

In the `onOutput` callback inside `processGroupMessages`, replace:

```typescript
await whatsapp.sendMessage(chatJid, `${ASSISTANT_NAME}: ${text}`);
```

with:

```typescript
const formatted = formatOutbound(channel, text);
if (formatted) await channel.sendMessage(chatJid, formatted);
```

4. **Update the `main()` function** to create channels conditionally and use them for deps:

```typescript
async function main(): Promise<void> {
  ensureContainerSystemRunning();
  initDatabase();
  logger.info('Database initialized');
  loadState();

  // Graceful shutdown handlers
  const shutdown = async (signal: string) => {
    logger.info({ signal }, 'Shutdown signal received');
    await queue.shutdown(10000);
    for (const ch of channels) await ch.disconnect();
    process.exit(0);
  };
  process.on('SIGTERM', () => shutdown('SIGTERM'));
  process.on('SIGINT', () => shutdown('SIGINT'));

  // Channel callbacks (shared by all channels)
  const channelOpts = {
    onMessage: (chatJid: string, msg: NewMessage) => storeMessage(msg),
    onChatMetadata: (chatJid: string, timestamp: string, name?: string) =>
      storeChatMetadata(chatJid, timestamp, name),
    registeredGroups: () => registeredGroups,
  };

  // Create and connect channels
  if (!TELEGRAM_ONLY) {
    whatsapp = new WhatsAppChannel(channelOpts);
    channels.push(whatsapp);
    await whatsapp.connect();
  }

  if (TELEGRAM_BOT_TOKEN) {
    const telegram = new TelegramChannel(TELEGRAM_BOT_TOKEN, channelOpts);
    channels.push(telegram);
    await telegram.connect();
  }

  // Start subsystems
  startSchedulerLoop({
    registeredGroups: () => registeredGroups,
    getSessions: () => sessions,
    queue,
    onProcess: (groupJid, proc, containerName, groupFolder) =>
      queue.registerProcess(groupJid, proc, containerName, groupFolder),
    sendMessage: async (jid, rawText) => {
      const channel = findChannel(channels, jid);
      if (!channel) return;
      const text = formatOutbound(channel, rawText);
      if (text) await channel.sendMessage(jid, text);
    },
  });
  startIpcWatcher({
    sendMessage: (jid, text) => {
      const channel = findChannel(channels, jid);
      if (!channel) throw new Error(`No channel for JID: ${jid}`);
      return channel.sendMessage(jid, text);
    },
    registeredGroups: () => registeredGroups,
    registerGroup,
    syncGroupMetadata: (force) => whatsapp?.syncGroupMetadata(force) ?? Promise.resolve(),
    getAvailableGroups,
    writeGroupsSnapshot: (gf, im, ag, rj) => writeGroupsSnapshot(gf, im, ag, rj),
  });
  queue.setProcessMessagesFn(processGroupMessages);
  recoverPendingMessages();
  startMessageLoop();
}
```

5. **Update `getAvailableGroups`** to include Telegram chats:

```typescript
export function getAvailableGroups(): AvailableGroup[] {
  const chats = getAllChats();
  const registeredJids = new Set(Object.keys(registeredGroups));

  return chats
    .filter((c) => c.jid !== '__group_sync__' && (c.jid.endsWith('@g.us') || c.jid.startsWith('tg:')))
    .map((c) => ({
      jid: c.jid,
      name: c.name,
      lastActivity: c.last_message_time,
      isRegistered: registeredJids.has(c.jid),
    }));
}
```

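For reference, `findChannel` can be as simple as a linear scan over each channel's `ownsJid()`. A minimal sketch of what `src/router.ts` might contain (the real version reportedly logs a warning for unroutable JIDs instead of failing silently; the channel shape here is trimmed to what routing needs):

```typescript
// Minimal channel shape needed for routing (the real Channel interface has more methods).
interface RoutableChannel {
  ownsJid(jid: string): boolean;
}

// Return the first channel that claims the JID, or undefined if none does.
function findChannel<T extends RoutableChannel>(channels: T[], jid: string): T | undefined {
  return channels.find((ch) => ch.ownsJid(jid));
}

// Example channels: WhatsApp-style JIDs end in @g.us / @s.whatsapp.net,
// Telegram-style JIDs carry a "tg:" prefix.
const whatsappLike = {
  ownsJid: (jid: string) => jid.endsWith("@g.us") || jid.endsWith("@s.whatsapp.net"),
};
const telegramLike = { ownsJid: (jid: string) => jid.startsWith("tg:") };
```

First-match-wins ordering means channels must claim disjoint JID namespaces, which the prefix/suffix conventions above guarantee.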
### Step 4: Configure Environment

Add to `.env`:

```bash
TELEGRAM_BOT_TOKEN=YOUR_BOT_TOKEN_HERE

# Optional: Set to "true" to disable WhatsApp entirely
# TELEGRAM_ONLY=true
```

If the user chose to replace WhatsApp, also set:

```bash
TELEGRAM_ONLY=true
```

**Important**: After modifying `.env`, sync to the container environment:

```bash
mkdir -p data/env && cp .env data/env/env
```

The container reads environment from `data/env/env`, not `.env` directly.

### Step 5: Register a Telegram Chat

After installing and starting the bot, tell the user:

> 1. Send `/chatid` to your bot (in private chat or in a group)
> 2. Copy the chat ID (e.g., `tg:123456789` or `tg:-1001234567890`)
> 3. I'll register it for you
>
> **Important for group chats**: By default, Telegram bots only see @mentions and commands in groups. To let the bot see all messages:
>
> 1. Open Telegram and search for `@BotFather`
> 2. Send `/mybots` and select your bot
> 3. Go to **Bot Settings** > **Group Privacy** > **Turn off**
>
> This is optional if you only want trigger-based responses via @mentioning the bot.

Registration uses the `registerGroup()` function in `src/index.ts`, which writes to SQLite and creates the group folder structure. Call it like this (or add a one-time script):

```typescript
// For private chat (main group):
registerGroup("tg:123456789", {
  name: "Personal",
  folder: "main",
  trigger: `@${ASSISTANT_NAME}`,
  added_at: new Date().toISOString(),
  requiresTrigger: false, // main group responds to all messages
});

// For group chat (note negative ID for Telegram groups):
registerGroup("tg:-1001234567890", {
  name: "My Telegram Group",
  folder: "telegram-group",
  trigger: `@${ASSISTANT_NAME}`,
  added_at: new Date().toISOString(),
  requiresTrigger: true, // only respond when triggered
});
```

The `RegisteredGroup` type requires a `trigger` string field and has an optional `requiresTrigger` boolean (defaults to `true`). Set `requiresTrigger: false` for chats that should respond to all messages.
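The resulting trigger check reduces to something like the sketch below (`shouldRespond` and `RegisteredGroupLite` are illustrative names, not the actual code in `src/index.ts`):

```typescript
// Trimmed-down registration record for illustration only.
interface RegisteredGroupLite {
  trigger: string;
  requiresTrigger?: boolean; // defaults to true when omitted
}

// Respond if the chat opted out of triggers, or the message carries the trigger prefix.
function shouldRespond(group: RegisteredGroupLite, text: string): boolean {
  const requires = group.requiresTrigger ?? true;
  return !requires || text.startsWith(group.trigger);
}
```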

Alternatively, if the agent is already running in the main group, it can register new groups via IPC using the `register_group` task type.

### Step 6: Build and Restart

```bash
npm run build
launchctl kickstart -k gui/$(id -u)/com.nanoclaw
```

Or for systemd:

```bash
npm run build
systemctl --user restart nanoclaw
```

### Step 7: Test

Tell the user:

> - For main chat: Any message works
> - For non-main: `@Andy hello` or @mention the bot
>
> Check logs: `tail -f logs/nanoclaw.log`
> The bot should respond within a few seconds.

## Replace WhatsApp Entirely

If the user wants Telegram-only:

1. Set `TELEGRAM_ONLY=true` in `.env`
2. Run `cp .env data/env/env` to sync to the container
3. The WhatsApp channel is not created — only Telegram
4. All services (scheduler, IPC watcher, queue, message loop) start normally
5. Optionally remove the `@whiskeysockets/baileys` dependency (it's harmless to keep)

## Features

### Chat ID Formats

- **WhatsApp**: `120363336345536173@g.us` (groups) or `1234567890@s.whatsapp.net` (DM)
- **Telegram**: `tg:123456789` (positive for private) or `tg:-1001234567890` (negative for groups)

### Trigger Options

The bot responds when:

1. Chat has `requiresTrigger: false` in its registration (e.g., main group)
2. Bot is @mentioned in Telegram (translated to TRIGGER_PATTERN automatically)
3. Message matches TRIGGER_PATTERN directly (e.g., starts with @Andy)

Telegram @mentions (e.g., `@andy_ai_bot`) are automatically translated: if the bot is @mentioned and the message doesn't already match TRIGGER_PATTERN, the trigger prefix is prepended before storing. This ensures @mentioning the bot always triggers a response.
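The translation rule can be sketched as follows (illustrative names; the real logic lives in the Telegram channel's text handler and works off grammy's message entities):

```typescript
// Telegram marks @mentions as entities with offsets into the text.
interface MentionEntity {
  type: string;
  offset: number;
  length: number;
}

// Prepend the trigger prefix when the bot itself is @mentioned but the text
// doesn't already match the trigger pattern; otherwise store the text as-is.
function translateMention(
  text: string,
  entities: MentionEntity[],
  botUsername: string,    // e.g. "andy_ai_bot"
  triggerPattern: RegExp, // e.g. /^@Andy\b/i
  triggerPrefix: string,  // e.g. "@Andy"
): string {
  const mentionsBot = entities.some(
    (e) =>
      e.type === "mention" &&
      text.slice(e.offset, e.offset + e.length) === `@${botUsername}`,
  );
  return mentionsBot && !triggerPattern.test(text) ? `${triggerPrefix} ${text}` : text;
}
```

Mentions of other bots pass through unchanged, and text that already matches the trigger is never double-prefixed.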

**Group Privacy**: The bot must have Group Privacy disabled in BotFather to see non-mention messages in groups. See Prerequisites step 4.

### Commands

- `/chatid` - Get chat ID for registration
- `/ping` - Check if bot is online

## Troubleshooting

### Bot not responding

Check:

1. `TELEGRAM_BOT_TOKEN` is set in `.env` AND synced to `data/env/env`
2. Chat is registered in SQLite: `sqlite3 store/messages.db "SELECT * FROM registered_groups WHERE jid LIKE 'tg:%'"`
3. For non-main chats: message must include the trigger pattern
4. Service is running: `launchctl list | grep nanoclaw`

### Bot only responds to @mentions in groups

The bot has Group Privacy enabled (the default). It can only see messages that @mention it or are commands. To fix:

1. Open `@BotFather` in Telegram
2. `/mybots` > select bot > **Bot Settings** > **Group Privacy** > **Turn off**
3. Remove and re-add the bot to the group (required for the change to take effect)

### Getting chat ID

If `/chatid` doesn't work:

- Verify the bot token is valid: `curl -s "https://api.telegram.org/bot${TELEGRAM_BOT_TOKEN}/getMe"`
- Check the bot is started: `tail -f logs/nanoclaw.log`

### Service conflicts

If running `npm run dev` while the launchd service is active:

```bash
launchctl unload ~/Library/LaunchAgents/com.nanoclaw.plist
npm run dev
# When done testing:
launchctl load ~/Library/LaunchAgents/com.nanoclaw.plist
```

## Agent Swarms (Teams)

After completing the Telegram setup, ask the user:

> Would you like to add Agent Swarm support? Without it, Agent Teams still work — they just operate behind the scenes. With Swarm support, each subagent appears as a different bot in the Telegram group so you can see who's saying what and have interactive team sessions.

If they say yes, invoke the `/add-telegram-swarm` skill.

## Removal

To remove Telegram integration:

1. Delete `src/channels/telegram.ts`
2. Remove `TelegramChannel` import and creation from `src/index.ts`
3. Remove the `channels` array and revert to using `whatsapp` directly in `processGroupMessages`, scheduler deps, and IPC deps
4. Revert the `getAvailableGroups()` filter to only include `@g.us` chats
5. Remove Telegram config (`TELEGRAM_BOT_TOKEN`, `TELEGRAM_ONLY`) from `src/config.ts`
6. Remove Telegram registrations from SQLite: `sqlite3 store/messages.db "DELETE FROM registered_groups WHERE jid LIKE 'tg:%'"`
7. Uninstall: `npm uninstall grammy`
8. Rebuild: `npm run build && launchctl kickstart -k gui/$(id -u)/com.nanoclaw`

---

**New file:** `.claude/skills/add-telegram/add/src/channels/telegram.test.ts` (918 lines)

import { describe, it, expect, beforeEach, vi, afterEach } from 'vitest';

// --- Mocks ---

// Mock config
vi.mock('../config.js', () => ({
  ASSISTANT_NAME: 'Andy',
  TRIGGER_PATTERN: /^@Andy\b/i,
}));

// Mock logger
vi.mock('../logger.js', () => ({
  logger: {
    debug: vi.fn(),
    info: vi.fn(),
    warn: vi.fn(),
    error: vi.fn(),
  },
}));

// --- Grammy mock ---

type Handler = (...args: any[]) => any;

const botRef = vi.hoisted(() => ({ current: null as any }));

vi.mock('grammy', () => ({
  Bot: class MockBot {
    token: string;
    commandHandlers = new Map<string, Handler>();
    filterHandlers = new Map<string, Handler[]>();
    errorHandler: Handler | null = null;

    api = {
      sendMessage: vi.fn().mockResolvedValue(undefined),
      sendChatAction: vi.fn().mockResolvedValue(undefined),
    };

    constructor(token: string) {
      this.token = token;
      botRef.current = this;
    }

    command(name: string, handler: Handler) {
      this.commandHandlers.set(name, handler);
    }

    on(filter: string, handler: Handler) {
      const existing = this.filterHandlers.get(filter) || [];
      existing.push(handler);
      this.filterHandlers.set(filter, existing);
    }

    catch(handler: Handler) {
      this.errorHandler = handler;
    }

    start(opts: { onStart: (botInfo: any) => void }) {
      opts.onStart({ username: 'andy_ai_bot', id: 12345 });
    }

    stop() {}
  },
}));

import { TelegramChannel, TelegramChannelOpts } from './telegram.js';

// --- Test helpers ---

function createTestOpts(
  overrides?: Partial<TelegramChannelOpts>,
): TelegramChannelOpts {
  return {
    onMessage: vi.fn(),
    onChatMetadata: vi.fn(),
    registeredGroups: vi.fn(() => ({
      'tg:100200300': {
        name: 'Test Group',
        folder: 'test-group',
        trigger: '@Andy',
        added_at: '2024-01-01T00:00:00.000Z',
      },
    })),
    ...overrides,
  };
}

function createTextCtx(overrides: {
  chatId?: number;
  chatType?: string;
  chatTitle?: string;
  text: string;
  fromId?: number;
  firstName?: string;
  username?: string;
  messageId?: number;
  date?: number;
  entities?: any[];
}) {
  const chatId = overrides.chatId ?? 100200300;
  const chatType = overrides.chatType ?? 'group';
  return {
    chat: {
      id: chatId,
      type: chatType,
      title: overrides.chatTitle ?? 'Test Group',
    },
    from: {
      id: overrides.fromId ?? 99001,
      first_name: overrides.firstName ?? 'Alice',
      username: overrides.username ?? 'alice_user',
    },
    message: {
      text: overrides.text,
      date: overrides.date ?? Math.floor(Date.now() / 1000),
      message_id: overrides.messageId ?? 1,
      entities: overrides.entities ?? [],
    },
    me: { username: 'andy_ai_bot' },
    reply: vi.fn(),
  };
}

function createMediaCtx(overrides: {
  chatId?: number;
  chatType?: string;
  fromId?: number;
  firstName?: string;
  date?: number;
  messageId?: number;
  caption?: string;
  extra?: Record<string, any>;
}) {
  const chatId = overrides.chatId ?? 100200300;
  return {
    chat: {
      id: chatId,
      type: overrides.chatType ?? 'group',
      title: 'Test Group',
    },
    from: {
      id: overrides.fromId ?? 99001,
      first_name: overrides.firstName ?? 'Alice',
      username: 'alice_user',
    },
    message: {
      date: overrides.date ?? Math.floor(Date.now() / 1000),
      message_id: overrides.messageId ?? 1,
      caption: overrides.caption,
      ...(overrides.extra || {}),
    },
    me: { username: 'andy_ai_bot' },
  };
}

function currentBot() {
  return botRef.current;
}

async function triggerTextMessage(ctx: ReturnType<typeof createTextCtx>) {
  const handlers = currentBot().filterHandlers.get('message:text') || [];
  for (const h of handlers) await h(ctx);
}

async function triggerMediaMessage(
  filter: string,
  ctx: ReturnType<typeof createMediaCtx>,
) {
  const handlers = currentBot().filterHandlers.get(filter) || [];
  for (const h of handlers) await h(ctx);
}

// --- Tests ---

describe('TelegramChannel', () => {
  beforeEach(() => {
    vi.clearAllMocks();
  });

  afterEach(() => {
    vi.restoreAllMocks();
  });

  // --- Connection lifecycle ---

  describe('connection lifecycle', () => {
    it('resolves connect() when bot starts', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);

      await channel.connect();

      expect(channel.isConnected()).toBe(true);
    });

    it('registers command and message handlers on connect', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);

      await channel.connect();

      expect(currentBot().commandHandlers.has('chatid')).toBe(true);
      expect(currentBot().commandHandlers.has('ping')).toBe(true);
      expect(currentBot().filterHandlers.has('message:text')).toBe(true);
      expect(currentBot().filterHandlers.has('message:photo')).toBe(true);
      expect(currentBot().filterHandlers.has('message:video')).toBe(true);
      expect(currentBot().filterHandlers.has('message:voice')).toBe(true);
      expect(currentBot().filterHandlers.has('message:audio')).toBe(true);
      expect(currentBot().filterHandlers.has('message:document')).toBe(true);
      expect(currentBot().filterHandlers.has('message:sticker')).toBe(true);
      expect(currentBot().filterHandlers.has('message:location')).toBe(true);
      expect(currentBot().filterHandlers.has('message:contact')).toBe(true);
    });

    it('registers error handler on connect', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);

      await channel.connect();

      expect(currentBot().errorHandler).not.toBeNull();
    });

    it('disconnects cleanly', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);

      await channel.connect();
      expect(channel.isConnected()).toBe(true);

      await channel.disconnect();
      expect(channel.isConnected()).toBe(false);
    });

    it('isConnected() returns false before connect', () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);

      expect(channel.isConnected()).toBe(false);
    });
  });

  // --- Text message handling ---

  describe('text message handling', () => {
    it('delivers message for registered group', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({ text: 'Hello everyone' });
      await triggerTextMessage(ctx);

      expect(opts.onChatMetadata).toHaveBeenCalledWith(
        'tg:100200300',
        expect.any(String),
        'Test Group',
      );
      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({
          id: '1',
          chat_jid: 'tg:100200300',
          sender: '99001',
          sender_name: 'Alice',
          content: 'Hello everyone',
          is_from_me: false,
        }),
      );
    });

    it('only emits metadata for unregistered chats', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({ chatId: 999999, text: 'Unknown chat' });
      await triggerTextMessage(ctx);

      expect(opts.onChatMetadata).toHaveBeenCalledWith(
        'tg:999999',
        expect.any(String),
        'Test Group',
      );
      expect(opts.onMessage).not.toHaveBeenCalled();
    });

    it('skips command messages (starting with /)', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({ text: '/start' });
      await triggerTextMessage(ctx);

      expect(opts.onMessage).not.toHaveBeenCalled();
      expect(opts.onChatMetadata).not.toHaveBeenCalled();
    });

    it('extracts sender name from first_name', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({ text: 'Hi', firstName: 'Bob' });
      await triggerTextMessage(ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({ sender_name: 'Bob' }),
      );
    });

    it('falls back to username when first_name missing', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({ text: 'Hi' });
      ctx.from.first_name = undefined as any;
      await triggerTextMessage(ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({ sender_name: 'alice_user' }),
      );
    });

    it('falls back to user ID when name and username missing', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({ text: 'Hi', fromId: 42 });
      ctx.from.first_name = undefined as any;
      ctx.from.username = undefined as any;
      await triggerTextMessage(ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({ sender_name: '42' }),
      );
    });

    it('uses sender name as chat name for private chats', async () => {
      const opts = createTestOpts({
        registeredGroups: vi.fn(() => ({
          'tg:100200300': {
            name: 'Private',
            folder: 'private',
            trigger: '@Andy',
            added_at: '2024-01-01T00:00:00.000Z',
          },
        })),
      });
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({
        text: 'Hello',
        chatType: 'private',
        firstName: 'Alice',
      });
      await triggerTextMessage(ctx);

      expect(opts.onChatMetadata).toHaveBeenCalledWith(
        'tg:100200300',
        expect.any(String),
        'Alice', // Private chats use sender name
      );
    });

    it('uses chat title as name for group chats', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({
        text: 'Hello',
        chatType: 'supergroup',
        chatTitle: 'Project Team',
      });
      await triggerTextMessage(ctx);

      expect(opts.onChatMetadata).toHaveBeenCalledWith(
        'tg:100200300',
        expect.any(String),
        'Project Team',
      );
    });

    it('converts message.date to ISO timestamp', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const unixTime = 1704067200; // 2024-01-01T00:00:00.000Z
      const ctx = createTextCtx({ text: 'Hello', date: unixTime });
      await triggerTextMessage(ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({
          timestamp: '2024-01-01T00:00:00.000Z',
        }),
      );
    });
  });

  // --- @mention translation ---

  describe('@mention translation', () => {
    it('translates @bot_username mention to trigger format', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({
        text: '@andy_ai_bot what time is it?',
        entities: [{ type: 'mention', offset: 0, length: 12 }],
      });
      await triggerTextMessage(ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({
          content: '@Andy @andy_ai_bot what time is it?',
        }),
      );
    });

    it('does not translate if message already matches trigger', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({
        text: '@Andy @andy_ai_bot hello',
        entities: [{ type: 'mention', offset: 6, length: 12 }],
      });
      await triggerTextMessage(ctx);

      // Should NOT double-prepend — already starts with @Andy
      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({
          content: '@Andy @andy_ai_bot hello',
        }),
      );
    });

    it('does not translate mentions of other bots', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({
        text: '@some_other_bot hi',
        entities: [{ type: 'mention', offset: 0, length: 15 }],
      });
      await triggerTextMessage(ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({
          content: '@some_other_bot hi', // No translation
        }),
      );
    });

    it('handles mention in middle of message', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({
        text: 'hey @andy_ai_bot check this',
        entities: [{ type: 'mention', offset: 4, length: 12 }],
      });
      await triggerTextMessage(ctx);

      // Bot is mentioned, message doesn't match trigger → prepend trigger
      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({
          content: '@Andy hey @andy_ai_bot check this',
        }),
      );
    });

    it('handles message with no entities', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({ text: 'plain message' });
      await triggerTextMessage(ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({
          content: 'plain message',
        }),
      );
    });

    it('ignores non-mention entities', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createTextCtx({
        text: 'check https://example.com',
        entities: [{ type: 'url', offset: 6, length: 19 }],
      });
      await triggerTextMessage(ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({
          content: 'check https://example.com',
        }),
      );
    });
  });

  // --- Non-text messages ---

  describe('non-text messages', () => {
    it('stores photo with placeholder', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createMediaCtx({});
      await triggerMediaMessage('message:photo', ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({ content: '[Photo]' }),
      );
    });

    it('stores photo with caption', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createMediaCtx({ caption: 'Look at this' });
      await triggerMediaMessage('message:photo', ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({ content: '[Photo] Look at this' }),
      );
    });

    it('stores video with placeholder', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createMediaCtx({});
      await triggerMediaMessage('message:video', ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({ content: '[Video]' }),
      );
    });

    it('stores voice message with placeholder', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createMediaCtx({});
      await triggerMediaMessage('message:voice', ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({ content: '[Voice message]' }),
      );
    });

    it('stores audio with placeholder', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createMediaCtx({});
      await triggerMediaMessage('message:audio', ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
        expect.objectContaining({ content: '[Audio]' }),
      );
    });

    it('stores document with filename', async () => {
      const opts = createTestOpts();
      const channel = new TelegramChannel('test-token', opts);
      await channel.connect();

      const ctx = createMediaCtx({
        extra: { document: { file_name: 'report.pdf' } },
      });
      await triggerMediaMessage('message:document', ctx);

      expect(opts.onMessage).toHaveBeenCalledWith(
        'tg:100200300',
|
||||
expect.objectContaining({ content: '[Document: report.pdf]' }),
|
||||
);
|
||||
});
|
||||
|
||||
it('stores document with fallback name when filename missing', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
const ctx = createMediaCtx({ extra: { document: {} } });
|
||||
await triggerMediaMessage('message:document', ctx);
|
||||
|
||||
expect(opts.onMessage).toHaveBeenCalledWith(
|
||||
'tg:100200300',
|
||||
expect.objectContaining({ content: '[Document: file]' }),
|
||||
);
|
||||
});
|
||||
|
||||
it('stores sticker with emoji', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
const ctx = createMediaCtx({
|
||||
extra: { sticker: { emoji: '😂' } },
|
||||
});
|
||||
await triggerMediaMessage('message:sticker', ctx);
|
||||
|
||||
expect(opts.onMessage).toHaveBeenCalledWith(
|
||||
'tg:100200300',
|
||||
expect.objectContaining({ content: '[Sticker 😂]' }),
|
||||
);
|
||||
});
|
||||
|
||||
it('stores location with placeholder', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
const ctx = createMediaCtx({});
|
||||
await triggerMediaMessage('message:location', ctx);
|
||||
|
||||
expect(opts.onMessage).toHaveBeenCalledWith(
|
||||
'tg:100200300',
|
||||
expect.objectContaining({ content: '[Location]' }),
|
||||
);
|
||||
});
|
||||
|
||||
it('stores contact with placeholder', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
const ctx = createMediaCtx({});
|
||||
await triggerMediaMessage('message:contact', ctx);
|
||||
|
||||
expect(opts.onMessage).toHaveBeenCalledWith(
|
||||
'tg:100200300',
|
||||
expect.objectContaining({ content: '[Contact]' }),
|
||||
);
|
||||
});
|
||||
|
||||
it('ignores non-text messages from unregistered chats', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
const ctx = createMediaCtx({ chatId: 999999 });
|
||||
await triggerMediaMessage('message:photo', ctx);
|
||||
|
||||
expect(opts.onMessage).not.toHaveBeenCalled();
|
||||
});
|
||||
});
|
||||
|
||||
// --- sendMessage ---
|
||||
|
||||
describe('sendMessage', () => {
|
||||
it('sends message via bot API', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
await channel.sendMessage('tg:100200300', 'Hello');
|
||||
|
||||
expect(currentBot().api.sendMessage).toHaveBeenCalledWith(
|
||||
'100200300',
|
||||
'Hello',
|
||||
);
|
||||
});
|
||||
|
||||
it('strips tg: prefix from JID', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
await channel.sendMessage('tg:-1001234567890', 'Group message');
|
||||
|
||||
expect(currentBot().api.sendMessage).toHaveBeenCalledWith(
|
||||
'-1001234567890',
|
||||
'Group message',
|
||||
);
|
||||
});
|
||||
|
||||
it('splits messages exceeding 4096 characters', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
const longText = 'x'.repeat(5000);
|
||||
await channel.sendMessage('tg:100200300', longText);
|
||||
|
||||
expect(currentBot().api.sendMessage).toHaveBeenCalledTimes(2);
|
||||
expect(currentBot().api.sendMessage).toHaveBeenNthCalledWith(
|
||||
1,
|
||||
'100200300',
|
||||
'x'.repeat(4096),
|
||||
);
|
||||
expect(currentBot().api.sendMessage).toHaveBeenNthCalledWith(
|
||||
2,
|
||||
'100200300',
|
||||
'x'.repeat(904),
|
||||
);
|
||||
});
|
||||
|
||||
it('sends exactly one message at 4096 characters', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
const exactText = 'y'.repeat(4096);
|
||||
await channel.sendMessage('tg:100200300', exactText);
|
||||
|
||||
expect(currentBot().api.sendMessage).toHaveBeenCalledTimes(1);
|
||||
});
|
||||
|
||||
it('handles send failure gracefully', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
currentBot().api.sendMessage.mockRejectedValueOnce(
|
||||
new Error('Network error'),
|
||||
);
|
||||
|
||||
// Should not throw
|
||||
await expect(
|
||||
channel.sendMessage('tg:100200300', 'Will fail'),
|
||||
).resolves.toBeUndefined();
|
||||
});
|
||||
|
||||
it('does nothing when bot is not initialized', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
|
||||
// Don't connect — bot is null
|
||||
await channel.sendMessage('tg:100200300', 'No bot');
|
||||
|
||||
// No error, no API call
|
||||
});
|
||||
});
|
||||
|
||||
// --- ownsJid ---
|
||||
|
||||
describe('ownsJid', () => {
|
||||
it('owns tg: JIDs', () => {
|
||||
const channel = new TelegramChannel('test-token', createTestOpts());
|
||||
expect(channel.ownsJid('tg:123456')).toBe(true);
|
||||
});
|
||||
|
||||
it('owns tg: JIDs with negative IDs (groups)', () => {
|
||||
const channel = new TelegramChannel('test-token', createTestOpts());
|
||||
expect(channel.ownsJid('tg:-1001234567890')).toBe(true);
|
||||
});
|
||||
|
||||
it('does not own WhatsApp group JIDs', () => {
|
||||
const channel = new TelegramChannel('test-token', createTestOpts());
|
||||
expect(channel.ownsJid('12345@g.us')).toBe(false);
|
||||
});
|
||||
|
||||
it('does not own WhatsApp DM JIDs', () => {
|
||||
const channel = new TelegramChannel('test-token', createTestOpts());
|
||||
expect(channel.ownsJid('12345@s.whatsapp.net')).toBe(false);
|
||||
});
|
||||
|
||||
it('does not own unknown JID formats', () => {
|
||||
const channel = new TelegramChannel('test-token', createTestOpts());
|
||||
expect(channel.ownsJid('random-string')).toBe(false);
|
||||
});
|
||||
});
|
||||
|
||||
// --- setTyping ---
|
||||
|
||||
describe('setTyping', () => {
|
||||
it('sends typing action when isTyping is true', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
await channel.setTyping('tg:100200300', true);
|
||||
|
||||
expect(currentBot().api.sendChatAction).toHaveBeenCalledWith(
|
||||
'100200300',
|
||||
'typing',
|
||||
);
|
||||
});
|
||||
|
||||
it('does nothing when isTyping is false', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
await channel.setTyping('tg:100200300', false);
|
||||
|
||||
expect(currentBot().api.sendChatAction).not.toHaveBeenCalled();
|
||||
});
|
||||
|
||||
it('does nothing when bot is not initialized', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
|
||||
// Don't connect
|
||||
await channel.setTyping('tg:100200300', true);
|
||||
|
||||
// No error, no API call
|
||||
});
|
||||
|
||||
it('handles typing indicator failure gracefully', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
currentBot().api.sendChatAction.mockRejectedValueOnce(
|
||||
new Error('Rate limited'),
|
||||
);
|
||||
|
||||
await expect(
|
||||
channel.setTyping('tg:100200300', true),
|
||||
).resolves.toBeUndefined();
|
||||
});
|
||||
});
|
||||
|
||||
// --- Bot commands ---
|
||||
|
||||
describe('bot commands', () => {
|
||||
it('/chatid replies with chat ID and metadata', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
const handler = currentBot().commandHandlers.get('chatid')!;
|
||||
const ctx = {
|
||||
chat: { id: 100200300, type: 'group' as const },
|
||||
from: { first_name: 'Alice' },
|
||||
reply: vi.fn(),
|
||||
};
|
||||
|
||||
await handler(ctx);
|
||||
|
||||
expect(ctx.reply).toHaveBeenCalledWith(
|
||||
expect.stringContaining('tg:100200300'),
|
||||
expect.objectContaining({ parse_mode: 'Markdown' }),
|
||||
);
|
||||
});
|
||||
|
||||
it('/chatid shows chat type', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
const handler = currentBot().commandHandlers.get('chatid')!;
|
||||
const ctx = {
|
||||
chat: { id: 555, type: 'private' as const },
|
||||
from: { first_name: 'Bob' },
|
||||
reply: vi.fn(),
|
||||
};
|
||||
|
||||
await handler(ctx);
|
||||
|
||||
expect(ctx.reply).toHaveBeenCalledWith(
|
||||
expect.stringContaining('private'),
|
||||
expect.any(Object),
|
||||
);
|
||||
});
|
||||
|
||||
it('/ping replies with bot status', async () => {
|
||||
const opts = createTestOpts();
|
||||
const channel = new TelegramChannel('test-token', opts);
|
||||
await channel.connect();
|
||||
|
||||
const handler = currentBot().commandHandlers.get('ping')!;
|
||||
const ctx = { reply: vi.fn() };
|
||||
|
||||
await handler(ctx);
|
||||
|
||||
expect(ctx.reply).toHaveBeenCalledWith('Andy is online.');
|
||||
});
|
||||
});
|
||||
|
||||
// --- Channel properties ---
|
||||
|
||||
describe('channel properties', () => {
|
||||
it('has name "telegram"', () => {
|
||||
const channel = new TelegramChannel('test-token', createTestOpts());
|
||||
expect(channel.name).toBe('telegram');
|
||||
});
|
||||
});
|
||||
});
|
||||
.claude/skills/add-telegram/add/src/channels/telegram.ts (new file, 242 lines)
@@ -0,0 +1,242 @@
import { Bot } from 'grammy';

import { ASSISTANT_NAME, TRIGGER_PATTERN } from '../config.js';
import { logger } from '../logger.js';
import {
  Channel,
  OnChatMetadata,
  OnInboundMessage,
  RegisteredGroup,
} from '../types.js';

export interface TelegramChannelOpts {
  onMessage: OnInboundMessage;
  onChatMetadata: OnChatMetadata;
  registeredGroups: () => Record<string, RegisteredGroup>;
}

export class TelegramChannel implements Channel {
  name = 'telegram';

  private bot: Bot | null = null;
  private opts: TelegramChannelOpts;
  private botToken: string;

  constructor(botToken: string, opts: TelegramChannelOpts) {
    this.botToken = botToken;
    this.opts = opts;
  }

  async connect(): Promise<void> {
    this.bot = new Bot(this.botToken);

    // Command to get chat ID (useful for registration)
    this.bot.command('chatid', (ctx) => {
      const chatId = ctx.chat.id;
      const chatType = ctx.chat.type;
      const chatName =
        chatType === 'private'
          ? ctx.from?.first_name || 'Private'
          : (ctx.chat as any).title || 'Unknown';

      ctx.reply(
        `Chat ID: \`tg:${chatId}\`\nName: ${chatName}\nType: ${chatType}`,
        { parse_mode: 'Markdown' },
      );
    });

    // Command to check bot status
    this.bot.command('ping', (ctx) => {
      ctx.reply(`${ASSISTANT_NAME} is online.`);
    });

    this.bot.on('message:text', async (ctx) => {
      // Skip commands
      if (ctx.message.text.startsWith('/')) return;

      const chatJid = `tg:${ctx.chat.id}`;
      let content = ctx.message.text;
      const timestamp = new Date(ctx.message.date * 1000).toISOString();
      const senderName =
        ctx.from?.first_name ||
        ctx.from?.username ||
        ctx.from?.id.toString() ||
        'Unknown';
      const sender = ctx.from?.id.toString() || '';
      const msgId = ctx.message.message_id.toString();

      // Determine chat name
      const chatName =
        ctx.chat.type === 'private'
          ? senderName
          : (ctx.chat as any).title || chatJid;

      // Translate Telegram @bot_username mentions into TRIGGER_PATTERN format.
      // Telegram @mentions (e.g., @andy_ai_bot) won't match TRIGGER_PATTERN
      // (e.g., ^@Andy\b), so we prepend the trigger when the bot is @mentioned.
      const botUsername = ctx.me?.username?.toLowerCase();
      if (botUsername) {
        const entities = ctx.message.entities || [];
        const isBotMentioned = entities.some((entity) => {
          if (entity.type === 'mention') {
            const mentionText = content
              .substring(entity.offset, entity.offset + entity.length)
              .toLowerCase();
            return mentionText === `@${botUsername}`;
          }
          return false;
        });
        if (isBotMentioned && !TRIGGER_PATTERN.test(content)) {
          content = `@${ASSISTANT_NAME} ${content}`;
        }
      }

      // Store chat metadata for discovery
      this.opts.onChatMetadata(chatJid, timestamp, chatName);

      // Only deliver full message for registered groups
      const group = this.opts.registeredGroups()[chatJid];
      if (!group) {
        logger.debug(
          { chatJid, chatName },
          'Message from unregistered Telegram chat',
        );
        return;
      }

      // Deliver message — startMessageLoop() will pick it up
      this.opts.onMessage(chatJid, {
        id: msgId,
        chat_jid: chatJid,
        sender,
        sender_name: senderName,
        content,
        timestamp,
        is_from_me: false,
      });

      logger.info(
        { chatJid, chatName, sender: senderName },
        'Telegram message stored',
      );
    });

    // Handle non-text messages with placeholders so the agent knows something was sent
    const storeNonText = (ctx: any, placeholder: string) => {
      const chatJid = `tg:${ctx.chat.id}`;
      const group = this.opts.registeredGroups()[chatJid];
      if (!group) return;

      const timestamp = new Date(ctx.message.date * 1000).toISOString();
      const senderName =
        ctx.from?.first_name ||
        ctx.from?.username ||
        ctx.from?.id?.toString() ||
        'Unknown';
      const caption = ctx.message.caption ? ` ${ctx.message.caption}` : '';

      this.opts.onChatMetadata(chatJid, timestamp);
      this.opts.onMessage(chatJid, {
        id: ctx.message.message_id.toString(),
        chat_jid: chatJid,
        sender: ctx.from?.id?.toString() || '',
        sender_name: senderName,
        content: `${placeholder}${caption}`,
        timestamp,
        is_from_me: false,
      });
    };

    this.bot.on('message:photo', (ctx) => storeNonText(ctx, '[Photo]'));
    this.bot.on('message:video', (ctx) => storeNonText(ctx, '[Video]'));
    this.bot.on('message:voice', (ctx) =>
      storeNonText(ctx, '[Voice message]'),
    );
    this.bot.on('message:audio', (ctx) => storeNonText(ctx, '[Audio]'));
    this.bot.on('message:document', (ctx) => {
      const name = ctx.message.document?.file_name || 'file';
      storeNonText(ctx, `[Document: ${name}]`);
    });
    this.bot.on('message:sticker', (ctx) => {
      const emoji = ctx.message.sticker?.emoji || '';
      storeNonText(ctx, `[Sticker ${emoji}]`);
    });
    this.bot.on('message:location', (ctx) => storeNonText(ctx, '[Location]'));
    this.bot.on('message:contact', (ctx) => storeNonText(ctx, '[Contact]'));

    // Handle errors gracefully
    this.bot.catch((err) => {
      logger.error({ err: err.message }, 'Telegram bot error');
    });

    // Start polling — returns a Promise that resolves when started
    return new Promise<void>((resolve) => {
      this.bot!.start({
        onStart: (botInfo) => {
          logger.info(
            { username: botInfo.username, id: botInfo.id },
            'Telegram bot connected',
          );
          console.log(`\n Telegram bot: @${botInfo.username}`);
          console.log(
            ` Send /chatid to the bot to get a chat's registration ID\n`,
          );
          resolve();
        },
      });
    });
  }

  async sendMessage(jid: string, text: string): Promise<void> {
    if (!this.bot) {
      logger.warn('Telegram bot not initialized');
      return;
    }

    try {
      const numericId = jid.replace(/^tg:/, '');

      // Telegram has a 4096 character limit per message — split if needed
      const MAX_LENGTH = 4096;
      if (text.length <= MAX_LENGTH) {
        await this.bot.api.sendMessage(numericId, text);
      } else {
        for (let i = 0; i < text.length; i += MAX_LENGTH) {
          await this.bot.api.sendMessage(
            numericId,
            text.slice(i, i + MAX_LENGTH),
          );
        }
      }
      logger.info({ jid, length: text.length }, 'Telegram message sent');
    } catch (err) {
      logger.error({ jid, err }, 'Failed to send Telegram message');
    }
  }

  isConnected(): boolean {
    return this.bot !== null;
  }

  ownsJid(jid: string): boolean {
    return jid.startsWith('tg:');
  }

  async disconnect(): Promise<void> {
    if (this.bot) {
      this.bot.stop();
      this.bot = null;
      logger.info('Telegram bot stopped');
    }
  }

  async setTyping(jid: string, isTyping: boolean): Promise<void> {
    if (!this.bot || !isTyping) return;
    try {
      const numericId = jid.replace(/^tg:/, '');
      await this.bot.api.sendChatAction(numericId, 'typing');
    } catch (err) {
      logger.debug({ jid, err }, 'Failed to send Telegram typing indicator');
    }
  }
}
.claude/skills/add-telegram/manifest.yaml (new file, 20 lines)
@@ -0,0 +1,20 @@
skill: telegram
version: 1.0.0
description: "Telegram Bot API integration via Grammy"
core_version: 0.1.0
adds:
  - src/channels/telegram.ts
  - src/channels/telegram.test.ts
modifies:
  - src/index.ts
  - src/config.ts
  - src/routing.test.ts
structured:
  npm_dependencies:
    grammy: "^1.39.3"
  env_additions:
    - TELEGRAM_BOT_TOKEN
    - TELEGRAM_ONLY
conflicts: []
depends: []
test: "npx vitest run src/channels/telegram.test.ts"
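The `structured` block declares merges the skills engine performs outside the three-way text merge (the PR description mentions npm deps, env vars, and docker-compose merging). A minimal sketch of how `npm_dependencies` could be folded into a `package.json`; `mergeNpmDependencies` and the core-pin-wins policy are illustrative assumptions, not the engine's actual API:

```typescript
// Hypothetical sketch of structured npm-dependency merging.
// Assumption: an existing core pin takes precedence; a skill only
// contributes dependencies the core does not already declare.
type PackageJson = { dependencies?: Record<string, string> };

function mergeNpmDependencies(
  pkg: PackageJson,
  skillDeps: Record<string, string>,
): PackageJson {
  const merged = { ...(pkg.dependencies ?? {}) };
  for (const [name, range] of Object.entries(skillDeps)) {
    // Keep the core's version range if the package is already pinned.
    if (!(name in merged)) merged[name] = range;
  }
  return { ...pkg, dependencies: merged };
}

const result = mergeNpmDependencies(
  { dependencies: { grammy: '^1.0.0' } },
  { grammy: '^1.39.3', 'new-lib': '^2.0.0' },
);
console.log(result.dependencies);
```

A real engine would also need a conflict policy for incompatible ranges (e.g. fail the apply rather than silently pick one); this sketch only covers the additive case.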
.claude/skills/add-telegram/modify/src/config.ts (new file, 76 lines)
@@ -0,0 +1,76 @@
import path from 'path';

import { readEnvFile } from './env.js';

// Read config values from .env (falls back to process.env).
// Secrets are NOT read here — they stay on disk and are loaded only
// where needed (container-runner.ts) to avoid leaking to child processes.
const envConfig = readEnvFile([
  'ASSISTANT_NAME',
  'ASSISTANT_HAS_OWN_NUMBER',
  'TELEGRAM_BOT_TOKEN',
  'TELEGRAM_ONLY',
]);

export const ASSISTANT_NAME =
  process.env.ASSISTANT_NAME || envConfig.ASSISTANT_NAME || 'Andy';
export const ASSISTANT_HAS_OWN_NUMBER =
  (process.env.ASSISTANT_HAS_OWN_NUMBER || envConfig.ASSISTANT_HAS_OWN_NUMBER) === 'true';
export const POLL_INTERVAL = 2000;
export const SCHEDULER_POLL_INTERVAL = 60000;

// Absolute paths needed for container mounts
const PROJECT_ROOT = process.cwd();
const HOME_DIR = process.env.HOME || '/Users/user';

// Mount security: allowlist stored OUTSIDE project root, never mounted into containers
export const MOUNT_ALLOWLIST_PATH = path.join(
  HOME_DIR,
  '.config',
  'nanoclaw',
  'mount-allowlist.json',
);
export const STORE_DIR = path.resolve(PROJECT_ROOT, 'store');
export const GROUPS_DIR = path.resolve(PROJECT_ROOT, 'groups');
export const DATA_DIR = path.resolve(PROJECT_ROOT, 'data');
export const MAIN_GROUP_FOLDER = 'main';

export const CONTAINER_IMAGE =
  process.env.CONTAINER_IMAGE || 'nanoclaw-agent:latest';
export const CONTAINER_TIMEOUT = parseInt(
  process.env.CONTAINER_TIMEOUT || '1800000',
  10,
);
export const CONTAINER_MAX_OUTPUT_SIZE = parseInt(
  process.env.CONTAINER_MAX_OUTPUT_SIZE || '10485760',
  10,
); // 10MB default
export const IPC_POLL_INTERVAL = 1000;
export const IDLE_TIMEOUT = parseInt(
  process.env.IDLE_TIMEOUT || '1800000',
  10,
); // 30min default — how long to keep container alive after last result
export const MAX_CONCURRENT_CONTAINERS = Math.max(
  1,
  parseInt(process.env.MAX_CONCURRENT_CONTAINERS || '5', 10) || 5,
);

function escapeRegex(str: string): string {
  return str.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

export const TRIGGER_PATTERN = new RegExp(
  `^@${escapeRegex(ASSISTANT_NAME)}\\b`,
  'i',
);

// Timezone for scheduled tasks (cron expressions, etc.)
// Uses system timezone by default
export const TIMEZONE =
  process.env.TZ || Intl.DateTimeFormat().resolvedOptions().timeZone;

// Telegram configuration
export const TELEGRAM_BOT_TOKEN =
  process.env.TELEGRAM_BOT_TOKEN || envConfig.TELEGRAM_BOT_TOKEN || '';
export const TELEGRAM_ONLY =
  (process.env.TELEGRAM_ONLY || envConfig.TELEGRAM_ONLY) === 'true';
.claude/skills/add-telegram/modify/src/config.ts.intent.md (new file, 21 lines)
@@ -0,0 +1,21 @@
# Intent: src/config.ts modifications

## What changed
Added two new configuration exports for Telegram channel support.

## Key sections
- **readEnvFile call**: Must include `TELEGRAM_BOT_TOKEN` and `TELEGRAM_ONLY` in the keys array. NanoClaw does NOT load `.env` into `process.env` — all `.env` values must be explicitly requested via `readEnvFile()`.
- **TELEGRAM_BOT_TOKEN**: Read from `process.env` first, then `envConfig` fallback; defaults to empty string (channel disabled when empty)
- **TELEGRAM_ONLY**: Boolean flag from `process.env` or `envConfig`; when `true`, disables WhatsApp channel creation

## Invariants
- All existing config exports remain unchanged
- New Telegram keys are added to the `readEnvFile` call alongside existing keys
- New exports are appended at the end of the file
- No existing behavior is modified — Telegram config is additive only
- Both `process.env` and `envConfig` are checked (same pattern as `ASSISTANT_NAME`)

## Must-keep
- All existing exports (`ASSISTANT_NAME`, `POLL_INTERVAL`, `TRIGGER_PATTERN`, etc.)
- The `readEnvFile` pattern — ALL config read from `.env` must go through this function
- The `escapeRegex` helper and `TRIGGER_PATTERN` construction
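The intent doc's central invariant is that every `.env` key must be requested explicitly, so unrequested secrets never reach `process.env` or child processes. A minimal stand-in illustrating that contract; this is a simplified sketch taking the file contents as a parameter, not the real `src/env.js` implementation (which reads `.env` from disk):

```typescript
// Hypothetical simplification of the readEnvFile contract:
// only explicitly requested keys are returned; everything else
// in the .env file stays out of the result entirely.
function readEnvFile(keys: string[], envText: string): Record<string, string> {
  const wanted = new Set(keys);
  const out: Record<string, string> = {};
  for (const line of envText.split('\n')) {
    const m = line.match(/^([A-Z0-9_]+)=(.*)$/);
    if (m && wanted.has(m[1])) out[m[1]] = m[2];
  }
  return out;
}

const env = readEnvFile(
  ['TELEGRAM_BOT_TOKEN'],
  'TELEGRAM_BOT_TOKEN=abc123\nSECRET_KEY=donotleak',
);
console.log(env);
```

This is why the skill must edit the `readEnvFile([...])` keys array rather than just adding new exports: an unrequested `TELEGRAM_BOT_TOKEN` would simply come back undefined.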
.claude/skills/add-telegram/modify/src/index.ts (new file, 537 lines)
@@ -0,0 +1,537 @@
|
||||
import { execSync } from 'child_process';
|
||||
import fs from 'fs';
|
||||
import path from 'path';
|
||||
|
||||
import {
|
||||
ASSISTANT_NAME,
|
||||
DATA_DIR,
|
||||
IDLE_TIMEOUT,
|
||||
MAIN_GROUP_FOLDER,
|
||||
POLL_INTERVAL,
|
||||
TELEGRAM_BOT_TOKEN,
|
||||
TELEGRAM_ONLY,
|
||||
TRIGGER_PATTERN,
|
||||
} from './config.js';
|
||||
import { WhatsAppChannel } from './channels/whatsapp.js';
|
||||
import { TelegramChannel } from './channels/telegram.js';
|
||||
import {
|
||||
ContainerOutput,
|
||||
runContainerAgent,
|
||||
writeGroupsSnapshot,
|
||||
writeTasksSnapshot,
|
||||
} from './container-runner.js';
|
||||
import {
|
||||
getAllChats,
|
||||
getAllRegisteredGroups,
|
||||
getAllSessions,
|
||||
getAllTasks,
|
||||
getMessagesSince,
|
||||
getNewMessages,
|
||||
getRouterState,
|
||||
initDatabase,
|
||||
setRegisteredGroup,
|
||||
setRouterState,
|
||||
setSession,
|
||||
storeChatMetadata,
|
||||
storeMessage,
|
||||
} from './db.js';
|
||||
import { GroupQueue } from './group-queue.js';
|
||||
import { startIpcWatcher } from './ipc.js';
|
||||
import { findChannel, formatMessages, formatOutbound } from './router.js';
|
||||
import { startSchedulerLoop } from './task-scheduler.js';
|
||||
import { Channel, NewMessage, RegisteredGroup } from './types.js';
|
||||
import { logger } from './logger.js';
|
||||
|
||||
// Re-export for backwards compatibility during refactor
|
||||
export { escapeXml, formatMessages } from './router.js';
|
||||
|
||||
let lastTimestamp = '';
|
||||
let sessions: Record<string, string> = {};
|
||||
let registeredGroups: Record<string, RegisteredGroup> = {};
|
||||
let lastAgentTimestamp: Record<string, string> = {};
|
||||
let messageLoopRunning = false;
|
||||
|
||||
let whatsapp: WhatsAppChannel;
|
||||
const channels: Channel[] = [];
|
||||
const queue = new GroupQueue();
|
||||
|
||||
function loadState(): void {
|
||||
lastTimestamp = getRouterState('last_timestamp') || '';
|
||||
const agentTs = getRouterState('last_agent_timestamp');
|
||||
try {
|
||||
lastAgentTimestamp = agentTs ? JSON.parse(agentTs) : {};
|
||||
} catch {
|
||||
logger.warn('Corrupted last_agent_timestamp in DB, resetting');
|
||||
lastAgentTimestamp = {};
|
||||
}
|
||||
sessions = getAllSessions();
|
||||
registeredGroups = getAllRegisteredGroups();
|
||||
logger.info(
|
||||
{ groupCount: Object.keys(registeredGroups).length },
|
||||
'State loaded',
|
||||
);
|
||||
}
|
||||
|
||||
function saveState(): void {
|
||||
setRouterState('last_timestamp', lastTimestamp);
|
||||
setRouterState(
|
||||
'last_agent_timestamp',
|
||||
JSON.stringify(lastAgentTimestamp),
|
||||
);
|
||||
}
|
||||
|
||||
function registerGroup(jid: string, group: RegisteredGroup): void {
|
||||
registeredGroups[jid] = group;
|
||||
setRegisteredGroup(jid, group);
|
||||
|
||||
// Create group folder
|
||||
const groupDir = path.join(DATA_DIR, '..', 'groups', group.folder);
|
||||
fs.mkdirSync(path.join(groupDir, 'logs'), { recursive: true });
|
||||
|
||||
logger.info(
|
||||
{ jid, name: group.name, folder: group.folder },
|
||||
'Group registered',
|
||||
);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get available groups list for the agent.
|
||||
* Returns groups ordered by most recent activity.
|
||||
*/
|
||||
export function getAvailableGroups(): import('./container-runner.js').AvailableGroup[] {
|
||||
const chats = getAllChats();
|
||||
const registeredJids = new Set(Object.keys(registeredGroups));
|
||||
|
||||
return chats
|
||||
.filter((c) => c.jid !== '__group_sync__' && c.is_group)
|
||||
.map((c) => ({
|
||||
jid: c.jid,
|
||||
name: c.name,
|
||||
lastActivity: c.last_message_time,
|
||||
isRegistered: registeredJids.has(c.jid),
|
||||
}));
|
||||
}
|
||||
|
||||
/** @internal - exported for testing */
|
||||
export function _setRegisteredGroups(groups: Record<string, RegisteredGroup>): void {
|
||||
registeredGroups = groups;
|
||||
}
|
||||
|
||||
/**
|
||||
* Process all pending messages for a group.
|
||||
* Called by the GroupQueue when it's this group's turn.
|
||||
*/
|
||||
async function processGroupMessages(chatJid: string): Promise<boolean> {
|
||||
const group = registeredGroups[chatJid];
|
||||
if (!group) return true;
|
||||
|
||||
const channel = findChannel(channels, chatJid);
|
||||
if (!channel) return true;
|
||||
|
||||
  const isMainGroup = group.folder === MAIN_GROUP_FOLDER;

  const sinceTimestamp = lastAgentTimestamp[chatJid] || '';
  const missedMessages = getMessagesSince(chatJid, sinceTimestamp, ASSISTANT_NAME);

  if (missedMessages.length === 0) return true;

  // For non-main groups, check if trigger is required and present
  if (!isMainGroup && group.requiresTrigger !== false) {
    const hasTrigger = missedMessages.some((m) =>
      TRIGGER_PATTERN.test(m.content.trim()),
    );
    if (!hasTrigger) return true;
  }

  const prompt = formatMessages(missedMessages);

  // Advance cursor so the piping path in startMessageLoop won't re-fetch
  // these messages. Save the old cursor so we can roll back on error.
  const previousCursor = lastAgentTimestamp[chatJid] || '';
  lastAgentTimestamp[chatJid] =
    missedMessages[missedMessages.length - 1].timestamp;
  saveState();

  logger.info(
    { group: group.name, messageCount: missedMessages.length },
    'Processing messages',
  );

  // Track idle timer for closing stdin when agent is idle
  let idleTimer: ReturnType<typeof setTimeout> | null = null;

  const resetIdleTimer = () => {
    if (idleTimer) clearTimeout(idleTimer);
    idleTimer = setTimeout(() => {
      logger.debug({ group: group.name }, 'Idle timeout, closing container stdin');
      queue.closeStdin(chatJid);
    }, IDLE_TIMEOUT);
  };

  await channel.setTyping?.(chatJid, true);
  let hadError = false;
  let outputSentToUser = false;

  const output = await runAgent(group, prompt, chatJid, async (result) => {
    // Streaming output callback — called for each agent result
    if (result.result) {
      const raw = typeof result.result === 'string' ? result.result : JSON.stringify(result.result);
      // Strip <internal>...</internal> blocks — agent uses these for internal reasoning
      const text = raw.replace(/<internal>[\s\S]*?<\/internal>/g, '').trim();
      logger.info({ group: group.name }, `Agent output: ${raw.slice(0, 200)}`);
      if (text) {
        await channel.sendMessage(chatJid, text);
        outputSentToUser = true;
      }
      // Only reset idle timer on actual results, not session-update markers (result: null)
      resetIdleTimer();
    }

    if (result.status === 'error') {
      hadError = true;
    }
  });

  await channel.setTyping?.(chatJid, false);
  if (idleTimer) clearTimeout(idleTimer);

  if (output === 'error' || hadError) {
    // If we already sent output to the user, don't roll back the cursor —
    // the user got their response and re-processing would send duplicates.
    if (outputSentToUser) {
      logger.warn({ group: group.name }, 'Agent error after output was sent, skipping cursor rollback to prevent duplicates');
      return true;
    }
    // Roll back cursor so retries can re-process these messages
    lastAgentTimestamp[chatJid] = previousCursor;
    saveState();
    logger.warn({ group: group.name }, 'Agent error, rolled back message cursor for retry');
    return false;
  }

  return true;
}

async function runAgent(
  group: RegisteredGroup,
  prompt: string,
  chatJid: string,
  onOutput?: (output: ContainerOutput) => Promise<void>,
): Promise<'success' | 'error'> {
  const isMain = group.folder === MAIN_GROUP_FOLDER;
  const sessionId = sessions[group.folder];

  // Update tasks snapshot for container to read (filtered by group)
  const tasks = getAllTasks();
  writeTasksSnapshot(
    group.folder,
    isMain,
    tasks.map((t) => ({
      id: t.id,
      groupFolder: t.group_folder,
      prompt: t.prompt,
      schedule_type: t.schedule_type,
      schedule_value: t.schedule_value,
      status: t.status,
      next_run: t.next_run,
    })),
  );

  // Update available groups snapshot (main group only can see all groups)
  const availableGroups = getAvailableGroups();
  writeGroupsSnapshot(
    group.folder,
    isMain,
    availableGroups,
    new Set(Object.keys(registeredGroups)),
  );

  // Wrap onOutput to track session ID from streamed results
  const wrappedOnOutput = onOutput
    ? async (output: ContainerOutput) => {
        if (output.newSessionId) {
          sessions[group.folder] = output.newSessionId;
          setSession(group.folder, output.newSessionId);
        }
        await onOutput(output);
      }
    : undefined;

  try {
    const output = await runContainerAgent(
      group,
      {
        prompt,
        sessionId,
        groupFolder: group.folder,
        chatJid,
        isMain,
      },
      (proc, containerName) => queue.registerProcess(chatJid, proc, containerName, group.folder),
      wrappedOnOutput,
    );

    if (output.newSessionId) {
      sessions[group.folder] = output.newSessionId;
      setSession(group.folder, output.newSessionId);
    }

    if (output.status === 'error') {
      logger.error(
        { group: group.name, error: output.error },
        'Container agent error',
      );
      return 'error';
    }

    return 'success';
  } catch (err) {
    logger.error({ group: group.name, err }, 'Agent error');
    return 'error';
  }
}

async function startMessageLoop(): Promise<void> {
  if (messageLoopRunning) {
    logger.debug('Message loop already running, skipping duplicate start');
    return;
  }
  messageLoopRunning = true;

  logger.info(`NanoClaw running (trigger: @${ASSISTANT_NAME})`);

  while (true) {
    try {
      const jids = Object.keys(registeredGroups);
      const { messages, newTimestamp } = getNewMessages(jids, lastTimestamp, ASSISTANT_NAME);

      if (messages.length > 0) {
        logger.info({ count: messages.length }, 'New messages');

        // Advance the "seen" cursor for all messages immediately
        lastTimestamp = newTimestamp;
        saveState();

        // Group messages by chat JID
        const messagesByGroup = new Map<string, NewMessage[]>();
        for (const msg of messages) {
          const existing = messagesByGroup.get(msg.chat_jid);
          if (existing) {
            existing.push(msg);
          } else {
            messagesByGroup.set(msg.chat_jid, [msg]);
          }
        }

        for (const [chatJid, groupMessages] of messagesByGroup) {
          const group = registeredGroups[chatJid];
          if (!group) continue;

          const channel = findChannel(channels, chatJid);
          if (!channel) continue;

          const isMainGroup = group.folder === MAIN_GROUP_FOLDER;
          const needsTrigger = !isMainGroup && group.requiresTrigger !== false;

          // For non-main groups, only act on trigger messages.
          // Non-trigger messages accumulate in DB and get pulled as
          // context when a trigger eventually arrives.
          if (needsTrigger) {
            const hasTrigger = groupMessages.some((m) =>
              TRIGGER_PATTERN.test(m.content.trim()),
            );
            if (!hasTrigger) continue;
          }

          // Pull all messages since lastAgentTimestamp so non-trigger
          // context that accumulated between triggers is included.
          const allPending = getMessagesSince(
            chatJid,
            lastAgentTimestamp[chatJid] || '',
            ASSISTANT_NAME,
          );
          const messagesToSend =
            allPending.length > 0 ? allPending : groupMessages;
          const formatted = formatMessages(messagesToSend);

          if (queue.sendMessage(chatJid, formatted)) {
            logger.debug(
              { chatJid, count: messagesToSend.length },
              'Piped messages to active container',
            );
            lastAgentTimestamp[chatJid] =
              messagesToSend[messagesToSend.length - 1].timestamp;
            saveState();
            // Show typing indicator while the container processes the piped message
            channel.setTyping?.(chatJid, true);
          } else {
            // No active container — enqueue for a new one
            queue.enqueueMessageCheck(chatJid);
          }
        }
      }
    } catch (err) {
      logger.error({ err }, 'Error in message loop');
    }
    await new Promise((resolve) => setTimeout(resolve, POLL_INTERVAL));
  }
}

/**
 * Startup recovery: check for unprocessed messages in registered groups.
 * Handles crash between advancing lastTimestamp and processing messages.
 */
function recoverPendingMessages(): void {
  for (const [chatJid, group] of Object.entries(registeredGroups)) {
    const sinceTimestamp = lastAgentTimestamp[chatJid] || '';
    const pending = getMessagesSince(chatJid, sinceTimestamp, ASSISTANT_NAME);
    if (pending.length > 0) {
      logger.info(
        { group: group.name, pendingCount: pending.length },
        'Recovery: found unprocessed messages',
      );
      queue.enqueueMessageCheck(chatJid);
    }
  }
}

function ensureContainerSystemRunning(): void {
  try {
    execSync('container system status', { stdio: 'pipe' });
    logger.debug('Apple Container system already running');
  } catch {
    logger.info('Starting Apple Container system...');
    try {
      execSync('container system start', { stdio: 'pipe', timeout: 30000 });
      logger.info('Apple Container system started');
    } catch (err) {
      logger.error({ err }, 'Failed to start Apple Container system');
      console.error(
        '\n╔════════════════════════════════════════════════════════════════╗',
      );
      console.error(
        '║  FATAL: Apple Container system failed to start                 ║',
      );
      console.error(
        '║                                                                ║',
      );
      console.error(
        '║  Agents cannot run without Apple Container. To fix:            ║',
      );
      console.error(
        '║  1. Install from: https://github.com/apple/container/releases  ║',
      );
      console.error(
        '║  2. Run: container system start                                ║',
      );
      console.error(
        '║  3. Restart NanoClaw                                           ║',
      );
      console.error(
        '╚════════════════════════════════════════════════════════════════╝\n',
      );
      throw new Error('Apple Container system is required but failed to start');
    }
  }

  // Kill and clean up orphaned NanoClaw containers from previous runs
  try {
    const output = execSync('container ls --format json', {
      stdio: ['pipe', 'pipe', 'pipe'],
      encoding: 'utf-8',
    });
    const containers: { status: string; configuration: { id: string } }[] = JSON.parse(output || '[]');
    const orphans = containers
      .filter((c) => c.status === 'running' && c.configuration.id.startsWith('nanoclaw-'))
      .map((c) => c.configuration.id);
    for (const name of orphans) {
      try {
        execSync(`container stop ${name}`, { stdio: 'pipe' });
      } catch { /* already stopped */ }
    }
    if (orphans.length > 0) {
      logger.info({ count: orphans.length, names: orphans }, 'Stopped orphaned containers');
    }
  } catch (err) {
    logger.warn({ err }, 'Failed to clean up orphaned containers');
  }
}

async function main(): Promise<void> {
  ensureContainerSystemRunning();
  initDatabase();
  logger.info('Database initialized');
  loadState();

  // Graceful shutdown handlers
  const shutdown = async (signal: string) => {
    logger.info({ signal }, 'Shutdown signal received');
    await queue.shutdown(10000);
    for (const ch of channels) await ch.disconnect();
    process.exit(0);
  };
  process.on('SIGTERM', () => shutdown('SIGTERM'));
  process.on('SIGINT', () => shutdown('SIGINT'));

  // Channel callbacks (shared by all channels)
  const channelOpts = {
    onMessage: (_chatJid: string, msg: NewMessage) => storeMessage(msg),
    onChatMetadata: (chatJid: string, timestamp: string, name?: string, channel?: string, isGroup?: boolean) =>
      storeChatMetadata(chatJid, timestamp, name, channel, isGroup),
    registeredGroups: () => registeredGroups,
  };

  // Create and connect channels
  if (!TELEGRAM_ONLY) {
    whatsapp = new WhatsAppChannel(channelOpts);
    channels.push(whatsapp);
    await whatsapp.connect();
  }

  if (TELEGRAM_BOT_TOKEN) {
    const telegram = new TelegramChannel(TELEGRAM_BOT_TOKEN, channelOpts);
    channels.push(telegram);
    await telegram.connect();
  }

  // Start subsystems (independently of connection handler)
  startSchedulerLoop({
    registeredGroups: () => registeredGroups,
    getSessions: () => sessions,
    queue,
    onProcess: (groupJid, proc, containerName, groupFolder) => queue.registerProcess(groupJid, proc, containerName, groupFolder),
    sendMessage: async (jid, rawText) => {
      const channel = findChannel(channels, jid);
      if (!channel) return;
      const text = formatOutbound(rawText);
      if (text) await channel.sendMessage(jid, text);
    },
  });
  startIpcWatcher({
    sendMessage: (jid, text) => {
      const channel = findChannel(channels, jid);
      if (!channel) throw new Error(`No channel for JID: ${jid}`);
      return channel.sendMessage(jid, text);
    },
    registeredGroups: () => registeredGroups,
    registerGroup,
    syncGroupMetadata: (force) => whatsapp?.syncGroupMetadata(force) ?? Promise.resolve(),
    getAvailableGroups,
    writeGroupsSnapshot: (gf, im, ag, rj) => writeGroupsSnapshot(gf, im, ag, rj),
  });
  queue.setProcessMessagesFn(processGroupMessages);
  recoverPendingMessages();
  startMessageLoop();
}

// Guard: only run when executed directly, not when imported by tests
const isDirectRun =
  process.argv[1] &&
  new URL(import.meta.url).pathname === new URL(`file://${process.argv[1]}`).pathname;

if (isDirectRun) {
  main().catch((err) => {
    logger.error({ err }, 'Failed to start NanoClaw');
    process.exit(1);
  });
}

.claude/skills/add-telegram/modify/src/index.ts.intent.md (new file, 50 lines)
@@ -0,0 +1,50 @@
# Intent: src/index.ts modifications

## What changed
Refactored from single WhatsApp channel to multi-channel architecture using the `Channel` interface.

## Key sections

### Imports (top of file)
- Added: `TelegramChannel` from `./channels/telegram.js`
- Added: `TELEGRAM_BOT_TOKEN`, `TELEGRAM_ONLY` from `./config.js`
- Added: `findChannel` from `./router.js`
- Added: `Channel` type from `./types.js`

### Module-level state
- Added: `const channels: Channel[] = []` — array of all active channels
- Kept: `let whatsapp: WhatsAppChannel` — still needed for `syncGroupMetadata` reference

### processGroupMessages()
- Added: `findChannel(channels, chatJid)` lookup at the start
- Changed: `whatsapp.setTyping()` → `channel.setTyping?.()` (optional chaining)
- Changed: `whatsapp.sendMessage()` → `channel.sendMessage()` in output callback

### getAvailableGroups()
- Unchanged: uses `c.is_group` filter from base (Telegram channels pass `isGroup=true` via `onChatMetadata`)

### startMessageLoop()
- Added: `findChannel(channels, chatJid)` lookup per group in message processing
- Changed: `whatsapp.setTyping()` → `channel.setTyping?.()` for typing indicators

### main()
- Changed: shutdown disconnects all channels via `for (const ch of channels)`
- Added: shared `channelOpts` object for channel callbacks
- Added: conditional WhatsApp creation (`if (!TELEGRAM_ONLY)`)
- Added: conditional Telegram creation (`if (TELEGRAM_BOT_TOKEN)`)
- Changed: scheduler `sendMessage` uses `findChannel()` → `channel.sendMessage()`
- Changed: IPC `sendMessage` uses `findChannel()` → `channel.sendMessage()`

## Invariants
- All existing message processing logic (triggers, cursors, idle timers) is preserved
- The `runAgent` function is completely unchanged
- State management (loadState/saveState) is unchanged
- Recovery logic is unchanged
- Apple Container check is unchanged (ensureContainerSystemRunning)

## Must-keep
- The `escapeXml` and `formatMessages` re-exports
- The `_setRegisteredGroups` test helper
- The `isDirectRun` guard at bottom
- All error handling and cursor rollback logic in processGroupMessages
- The outgoing queue flush and reconnection logic (in WhatsAppChannel, not here)
.claude/skills/add-telegram/modify/src/routing.test.ts (new file, 161 lines)
@@ -0,0 +1,161 @@
import { describe, it, expect, beforeEach } from 'vitest';

import { _initTestDatabase, getAllChats, storeChatMetadata } from './db.js';
import { getAvailableGroups, _setRegisteredGroups } from './index.js';

beforeEach(() => {
  _initTestDatabase();
  _setRegisteredGroups({});
});

// --- JID ownership patterns ---

describe('JID ownership patterns', () => {
  // These test the patterns that will become ownsJid() on the Channel interface

  it('WhatsApp group JID: ends with @g.us', () => {
    const jid = '12345678@g.us';
    expect(jid.endsWith('@g.us')).toBe(true);
  });

  it('WhatsApp DM JID: ends with @s.whatsapp.net', () => {
    const jid = '12345678@s.whatsapp.net';
    expect(jid.endsWith('@s.whatsapp.net')).toBe(true);
  });

  it('Telegram JID: starts with tg:', () => {
    const jid = 'tg:123456789';
    expect(jid.startsWith('tg:')).toBe(true);
  });

  it('Telegram group JID: starts with tg: and has negative ID', () => {
    const jid = 'tg:-1001234567890';
    expect(jid.startsWith('tg:')).toBe(true);
  });
});

// --- getAvailableGroups ---

describe('getAvailableGroups', () => {
  it('returns only groups, excludes DMs', () => {
    storeChatMetadata('group1@g.us', '2024-01-01T00:00:01.000Z', 'Group 1', 'whatsapp', true);
    storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM', 'whatsapp', false);
    storeChatMetadata('group2@g.us', '2024-01-01T00:00:03.000Z', 'Group 2', 'whatsapp', true);

    const groups = getAvailableGroups();
    expect(groups).toHaveLength(2);
    expect(groups.map((g) => g.jid)).toContain('group1@g.us');
    expect(groups.map((g) => g.jid)).toContain('group2@g.us');
    expect(groups.map((g) => g.jid)).not.toContain('user@s.whatsapp.net');
  });

  it('excludes __group_sync__ sentinel', () => {
    storeChatMetadata('__group_sync__', '2024-01-01T00:00:00.000Z');
    storeChatMetadata('group@g.us', '2024-01-01T00:00:01.000Z', 'Group', 'whatsapp', true);

    const groups = getAvailableGroups();
    expect(groups).toHaveLength(1);
    expect(groups[0].jid).toBe('group@g.us');
  });

  it('marks registered groups correctly', () => {
    storeChatMetadata('reg@g.us', '2024-01-01T00:00:01.000Z', 'Registered', 'whatsapp', true);
    storeChatMetadata('unreg@g.us', '2024-01-01T00:00:02.000Z', 'Unregistered', 'whatsapp', true);

    _setRegisteredGroups({
      'reg@g.us': {
        name: 'Registered',
        folder: 'registered',
        trigger: '@Andy',
        added_at: '2024-01-01T00:00:00.000Z',
      },
    });

    const groups = getAvailableGroups();
    const reg = groups.find((g) => g.jid === 'reg@g.us');
    const unreg = groups.find((g) => g.jid === 'unreg@g.us');

    expect(reg?.isRegistered).toBe(true);
    expect(unreg?.isRegistered).toBe(false);
  });

  it('returns groups ordered by most recent activity', () => {
    storeChatMetadata('old@g.us', '2024-01-01T00:00:01.000Z', 'Old', 'whatsapp', true);
    storeChatMetadata('new@g.us', '2024-01-01T00:00:05.000Z', 'New', 'whatsapp', true);
    storeChatMetadata('mid@g.us', '2024-01-01T00:00:03.000Z', 'Mid', 'whatsapp', true);

    const groups = getAvailableGroups();
    expect(groups[0].jid).toBe('new@g.us');
    expect(groups[1].jid).toBe('mid@g.us');
    expect(groups[2].jid).toBe('old@g.us');
  });

  it('excludes non-group chats regardless of JID format', () => {
    // Unknown JID format stored without is_group should not appear
    storeChatMetadata('unknown-format-123', '2024-01-01T00:00:01.000Z', 'Unknown');
    // Explicitly non-group with unusual JID
    storeChatMetadata('custom:abc', '2024-01-01T00:00:02.000Z', 'Custom DM', 'custom', false);
    // A real group for contrast
    storeChatMetadata('group@g.us', '2024-01-01T00:00:03.000Z', 'Group', 'whatsapp', true);

    const groups = getAvailableGroups();
    expect(groups).toHaveLength(1);
    expect(groups[0].jid).toBe('group@g.us');
  });

  it('returns empty array when no chats exist', () => {
    const groups = getAvailableGroups();
    expect(groups).toHaveLength(0);
  });

  it('includes Telegram chat JIDs', () => {
    storeChatMetadata('tg:100200300', '2024-01-01T00:00:01.000Z', 'Telegram Chat', 'telegram', true);
    storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM', 'whatsapp', false);

    const groups = getAvailableGroups();
    expect(groups).toHaveLength(1);
    expect(groups[0].jid).toBe('tg:100200300');
  });

  it('returns Telegram group JIDs with negative IDs', () => {
    storeChatMetadata('tg:-1001234567890', '2024-01-01T00:00:01.000Z', 'TG Group', 'telegram', true);

    const groups = getAvailableGroups();
    expect(groups).toHaveLength(1);
    expect(groups[0].jid).toBe('tg:-1001234567890');
    expect(groups[0].name).toBe('TG Group');
  });

  it('marks registered Telegram chats correctly', () => {
    storeChatMetadata('tg:100200300', '2024-01-01T00:00:01.000Z', 'TG Registered', 'telegram', true);
    storeChatMetadata('tg:999999', '2024-01-01T00:00:02.000Z', 'TG Unregistered', 'telegram', true);

    _setRegisteredGroups({
      'tg:100200300': {
        name: 'TG Registered',
        folder: 'tg-registered',
        trigger: '@Andy',
        added_at: '2024-01-01T00:00:00.000Z',
      },
    });

    const groups = getAvailableGroups();
    const tgReg = groups.find((g) => g.jid === 'tg:100200300');
    const tgUnreg = groups.find((g) => g.jid === 'tg:999999');

    expect(tgReg?.isRegistered).toBe(true);
    expect(tgUnreg?.isRegistered).toBe(false);
  });

  it('mixes WhatsApp and Telegram chats ordered by activity', () => {
    storeChatMetadata('wa@g.us', '2024-01-01T00:00:01.000Z', 'WhatsApp', 'whatsapp', true);
    storeChatMetadata('tg:100', '2024-01-01T00:00:03.000Z', 'Telegram', 'telegram', true);
    storeChatMetadata('wa2@g.us', '2024-01-01T00:00:02.000Z', 'WhatsApp 2', 'whatsapp', true);

    const groups = getAvailableGroups();
    expect(groups).toHaveLength(3);
    expect(groups[0].jid).toBe('tg:100');
    expect(groups[1].jid).toBe('wa2@g.us');
    expect(groups[2].jid).toBe('wa@g.us');
  });
});
.claude/skills/add-telegram/tests/telegram.test.ts (new file, 118 lines)
@@ -0,0 +1,118 @@
import { describe, expect, it } from 'vitest';
import fs from 'fs';
import path from 'path';

describe('telegram skill package', () => {
  const skillDir = path.resolve(__dirname, '..');

  it('has a valid manifest', () => {
    const manifestPath = path.join(skillDir, 'manifest.yaml');
    expect(fs.existsSync(manifestPath)).toBe(true);

    const content = fs.readFileSync(manifestPath, 'utf-8');
    expect(content).toContain('skill: telegram');
    expect(content).toContain('version: 1.0.0');
    expect(content).toContain('grammy');
  });

  it('has all files declared in adds', () => {
    const addFile = path.join(skillDir, 'add', 'src', 'channels', 'telegram.ts');
    expect(fs.existsSync(addFile)).toBe(true);

    const content = fs.readFileSync(addFile, 'utf-8');
    expect(content).toContain('class TelegramChannel');
    expect(content).toContain('implements Channel');

    // Test file for the channel
    const testFile = path.join(skillDir, 'add', 'src', 'channels', 'telegram.test.ts');
    expect(fs.existsSync(testFile)).toBe(true);

    const testContent = fs.readFileSync(testFile, 'utf-8');
    expect(testContent).toContain("describe('TelegramChannel'");
  });

  it('has all files declared in modifies', () => {
    const indexFile = path.join(skillDir, 'modify', 'src', 'index.ts');
    const configFile = path.join(skillDir, 'modify', 'src', 'config.ts');
    const routingTestFile = path.join(skillDir, 'modify', 'src', 'routing.test.ts');

    expect(fs.existsSync(indexFile)).toBe(true);
    expect(fs.existsSync(configFile)).toBe(true);
    expect(fs.existsSync(routingTestFile)).toBe(true);

    const indexContent = fs.readFileSync(indexFile, 'utf-8');
    expect(indexContent).toContain('TelegramChannel');
    expect(indexContent).toContain('TELEGRAM_BOT_TOKEN');
    expect(indexContent).toContain('TELEGRAM_ONLY');
    expect(indexContent).toContain('findChannel');
    expect(indexContent).toContain('channels: Channel[]');

    const configContent = fs.readFileSync(configFile, 'utf-8');
    expect(configContent).toContain('TELEGRAM_BOT_TOKEN');
    expect(configContent).toContain('TELEGRAM_ONLY');
  });

  it('has intent files for modified files', () => {
    expect(fs.existsSync(path.join(skillDir, 'modify', 'src', 'index.ts.intent.md'))).toBe(true);
    expect(fs.existsSync(path.join(skillDir, 'modify', 'src', 'config.ts.intent.md'))).toBe(true);
  });

  it('modified index.ts preserves core structure', () => {
    const content = fs.readFileSync(
      path.join(skillDir, 'modify', 'src', 'index.ts'),
      'utf-8',
    );

    // Core functions still present
    expect(content).toContain('function loadState()');
    expect(content).toContain('function saveState()');
    expect(content).toContain('function registerGroup(');
    expect(content).toContain('function getAvailableGroups()');
    expect(content).toContain('function processGroupMessages(');
    expect(content).toContain('function runAgent(');
    expect(content).toContain('function startMessageLoop()');
    expect(content).toContain('function recoverPendingMessages()');
    expect(content).toContain('function ensureContainerSystemRunning()');
    expect(content).toContain('async function main()');

    // Test helper preserved
    expect(content).toContain('_setRegisteredGroups');

    // Direct-run guard preserved
    expect(content).toContain('isDirectRun');
  });

  it('modified index.ts includes Telegram channel creation', () => {
    const content = fs.readFileSync(
      path.join(skillDir, 'modify', 'src', 'index.ts'),
      'utf-8',
    );

    // Multi-channel architecture
    expect(content).toContain('const channels: Channel[] = []');
    expect(content).toContain('channels.push(whatsapp)');
    expect(content).toContain('channels.push(telegram)');

    // Conditional channel creation
    expect(content).toContain('if (!TELEGRAM_ONLY)');
    expect(content).toContain('if (TELEGRAM_BOT_TOKEN)');

    // Shutdown disconnects all channels
    expect(content).toContain('for (const ch of channels) await ch.disconnect()');
  });

  it('modified config.ts preserves all existing exports', () => {
    const content = fs.readFileSync(
      path.join(skillDir, 'modify', 'src', 'config.ts'),
      'utf-8',
    );

    // All original exports preserved
    expect(content).toContain('export const ASSISTANT_NAME');
    expect(content).toContain('export const POLL_INTERVAL');
    expect(content).toContain('export const TRIGGER_PATTERN');
    expect(content).toContain('export const CONTAINER_IMAGE');
    expect(content).toContain('export const DATA_DIR');
    expect(content).toContain('export const TIMEZONE');
  });
});
.env.example (new file, 1 line)
@@ -0,0 +1 @@
.github/workflows/skill-tests.yml (new file, 84 lines, vendored)
@@ -0,0 +1,84 @@
name: Skill Combination Tests

on:
  pull_request:
    branches: [main]
    paths:
      - 'skills-engine/**'
      - '.claude/skills/**'
      - 'src/**'

jobs:
  generate-matrix:
    runs-on: ubuntu-latest
    outputs:
      matrix: ${{ steps.matrix.outputs.matrix }}
      has_entries: ${{ steps.matrix.outputs.has_entries }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: npm
      - run: npm ci
      - name: Generate overlap matrix
        id: matrix
        run: |
          MATRIX=$(npx tsx scripts/generate-ci-matrix.ts)
          {
            echo "matrix<<MATRIX_EOF"
            echo "$MATRIX"
            echo "MATRIX_EOF"
          } >> "$GITHUB_OUTPUT"
          if [ "$MATRIX" = "[]" ]; then
            echo "has_entries=false" >> "$GITHUB_OUTPUT"
          else
            echo "has_entries=true" >> "$GITHUB_OUTPUT"
          fi

  test-combinations:
    needs: generate-matrix
    if: needs.generate-matrix.outputs.has_entries == 'true'
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        entry: ${{ fromJson(needs.generate-matrix.outputs.matrix) }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: npm
      - run: npm ci

      - name: Initialize nanoclaw dir
        run: npx tsx -e "import { initNanoclawDir } from './skills-engine/index.js'; initNanoclawDir();"

      - name: Apply skills in sequence
        run: |
          for skill in $(echo '${{ toJson(matrix.entry.skills) }}' | jq -r '.[]'); do
            echo "Applying skill: $skill"
            npx tsx scripts/apply-skill.ts ".claude/skills/$skill"
          done

      - name: Run skill tests
        run: npx vitest run --config vitest.skills.config.ts

  skill-tests-summary:
    needs: [generate-matrix, test-combinations]
    if: always()
    runs-on: ubuntu-latest
    steps:
      - name: Report result
        run: |
          if [ "${{ needs.generate-matrix.outputs.has_entries }}" = "false" ]; then
            echo "No overlapping skills found. Skipped combination tests."
            exit 0
          fi
          if [ "${{ needs.test-combinations.result }}" = "success" ]; then
            echo "All skill combination tests passed."
          else
            echo "Some skill combination tests failed."
            exit 1
          fi
.gitignore (3 lines added, vendored)
@@ -29,4 +29,7 @@ groups/global/*
.idea/
.vscode/

# Skills system (local per-installation state)
.nanoclaw/

agents-sdk-docs
docs/nanoclaw-architecture-final.md (new file, 1063 lines)
(File diff suppressed because it is too large)
docs/nanorepo-architecture.md (new file, 168 lines)
@@ -0,0 +1,168 @@
# NanoClaw Skills Architecture

## What Skills Are For

NanoClaw's core is intentionally minimal. Skills are how users extend it — adding features, integrations, cross-platform compatibility, or replacing internals entirely. Examples: add a Telegram or WhatsApp channel, swap the underlying agent runtime, integrate a vector database, add authentication providers, enable multi-language support. Each skill modifies the actual codebase — injecting routes, middleware, config blocks, dependencies — rather than working through a plugin API or runtime hooks.

## Why This Architecture

The problem: users need to combine multiple modifications to a shared codebase, keep those modifications working across core updates, and do all of this without becoming git experts or losing their custom changes. A plugin system would be simpler but would constrain what skills can do. Giving skills full codebase access means they can change anything, but that creates merge conflicts, update breakage, and state-tracking challenges.

This architecture solves that by making skill application fully programmatic using standard git mechanics, with AI as a fallback for conflicts git can't resolve, and a shared resolution cache so most users never hit those conflicts at all. The result: users compose exactly the features they want, customizations survive core updates automatically, and the system is always recoverable.

## Core Principle

Skills are self-contained, auditable packages applied via standard git merge mechanics. Claude Code orchestrates the process — running git commands, reading skill manifests, and stepping in only when git can't resolve a conflict. The system uses existing git features (`merge-file`, `rerere`, `apply`) rather than custom merge infrastructure.
## Three-Level Resolution Model

Every operation follows this escalation:

1. **Git** — deterministic. `git merge-file` merges, `git rerere` replays cached resolutions, structured operations apply without merging. No AI. Handles the vast majority of cases.
2. **Claude Code** — reads `SKILL.md`, `.intent.md`, and `state.yaml` to resolve conflicts git can't handle. Caches resolutions via `git rerere` so the same conflict never needs resolving twice.
3. **Claude Code + user input** — when Claude Code lacks sufficient context to determine intent (e.g., two features genuinely conflict at the application level), it asks the user for a decision, then uses that input to perform the resolution. Claude Code still does the work — the user provides direction, not code.
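
The escalation can be sketched as a chain of resolvers tried in order (an illustrative sketch; `resolveWithEscalation` and `Resolver` are hypothetical names, not the engine's API):

```typescript
// Illustrative: each level tries to resolve the conflicted text and returns
// null to escalate to the next level.
type Resolver = (conflictedText: string) => string | null;

function resolveWithEscalation(
  conflictedText: string,
  levels: Resolver[],
): { text: string; level: number } | null {
  for (let i = 0; i < levels.length; i++) {
    const text = levels[i](conflictedText);
    if (text !== null) return { text, level: i + 1 }; // 1 = git, 2 = Claude Code, 3 = user-guided
  }
  return null; // unresolvable: the operation fails and the backup is restored
}
```

Level 1 would wrap `git rerere`; levels 2 and 3 only run when every cheaper level has returned null.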

**Important**: A clean merge doesn't guarantee working code. Semantic conflicts can produce clean text merges that break at runtime. **Tests run after every operation.**

## Backup/Restore Safety

Before any operation, all affected files are copied to `.nanoclaw/backup/`. On success, the backup is deleted; on failure, it is restored. This works even for users who don't use git.
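
A minimal sketch of that safety net, assuming a flat list of affected files (`withBackup` is a hypothetical helper, not the engine's actual backup module):

```typescript
import fs from 'fs';
import path from 'path';

// Illustrative backup/restore wrapper: copy affected files aside, run the
// operation, delete the backup on success, restore it on failure.
export function withBackup(root: string, files: string[], op: () => void): void {
  const backupDir = path.join(root, '.nanoclaw', 'backup');
  fs.mkdirSync(backupDir, { recursive: true });
  for (const f of files) {
    const dest = path.join(backupDir, f);
    fs.mkdirSync(path.dirname(dest), { recursive: true });
    fs.copyFileSync(path.join(root, f), dest);
  }
  try {
    op();
    fs.rmSync(backupDir, { recursive: true, force: true }); // success: drop the backup
  } catch (err) {
    for (const f of files) {
      fs.copyFileSync(path.join(backupDir, f), path.join(root, f)); // failure: restore
    }
    fs.rmSync(backupDir, { recursive: true, force: true });
    throw err;
  }
}
```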

## The Shared Base

`.nanoclaw/base/` holds a clean copy of the core codebase. It is the single common ancestor for all three-way merges, and is only updated during core updates.

## Two Types of Changes

### Code Files (Three-Way Merge)

Source code where skills weave in logic. Merged via `git merge-file` against the shared base. Skills carry full modified files.

### Structured Data (Deterministic Operations)

Files like `package.json`, `docker-compose.yml`, and `.env.example`. Skills declare requirements in the manifest; the system applies them programmatically. Multiple skills' declarations are batched — dependencies merged, `package.json` written once, `npm install` run once.
```yaml
structured:
  npm_dependencies:
    whatsapp-web.js: "^2.1.0"
  env_additions:
    - WHATSAPP_TOKEN
  docker_compose_services:
    whatsapp-redis:
      image: redis:alpine
      ports: ["6380:6379"]
```

Structured conflicts (version incompatibilities, port collisions) follow the same three-level resolution model.
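
For example, batching `npm_dependencies` across skills might look like this (illustrative; `batchNpmDeps` is a hypothetical name, not the engine's structured-ops module):

```typescript
// Merge each skill's npm_dependencies into one map; flag disagreements
// (same package, different range) as structured conflicts for escalation.
export function batchNpmDeps(
  skills: { name: string; deps: Record<string, string> }[],
): { merged: Record<string, string>; conflicts: string[] } {
  const merged: Record<string, string> = {};
  const conflicts: string[] = [];
  for (const s of skills) {
    for (const [pkg, range] of Object.entries(s.deps)) {
      if (merged[pkg] !== undefined && merged[pkg] !== range) {
        conflicts.push(`${pkg}: ${merged[pkg]} vs ${range} (from ${s.name})`);
      } else {
        merged[pkg] = range;
      }
    }
  }
  return { merged, conflicts }; // package.json is written once from `merged`
}
```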

## Skill Package Structure

A skill contains only the files it adds or modifies. Modified code files carry the **full file** (clean core + the skill's changes), making `git merge-file` straightforward and auditable.

```
skills/add-whatsapp/
  SKILL.md                        # What this skill does and why
  manifest.yaml                   # Metadata, dependencies, structured ops
  tests/whatsapp.test.ts          # Integration tests
  add/src/channels/whatsapp.ts    # New files
  modify/src/server.ts            # Full modified file for merge
  modify/src/server.ts.intent.md  # Structured intent for conflict resolution
```

### Intent Files

Each modified file has a `.intent.md` with structured headings: **What this skill adds**, **Key sections**, **Invariants**, and **Must-keep sections**. These give Claude Code specific guidance during conflict resolution.

### Manifest

Declares: skill metadata, core version compatibility, files added/modified, file operations, structured operations, skill relationships (`conflicts`, `depends`, `tested_with`), post-apply commands, and the test command.
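
A hypothetical manifest for the WhatsApp skill above might look like this (field names such as `skill`, `adds`, `modifies`, and `post_apply` appear elsewhere in this PR; the exact schema is defined by the engine's types):

```yaml
skill: add-whatsapp
version: 1.0.0
core_version: 1.0.0
adds:
  - src/channels/whatsapp.ts
modifies:
  - src/server.ts
structured:
  npm_dependencies:
    whatsapp-web.js: "^2.1.0"
  env_additions:
    - WHATSAPP_TOKEN
conflicts: []
depends: []
post_apply:
  - npm install
test: npx vitest run .claude/skills/add-whatsapp/tests
```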

## Customization and Layering

**One skill, one happy path** — a skill implements the reasonable default for 80% of users.

**Customization is more patching.** Apply the skill, then modify via tracked patches, direct editing, or additional layered skills. Custom modifications are recorded in `state.yaml` and are replayable.

**Skills layer via `depends`.** Extension skills build on base skills (e.g., `telegram-reactions` depends on `add-telegram`).

## File Operations

Renames, deletes, and moves are declared in the manifest and run **before** code merges. When the core renames a file, a **path remap** resolves skill references at apply time — skill packages are never mutated.

## The Apply Flow

1. Pre-flight checks (compatibility, dependencies, untracked changes)
2. Backup
3. File operations + path remapping
4. Copy new files
5. Merge modified code files (`git merge-file`)
6. Conflict resolution (shared cache → `git rerere` → Claude Code → Claude Code + user input)
7. Apply structured operations (batched)
8. Post-apply commands; update `state.yaml`
9. **Run tests** (mandatory, even if all merges were clean)
10. Clean up (delete backup on success, restore on failure)

## Shared Resolution Cache

`.nanoclaw/resolutions/` ships pre-computed, verified conflict resolutions with **hash enforcement** — a cached resolution only applies if the base, current, and skill input hashes match exactly. This means most users never encounter unresolved conflicts for common skill combinations.
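
Hash enforcement can be sketched as a simple gate over the three merge inputs (illustrative; the `base`/`current`/`skill` shape mirrors the `FileInputHashes` used by `scripts/generate-resolutions.ts`, but `resolutionApplies` is a hypothetical helper):

```typescript
import crypto from 'crypto';

// Illustrative hash gate: a shipped resolution is only eligible when all
// three merge inputs hash to exactly what the resolution was computed from.
export interface InputHashes { base: string; current: string; skill: string }

export const sha256 = (s: string): string =>
  crypto.createHash('sha256').update(s).digest('hex');

export function resolutionApplies(
  recorded: InputHashes,
  inputs: { base: string; current: string; skill: string },
): boolean {
  return (
    recorded.base === sha256(inputs.base) &&
    recorded.current === sha256(inputs.current) &&
    recorded.skill === sha256(inputs.skill)
  );
}
```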

### rerere Adapter

`git rerere` requires unmerged index entries, which `git merge-file` doesn't create. An adapter sets up the required index state after `merge-file` produces a conflict, enabling rerere caching. This requires the project to be a git repository; users without `.git/` lose caching but not functionality.

## State Tracking

`.nanoclaw/state.yaml` records: the core version, all applied skills (with per-file hashes for base/skill/merged), structured operation outcomes, custom patches, and path remaps. This makes drift detection instant and replay deterministic.
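
Schematically, `state.yaml` might look like this (a hypothetical shape inferred from the fields listed above, not the engine's authoritative format):

```yaml
core_version: 1.0.0
skills:
  - name: add-telegram
    version: 1.0.0
    files:
      src/index.ts:
        base: "<sha256 of base file>"
        skill: "<sha256 of skill's full file>"
        merged: "<sha256 of merge result>"
structured_results:
  npm_dependencies: applied
custom_patches: []
path_remaps: {}
```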

## Untracked Changes

Direct edits are detected via hash comparison before any operation. Users can record them as tracked patches, continue untracked, or abort. The three-level model can always recover a coherent state from any starting point.
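
Drift detection itself reduces to a per-file hash comparison against the merged hashes recorded in `state.yaml` (illustrative sketch; `detectDrift` is a hypothetical name):

```typescript
// Illustrative: compare each tracked file's current hash with the recorded
// merged hash; any mismatch is an untracked direct edit.
export function detectDrift(
  recorded: Record<string, string>,  // path -> expected (merged) hash
  currentHash: (p: string) => string, // path -> hash of the file on disk
): string[] {
  return Object.keys(recorded).filter((p) => currentHash(p) !== recorded[p]);
}
```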

## Core Updates

Most changes propagate automatically through the three-way merge. **Breaking changes** require a **migration skill** — a regular skill that preserves the old behavior, authored against the new core. Migrations are declared in `migrations.yaml` and applied automatically during updates.

### Update Flow

1. Preview changes (git-only, no files modified)
2. Backup → file operations → three-way merge → conflict resolution
3. Re-apply custom patches (`git apply --3way`)
4. **Update base** to the new core
5. Apply migration skills (preserves the user's setup automatically)
6. Re-apply updated skills (version-changed skills only)
7. Re-run structured operations → run all tests → clean up

The user sees no prompts during updates. To accept a new default later, they remove the migration skill.

## Skill Removal

Uninstall is **replay without the skill**: read `state.yaml`, remove the target skill, and replay all remaining skills from the clean base using the resolution cache. A backup is taken for safety.

## Rebase

Flatten accumulated layers into a clean starting point. Updates the base, regenerates diffs, and clears old patches and stale cache entries. Trades individual skill history for simpler future merges.

## Replay

Given `state.yaml`, reproduce the exact installation on a fresh machine with no AI (assuming cached resolutions): apply skills in order, merge, apply custom patches, batch structured operations, run tests.

## Skill Tests

Each skill includes integration tests. Tests run **always** — after apply, after update, after uninstall, during replay, and in CI. CI tests all official skills individually, plus pairwise combinations for skills sharing modified files or structured operations.

## Design Principles

1. **Use git, don't reinvent it.**
2. **Three-level resolution: git → Claude Code → Claude Code + user input.**
3. **Clean merges aren't enough.** Tests run after every operation.
4. **All operations are safe.** Backup/restore; no half-applied state.
5. **One shared base**, updated only on core updates.
6. **Code merges vs. structured operations.** Source code is merged; configs are aggregated.
7. **Resolutions are learned and shared**, with hash enforcement.
8. **One skill, one happy path.** Customization is more patching.
9. **Skills layer and compose.**
10. **Intent is first-class and structured.**
11. **State is explicit and complete.** Replay is deterministic.
12. **Always recoverable.**
13. **Uninstall is replay.**
14. **Core updates are the maintainers' responsibility.** Breaking changes require migration skills.
15. **File operations and path remapping are first-class.**
16. **Skills are tested.** CI tests pairwise by overlap.
17. **Deterministic serialization.** No noisy diffs.
18. **Rebase when needed.**
19. **Progressive core slimming** via migration skills.
16 package-lock.json generated
@@ -15,6 +15,7 @@
        "pino-pretty": "^13.0.0",
        "qrcode": "^1.5.4",
        "qrcode-terminal": "^0.12.0",
        "yaml": "^2.8.2",
        "zod": "^4.3.6"
      },
      "devDependencies": {

@@ -3910,6 +3911,21 @@
      "integrity": "sha512-JKhqTOwSrqNA1NY5lSztJ1GrBiUodLMmIZuLiDaMRJ+itFd+ABVE8XBjOvIWL+rSqNDC74LCSFmlb/U4UZ4hJQ==",
      "license": "ISC"
    },
    "node_modules/yaml": {
      "version": "2.8.2",
      "resolved": "https://registry.npmjs.org/yaml/-/yaml-2.8.2.tgz",
      "integrity": "sha512-mplynKqc1C2hTVYxd0PU2xQAc22TI1vShAYGksCCfxbn/dFwnHTNi1bvYsBTkhdUNtGIf5xNOg938rrSSYvS9A==",
      "license": "ISC",
      "bin": {
        "yaml": "bin.mjs"
      },
      "engines": {
        "node": ">= 14.6"
      },
      "funding": {
        "url": "https://github.com/sponsors/eemeli"
      }
    },
    "node_modules/yargs": {
      "version": "15.4.1",
      "resolved": "https://registry.npmjs.org/yargs/-/yargs-15.4.1.tgz",

package.json

@@ -23,6 +23,7 @@
    "pino-pretty": "^13.0.0",
    "qrcode": "^1.5.4",
    "qrcode-terminal": "^0.12.0",
    "yaml": "^2.8.2",
    "zod": "^4.3.6"
  },
  "devDependencies": {
14 scripts/apply-skill.ts Normal file
@@ -0,0 +1,14 @@
import { applySkill } from '../skills-engine/apply.js';

const skillDir = process.argv[2];
if (!skillDir) {
  console.error('Usage: tsx scripts/apply-skill.ts <skill-dir>');
  process.exit(1);
}

const result = await applySkill(skillDir);
console.log(JSON.stringify(result, null, 2));

if (!result.success) {
  process.exit(1);
}
118 scripts/generate-ci-matrix.ts Normal file
@@ -0,0 +1,118 @@
#!/usr/bin/env npx tsx

import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
import { parse } from 'yaml';
import { SkillManifest } from '../skills-engine/types.js';

export interface MatrixEntry {
  skills: string[];
  reason: string;
}

export interface SkillOverlapInfo {
  name: string;
  modifies: string[];
  npmDependencies: string[];
}

/**
 * Extract overlap-relevant info from a parsed manifest.
 * @param dirName - The skill's directory name (e.g. 'add-discord'), used in matrix
 *   entries so CI/scripts can locate the skill package on disk.
 */
export function extractOverlapInfo(manifest: SkillManifest, dirName: string): SkillOverlapInfo {
  const npmDeps = manifest.structured?.npm_dependencies
    ? Object.keys(manifest.structured.npm_dependencies)
    : [];

  return {
    name: dirName,
    modifies: manifest.modifies ?? [],
    npmDependencies: npmDeps,
  };
}

/**
 * Compute the overlap matrix from a list of skill overlap infos.
 * Two skills overlap if they share any `modifies` entry or both declare
 * `structured.npm_dependencies` for the same package.
 */
export function computeOverlapMatrix(skills: SkillOverlapInfo[]): MatrixEntry[] {
  const entries: MatrixEntry[] = [];

  for (let i = 0; i < skills.length; i++) {
    for (let j = i + 1; j < skills.length; j++) {
      const a = skills[i];
      const b = skills[j];
      const reasons: string[] = [];

      // Check shared modifies entries
      const sharedModifies = a.modifies.filter((m) => b.modifies.includes(m));
      if (sharedModifies.length > 0) {
        reasons.push(`shared modifies: ${sharedModifies.join(', ')}`);
      }

      // Check shared npm_dependencies packages
      const sharedNpm = a.npmDependencies.filter((pkg) =>
        b.npmDependencies.includes(pkg),
      );
      if (sharedNpm.length > 0) {
        reasons.push(`shared npm packages: ${sharedNpm.join(', ')}`);
      }

      if (reasons.length > 0) {
        entries.push({
          skills: [a.name, b.name],
          reason: reasons.join('; '),
        });
      }
    }
  }

  return entries;
}

/**
 * Read all skill manifests from a skills directory (e.g. .claude/skills/).
 * Each subdirectory should contain a manifest.yaml.
 * Returns both the parsed manifest and the directory name.
 */
export function readAllManifests(skillsDir: string): { manifest: SkillManifest; dirName: string }[] {
  if (!fs.existsSync(skillsDir)) {
    return [];
  }

  const results: { manifest: SkillManifest; dirName: string }[] = [];
  const entries = fs.readdirSync(skillsDir, { withFileTypes: true });

  for (const entry of entries) {
    if (!entry.isDirectory()) continue;

    const manifestPath = path.join(skillsDir, entry.name, 'manifest.yaml');
    if (!fs.existsSync(manifestPath)) continue;

    const content = fs.readFileSync(manifestPath, 'utf-8');
    const manifest = parse(content) as SkillManifest;
    results.push({ manifest, dirName: entry.name });
  }

  return results;
}

/**
 * Generate the full CI matrix from a skills directory.
 */
export function generateMatrix(skillsDir: string): MatrixEntry[] {
  const entries = readAllManifests(skillsDir);
  const overlapInfos = entries.map((e) => extractOverlapInfo(e.manifest, e.dirName));
  return computeOverlapMatrix(overlapInfos);
}

// --- Main ---
// fileURLToPath handles Windows paths and percent-encoding, unlike stripping
// the 'file://' prefix by hand.
if (process.argv[1] && path.resolve(process.argv[1]) === fileURLToPath(import.meta.url)) {
  const projectRoot = process.cwd();
  const skillsDir = path.join(projectRoot, '.claude', 'skills');
  const matrix = generateMatrix(skillsDir);
  console.log(JSON.stringify(matrix, null, 2));
}
170 scripts/generate-resolutions.ts Normal file
@@ -0,0 +1,170 @@
/**
 * Generate rerere-compatible resolution files for known skill combinations.
 *
 * For each conflicting file when applying discord after telegram:
 * 1. Run merge-file to produce conflict markers
 * 2. Set up the rerere adapter — git records the preimage and assigns a hash
 * 3. Capture the hash by diffing rr-cache before/after
 * 4. Write the correct resolution, git add + git rerere to record the postimage
 * 5. Save preimage, resolution, hash sidecar, and meta to .claude/resolutions/
 */
import crypto from 'crypto';
import { execSync } from 'child_process';
import fs from 'fs';
import os from 'os';
import path from 'path';
import { stringify } from 'yaml';

import {
  cleanupMergeState,
  mergeFile,
  setupRerereAdapter,
} from '../skills-engine/merge.js';
import type { FileInputHashes } from '../skills-engine/types.js';

function sha256(filePath: string): string {
  const content = fs.readFileSync(filePath);
  return crypto.createHash('sha256').update(content).digest('hex');
}

const projectRoot = process.cwd();
const baseDir = '.nanoclaw/base';

// The files that conflict when applying discord after telegram
const conflictFiles = ['src/index.ts', 'src/config.ts', 'src/routing.test.ts'];

const telegramModify = '.claude/skills/add-telegram/modify';
const discordModify = '.claude/skills/add-discord/modify';
const shippedResDir = path.join(projectRoot, '.claude', 'resolutions', 'discord+telegram');

// Get the git rr-cache directory
const gitDir = execSync('git rev-parse --git-dir', { encoding: 'utf-8', cwd: projectRoot }).trim();
const rrCacheDir = path.join(
  path.isAbsolute(gitDir) ? gitDir : path.join(projectRoot, gitDir),
  'rr-cache',
);

function getRrCacheEntries(): Set<string> {
  if (!fs.existsSync(rrCacheDir)) return new Set();
  return new Set(fs.readdirSync(rrCacheDir));
}

// Clear rr-cache to start fresh
if (fs.existsSync(rrCacheDir)) {
  fs.rmSync(rrCacheDir, { recursive: true });
}
fs.mkdirSync(rrCacheDir, { recursive: true });

// Prepare the output directory (recreate it so meta.yaml can always be written)
if (fs.existsSync(shippedResDir)) {
  fs.rmSync(shippedResDir, { recursive: true });
}
fs.mkdirSync(shippedResDir, { recursive: true });

const results: { relPath: string; hash: string }[] = [];
const fileHashes: Record<string, FileInputHashes> = {};

for (const relPath of conflictFiles) {
  const basePath = path.join(projectRoot, baseDir, relPath);
  const oursPath = path.join(projectRoot, telegramModify, relPath);
  const theirsPath = path.join(projectRoot, discordModify, relPath);

  // Resolution = the correct combined file. The shipped .resolution files were
  // deleted above, so read from a backup copy if one exists, otherwise fall
  // back to the working tree (which only works if both skills are applied).
  const resolutionContent = (() => {
    const backupPath = path.join(projectRoot, '.claude', 'resolutions', '_backup', relPath + '.resolution');
    if (fs.existsSync(backupPath)) return fs.readFileSync(backupPath, 'utf-8');
    const wtPath = path.join(projectRoot, relPath);
    return fs.readFileSync(wtPath, 'utf-8');
  })();

  // Do the merge to produce conflict markers
  const tmpFile = path.join(os.tmpdir(), `nanoclaw-gen-${Date.now()}-${path.basename(relPath)}`);
  fs.copyFileSync(oursPath, tmpFile);
  const result = mergeFile(tmpFile, basePath, theirsPath);

  if (result.clean) {
    console.log(`${relPath}: clean merge, no resolution needed`);
    fs.unlinkSync(tmpFile);
    continue;
  }

  // Compute input file hashes for this conflicted file
  fileHashes[relPath] = {
    base: sha256(basePath),
    current: sha256(oursPath), // "ours" = telegram's modify (current state after first skill)
    skill: sha256(theirsPath), // "theirs" = discord's modify (the skill being applied)
  };

  const preimageContent = fs.readFileSync(tmpFile, 'utf-8');
  fs.unlinkSync(tmpFile);

  // Save the original working tree file to restore later
  const origContent = fs.readFileSync(path.join(projectRoot, relPath), 'utf-8');

  // Write conflict markers to the working tree for rerere
  fs.writeFileSync(path.join(projectRoot, relPath), preimageContent);

  // Track rr-cache entries before
  const entriesBefore = getRrCacheEntries();

  // Set up the rerere adapter and let git record the preimage
  const baseContent = fs.readFileSync(basePath, 'utf-8');
  const oursContent = fs.readFileSync(oursPath, 'utf-8');
  const theirsContent = fs.readFileSync(theirsPath, 'utf-8');
  setupRerereAdapter(relPath, baseContent, oursContent, theirsContent);
  execSync('git rerere', { stdio: 'pipe', cwd: projectRoot });

  // Find the new rr-cache entry (the hash)
  const entriesAfter = getRrCacheEntries();
  const newEntries = [...entriesAfter].filter((e) => !entriesBefore.has(e));

  if (newEntries.length !== 1) {
    console.error(`${relPath}: expected 1 new rr-cache entry, got ${newEntries.length}`);
    cleanupMergeState(relPath);
    fs.writeFileSync(path.join(projectRoot, relPath), origContent);
    continue;
  }

  const hash = newEntries[0];

  // Write the resolution and record it
  fs.writeFileSync(path.join(projectRoot, relPath), resolutionContent);
  execSync(`git add "${relPath}"`, { stdio: 'pipe', cwd: projectRoot });
  execSync('git rerere', { stdio: 'pipe', cwd: projectRoot });

  // Clean up
  cleanupMergeState(relPath);
  fs.writeFileSync(path.join(projectRoot, relPath), origContent);

  // Save to .claude/resolutions/
  const outDir = path.join(shippedResDir, path.dirname(relPath));
  fs.mkdirSync(outDir, { recursive: true });

  const baseName = path.join(shippedResDir, relPath);
  // Copy the preimage directly from rr-cache (normalized by git)
  fs.copyFileSync(path.join(rrCacheDir, hash, 'preimage'), baseName + '.preimage');
  fs.writeFileSync(baseName + '.resolution', resolutionContent);
  fs.writeFileSync(baseName + '.preimage.hash', hash);

  results.push({ relPath, hash });
  console.log(`${relPath}: hash=${hash}`);
}

// Write meta.yaml
const meta = {
  skills: ['discord', 'telegram'],
  apply_order: ['telegram', 'discord'],
  resolved_at: new Date().toISOString(),
  tested: true,
  test_passed: true,
  resolution_source: 'generated',
  input_hashes: {},
  output_hash: '',
  file_hashes: fileHashes,
};
fs.writeFileSync(path.join(shippedResDir, 'meta.yaml'), stringify(meta));

console.log(`\nGenerated ${results.length} resolution(s) in .claude/resolutions/discord+telegram/`);
21 scripts/rebase.ts Normal file
@@ -0,0 +1,21 @@
#!/usr/bin/env npx tsx
import { rebase } from '../skills-engine/rebase.js';

async function main() {
  const newBasePath = process.argv[2]; // optional

  if (newBasePath) {
    console.log(`Rebasing with new base from: ${newBasePath}`);
  } else {
    console.log('Rebasing current state...');
  }

  const result = await rebase(newBasePath);
  console.log(JSON.stringify(result, null, 2));

  if (!result.success) {
    process.exit(1);
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
144 scripts/run-ci-tests.ts Normal file
@@ -0,0 +1,144 @@
#!/usr/bin/env npx tsx

import { execSync } from 'child_process';
import fs from 'fs';
import os from 'os';
import path from 'path';

import { generateMatrix, MatrixEntry } from './generate-ci-matrix.js';

interface TestResult {
  entry: MatrixEntry;
  passed: boolean;
  error?: string;
}

function copyDirRecursive(src: string, dest: string, exclude: string[] = []): void {
  fs.mkdirSync(dest, { recursive: true });
  for (const entry of fs.readdirSync(src, { withFileTypes: true })) {
    if (exclude.includes(entry.name)) continue;
    const srcPath = path.join(src, entry.name);
    const destPath = path.join(dest, entry.name);
    if (entry.isDirectory()) {
      copyDirRecursive(srcPath, destPath, exclude);
    } else {
      fs.copyFileSync(srcPath, destPath);
    }
  }
}

async function runMatrixEntry(
  projectRoot: string,
  entry: MatrixEntry,
): Promise<TestResult> {
  const tmpDir = fs.mkdtempSync(path.join(os.tmpdir(), 'nanoclaw-ci-'));

  try {
    // Copy the project to a temp dir (exclude heavy/irrelevant dirs)
    copyDirRecursive(projectRoot, tmpDir, [
      'node_modules',
      '.git',
      'dist',
      'data',
      'store',
      'logs',
      '.nanoclaw',
    ]);

    // Install dependencies
    execSync('npm install --ignore-scripts', {
      cwd: tmpDir,
      stdio: 'pipe',
      timeout: 120_000,
    });

    // Initialize the nanoclaw dir
    execSync('npx tsx -e "import { initNanoclawDir } from \'./skills-engine/index.js\'; initNanoclawDir();"', {
      cwd: tmpDir,
      stdio: 'pipe',
      timeout: 30_000,
    });

    // Apply each skill in sequence
    for (const skillName of entry.skills) {
      const skillDir = path.join(tmpDir, '.claude', 'skills', skillName);
      if (!fs.existsSync(skillDir)) {
        return {
          entry,
          passed: false,
          error: `Skill directory not found: ${skillName}`,
        };
      }

      const result = execSync(
        `npx tsx scripts/apply-skill.ts "${skillDir}"`,
        { cwd: tmpDir, stdio: 'pipe', timeout: 120_000 },
      );
      const parsed = JSON.parse(result.toString());
      if (!parsed.success) {
        return {
          entry,
          passed: false,
          error: `Failed to apply skill ${skillName}: ${parsed.error}`,
        };
      }
    }

    // Run all skill tests
    execSync('npx vitest run --config vitest.skills.config.ts', {
      cwd: tmpDir,
      stdio: 'pipe',
      timeout: 300_000,
    });

    return { entry, passed: true };
  } catch (err: any) {
    return {
      entry,
      passed: false,
      error: err.message || String(err),
    };
  } finally {
    fs.rmSync(tmpDir, { recursive: true, force: true });
  }
}

// --- Main ---
async function main(): Promise<void> {
  const projectRoot = process.cwd();
  const skillsDir = path.join(projectRoot, '.claude', 'skills');
  const matrix = generateMatrix(skillsDir);

  if (matrix.length === 0) {
    console.log('No overlapping skills found. Nothing to test.');
    process.exit(0);
  }

  console.log(`Found ${matrix.length} overlapping skill combination(s):\n`);
  for (const entry of matrix) {
    console.log(`  [${entry.skills.join(', ')}] — ${entry.reason}`);
  }
  console.log('');

  const results: TestResult[] = [];
  for (const entry of matrix) {
    console.log(`Testing: [${entry.skills.join(', ')}]...`);
    const result = await runMatrixEntry(projectRoot, entry);
    results.push(result);
    console.log(`  ${result.passed ? 'PASS' : 'FAIL'}${result.error ? ` — ${result.error}` : ''}`);
  }

  console.log('\n--- Summary ---');
  const passed = results.filter((r) => r.passed).length;
  const failed = results.filter((r) => !r.passed).length;
  console.log(`${passed} passed, ${failed} failed out of ${results.length} combination(s)`);

  if (failed > 0) {
    process.exit(1);
  }
}

main().catch((err) => {
  console.error('Fatal error:', err);
  process.exit(1);
});
37 scripts/uninstall-skill.ts Normal file
@@ -0,0 +1,37 @@
#!/usr/bin/env npx tsx
import { uninstallSkill } from '../skills-engine/uninstall.js';

async function main() {
  const skillName = process.argv[2];
  if (!skillName) {
    console.error('Usage: npx tsx scripts/uninstall-skill.ts <skill-name>');
    process.exit(1);
  }

  console.log(`Uninstalling skill: ${skillName}`);
  const result = await uninstallSkill(skillName);

  if (result.customPatchWarning) {
    console.warn(`\nWarning: ${result.customPatchWarning}`);
    console.warn('To proceed, remove the custom_patch from state.yaml and re-run.');
    process.exit(1);
  }

  if (!result.success) {
    console.error(`\nFailed: ${result.error}`);
    process.exit(1);
  }

  console.log(`\nSuccessfully uninstalled: ${skillName}`);
  if (result.replayResults) {
    console.log('Replay test results:');
    for (const [name, passed] of Object.entries(result.replayResults)) {
      console.log(`  ${name}: ${passed ? 'PASS' : 'FAIL'}`);
    }
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
36 scripts/update-core.ts Normal file
@@ -0,0 +1,36 @@
#!/usr/bin/env tsx
import { applyUpdate, previewUpdate } from '../skills-engine/update.js';

const newCorePath = process.argv[2];
if (!newCorePath) {
  console.error('Usage: tsx scripts/update-core.ts <path-to-new-core>');
  process.exit(1);
}

// Preview
const preview = previewUpdate(newCorePath);
console.log('=== Update Preview ===');
console.log(`Current version: ${preview.currentVersion}`);
console.log(`New version: ${preview.newVersion}`);
console.log(`Files changed: ${preview.filesChanged.length}`);
if (preview.filesChanged.length > 0) {
  for (const f of preview.filesChanged) {
    console.log(`  ${f}`);
  }
}
if (preview.conflictRisk.length > 0) {
  console.log(`Conflict risk: ${preview.conflictRisk.join(', ')}`);
}
if (preview.customPatchesAtRisk.length > 0) {
  console.log(`Custom patches at risk: ${preview.customPatchesAtRisk.join(', ')}`);
}
console.log('');

// Apply
console.log('Applying update...');
const result = await applyUpdate(newCorePath);
console.log(JSON.stringify(result, null, 2));

if (!result.success) {
  process.exit(1);
}
92 skills-engine/__tests__/apply.test.ts Normal file
@@ -0,0 +1,92 @@
import fs from 'fs';
import path from 'path';
import { afterEach, beforeEach, describe, expect, it } from 'vitest';

import { applySkill } from '../apply.js';
import {
  cleanup,
  createMinimalState,
  createSkillPackage,
  createTempDir,
  initGitRepo,
  setupNanoclawDir,
} from './test-helpers.js';

describe('apply', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    setupNanoclawDir(tmpDir);
    createMinimalState(tmpDir);
    initGitRepo(tmpDir);
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  it('rejects when min_skills_system_version is too high', async () => {
    const skillDir = createSkillPackage(tmpDir, {
      skill: 'future-skill',
      version: '1.0.0',
      core_version: '1.0.0',
      adds: [],
      modifies: [],
      min_skills_system_version: '99.0.0',
    });

    const result = await applySkill(skillDir);
    expect(result.success).toBe(false);
    expect(result.error).toContain('99.0.0');
  });

  it('executes post_apply commands on success', async () => {
    const markerFile = path.join(tmpDir, 'post-apply-marker.txt');
    const skillDir = createSkillPackage(tmpDir, {
      skill: 'post-test',
      version: '1.0.0',
      core_version: '1.0.0',
      adds: ['src/newfile.ts'],
      modifies: [],
      addFiles: { 'src/newfile.ts': 'export const x = 1;' },
      post_apply: [`echo "applied" > "${markerFile}"`],
    });

    const result = await applySkill(skillDir);
    expect(result.success).toBe(true);
    expect(fs.existsSync(markerFile)).toBe(true);
    expect(fs.readFileSync(markerFile, 'utf-8').trim()).toBe('applied');
  });

  it('rolls back on post_apply failure', async () => {
    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    const existingFile = path.join(tmpDir, 'src/existing.ts');
    fs.writeFileSync(existingFile, 'original content');

    // Set up base for the modified file
    const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src');
    fs.mkdirSync(baseDir, { recursive: true });
    fs.writeFileSync(path.join(baseDir, 'existing.ts'), 'original content');

    const skillDir = createSkillPackage(tmpDir, {
      skill: 'bad-post',
      version: '1.0.0',
      core_version: '1.0.0',
      adds: ['src/added.ts'],
      modifies: [],
      addFiles: { 'src/added.ts': 'new file' },
      post_apply: ['false'], // always fails
    });
|
||||
|
||||
const result = await applySkill(skillDir);
|
||||
expect(result.success).toBe(false);
|
||||
expect(result.error).toContain('post_apply');
|
||||
|
||||
// Added file should be cleaned up
|
||||
expect(fs.existsSync(path.join(tmpDir, 'src/added.ts'))).toBe(false);
|
||||
});
|
||||
});
|
||||
skills-engine/__tests__/backup.test.ts (new file, 77 lines)
@@ -0,0 +1,77 @@
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import fs from 'fs';
import path from 'path';
import { createBackup, restoreBackup, clearBackup } from '../backup.js';
import { createTempDir, setupNanoclawDir, cleanup } from './test-helpers.js';

describe('backup', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    setupNanoclawDir(tmpDir);
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  it('createBackup copies files and restoreBackup puts them back', () => {
    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    fs.writeFileSync(path.join(tmpDir, 'src', 'app.ts'), 'original content');

    createBackup(['src/app.ts']);

    fs.writeFileSync(path.join(tmpDir, 'src', 'app.ts'), 'modified content');
    expect(fs.readFileSync(path.join(tmpDir, 'src', 'app.ts'), 'utf-8')).toBe('modified content');

    restoreBackup();
    expect(fs.readFileSync(path.join(tmpDir, 'src', 'app.ts'), 'utf-8')).toBe('original content');
  });

  it('createBackup skips missing files without error', () => {
    expect(() => createBackup(['does-not-exist.ts'])).not.toThrow();
  });

  it('clearBackup removes backup directory', () => {
    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    fs.writeFileSync(path.join(tmpDir, 'src', 'app.ts'), 'content');
    createBackup(['src/app.ts']);

    const backupDir = path.join(tmpDir, '.nanoclaw', 'backup');
    expect(fs.existsSync(backupDir)).toBe(true);

    clearBackup();
    expect(fs.existsSync(backupDir)).toBe(false);
  });

  it('createBackup writes tombstone for non-existent files', () => {
    createBackup(['src/newfile.ts']);

    const tombstone = path.join(tmpDir, '.nanoclaw', 'backup', 'src', 'newfile.ts.tombstone');
    expect(fs.existsSync(tombstone)).toBe(true);
  });

  it('restoreBackup deletes files with tombstone markers', () => {
    // Create backup first — file doesn't exist yet, so tombstone is written
    createBackup(['src/added.ts']);

    // Now the file gets created (simulating skill apply)
    const filePath = path.join(tmpDir, 'src', 'added.ts');
    fs.mkdirSync(path.dirname(filePath), { recursive: true });
    fs.writeFileSync(filePath, 'new content');
    expect(fs.existsSync(filePath)).toBe(true);

    // Restore should delete the file (tombstone means it didn't exist before)
    restoreBackup();
    expect(fs.existsSync(filePath)).toBe(false);
  });

  it('restoreBackup is no-op when backup dir is empty or missing', () => {
    clearBackup();
    expect(() => restoreBackup()).not.toThrow();
  });
});
skills-engine/__tests__/ci-matrix.test.ts (new file, 270 lines)
@@ -0,0 +1,270 @@
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import fs from 'fs';
import path from 'path';
import { stringify } from 'yaml';

import {
  computeOverlapMatrix,
  extractOverlapInfo,
  generateMatrix,
  type SkillOverlapInfo,
} from '../../scripts/generate-ci-matrix.js';
import { SkillManifest } from '../types.js';
import { createTempDir, cleanup } from './test-helpers.js';

function makeManifest(overrides: Partial<SkillManifest> & { skill: string }): SkillManifest {
  return {
    version: '1.0.0',
    description: 'Test skill',
    core_version: '1.0.0',
    adds: [],
    modifies: [],
    conflicts: [],
    depends: [],
    ...overrides,
  };
}

describe('ci-matrix', () => {
  describe('computeOverlapMatrix', () => {
    it('detects overlap from shared modifies entries', () => {
      const skills: SkillOverlapInfo[] = [
        { name: 'telegram', modifies: ['src/config.ts', 'src/index.ts'], npmDependencies: [] },
        { name: 'discord', modifies: ['src/config.ts', 'src/router.ts'], npmDependencies: [] },
      ];

      const matrix = computeOverlapMatrix(skills);

      expect(matrix).toHaveLength(1);
      expect(matrix[0].skills).toEqual(['telegram', 'discord']);
      expect(matrix[0].reason).toContain('shared modifies');
      expect(matrix[0].reason).toContain('src/config.ts');
    });

    it('returns no entry for non-overlapping skills', () => {
      const skills: SkillOverlapInfo[] = [
        { name: 'telegram', modifies: ['src/telegram.ts'], npmDependencies: ['grammy'] },
        { name: 'discord', modifies: ['src/discord.ts'], npmDependencies: ['discord.js'] },
      ];

      const matrix = computeOverlapMatrix(skills);

      expect(matrix).toHaveLength(0);
    });

    it('detects overlap from shared npm dependencies', () => {
      const skills: SkillOverlapInfo[] = [
        { name: 'skill-a', modifies: ['src/a.ts'], npmDependencies: ['lodash', 'zod'] },
        { name: 'skill-b', modifies: ['src/b.ts'], npmDependencies: ['zod', 'express'] },
      ];

      const matrix = computeOverlapMatrix(skills);

      expect(matrix).toHaveLength(1);
      expect(matrix[0].skills).toEqual(['skill-a', 'skill-b']);
      expect(matrix[0].reason).toContain('shared npm packages');
      expect(matrix[0].reason).toContain('zod');
    });

    it('reports both modifies and npm overlap in one entry', () => {
      const skills: SkillOverlapInfo[] = [
        { name: 'skill-a', modifies: ['src/config.ts'], npmDependencies: ['zod'] },
        { name: 'skill-b', modifies: ['src/config.ts'], npmDependencies: ['zod'] },
      ];

      const matrix = computeOverlapMatrix(skills);

      expect(matrix).toHaveLength(1);
      expect(matrix[0].reason).toContain('shared modifies');
      expect(matrix[0].reason).toContain('shared npm packages');
    });

    it('handles three skills with pairwise overlaps', () => {
      const skills: SkillOverlapInfo[] = [
        { name: 'a', modifies: ['src/config.ts'], npmDependencies: [] },
        { name: 'b', modifies: ['src/config.ts', 'src/router.ts'], npmDependencies: [] },
        { name: 'c', modifies: ['src/router.ts'], npmDependencies: [] },
      ];

      const matrix = computeOverlapMatrix(skills);

      // a-b overlap on config.ts, b-c overlap on router.ts, a-c no overlap
      expect(matrix).toHaveLength(2);
      expect(matrix[0].skills).toEqual(['a', 'b']);
      expect(matrix[1].skills).toEqual(['b', 'c']);
    });

    it('returns empty array for single skill', () => {
      const skills: SkillOverlapInfo[] = [
        { name: 'only', modifies: ['src/config.ts'], npmDependencies: ['zod'] },
      ];

      const matrix = computeOverlapMatrix(skills);

      expect(matrix).toHaveLength(0);
    });

    it('returns empty array for no skills', () => {
      const matrix = computeOverlapMatrix([]);
      expect(matrix).toHaveLength(0);
    });
  });

  describe('extractOverlapInfo', () => {
    it('extracts modifies and npm dependencies using dirName', () => {
      const manifest = makeManifest({
        skill: 'telegram',
        modifies: ['src/config.ts'],
        structured: {
          npm_dependencies: { grammy: '^1.0.0', zod: '^3.0.0' },
        },
      });

      const info = extractOverlapInfo(manifest, 'add-telegram');

      expect(info.name).toBe('add-telegram');
      expect(info.modifies).toEqual(['src/config.ts']);
      expect(info.npmDependencies).toEqual(['grammy', 'zod']);
    });

    it('handles manifest without structured field', () => {
      const manifest = makeManifest({
        skill: 'simple',
        modifies: ['src/index.ts'],
      });

      const info = extractOverlapInfo(manifest, 'add-simple');

      expect(info.npmDependencies).toEqual([]);
    });

    it('handles structured without npm_dependencies', () => {
      const manifest = makeManifest({
        skill: 'env-only',
        modifies: [],
        structured: {
          env_additions: ['MY_VAR'],
        },
      });

      const info = extractOverlapInfo(manifest, 'add-env-only');

      expect(info.npmDependencies).toEqual([]);
    });
  });

  describe('generateMatrix with real filesystem', () => {
    let tmpDir: string;

    beforeEach(() => {
      tmpDir = createTempDir();
    });

    afterEach(() => {
      cleanup(tmpDir);
    });

    function createManifestDir(skillsDir: string, name: string, manifest: Record<string, unknown>): void {
      const dir = path.join(skillsDir, name);
      fs.mkdirSync(dir, { recursive: true });
      fs.writeFileSync(path.join(dir, 'manifest.yaml'), stringify(manifest));
    }

    it('reads manifests from disk and finds overlaps', () => {
      const skillsDir = path.join(tmpDir, '.claude', 'skills');

      createManifestDir(skillsDir, 'telegram', {
        skill: 'telegram',
        version: '1.0.0',
        core_version: '1.0.0',
        adds: ['src/telegram.ts'],
        modifies: ['src/config.ts', 'src/index.ts'],
        conflicts: [],
        depends: [],
      });

      createManifestDir(skillsDir, 'discord', {
        skill: 'discord',
        version: '1.0.0',
        core_version: '1.0.0',
        adds: ['src/discord.ts'],
        modifies: ['src/config.ts', 'src/index.ts'],
        conflicts: [],
        depends: [],
      });

      const matrix = generateMatrix(skillsDir);

      expect(matrix).toHaveLength(1);
      expect(matrix[0].skills).toContain('telegram');
      expect(matrix[0].skills).toContain('discord');
    });

    it('returns empty matrix when skills dir does not exist', () => {
      const matrix = generateMatrix(path.join(tmpDir, 'nonexistent'));
      expect(matrix).toHaveLength(0);
    });

    it('returns empty matrix for non-overlapping skills on disk', () => {
      const skillsDir = path.join(tmpDir, '.claude', 'skills');

      createManifestDir(skillsDir, 'alpha', {
        skill: 'alpha',
        version: '1.0.0',
        core_version: '1.0.0',
        adds: ['src/alpha.ts'],
        modifies: ['src/alpha-config.ts'],
        conflicts: [],
        depends: [],
      });

      createManifestDir(skillsDir, 'beta', {
        skill: 'beta',
        version: '1.0.0',
        core_version: '1.0.0',
        adds: ['src/beta.ts'],
        modifies: ['src/beta-config.ts'],
        conflicts: [],
        depends: [],
      });

      const matrix = generateMatrix(skillsDir);
      expect(matrix).toHaveLength(0);
    });

    it('detects structured npm overlap from disk manifests', () => {
      const skillsDir = path.join(tmpDir, '.claude', 'skills');

      createManifestDir(skillsDir, 'skill-x', {
        skill: 'skill-x',
        version: '1.0.0',
        core_version: '1.0.0',
        adds: [],
        modifies: ['src/x.ts'],
        conflicts: [],
        depends: [],
        structured: {
          npm_dependencies: { lodash: '^4.0.0' },
        },
      });

      createManifestDir(skillsDir, 'skill-y', {
        skill: 'skill-y',
        version: '1.0.0',
        core_version: '1.0.0',
        adds: [],
        modifies: ['src/y.ts'],
        conflicts: [],
        depends: [],
        structured: {
          npm_dependencies: { lodash: '^4.1.0' },
        },
      });

      const matrix = generateMatrix(skillsDir);

      expect(matrix).toHaveLength(1);
      expect(matrix[0].reason).toContain('lodash');
    });
  });
});
skills-engine/__tests__/constants.test.ts (new file, 43 lines)
@@ -0,0 +1,43 @@
import { describe, it, expect } from 'vitest';
import {
  NANOCLAW_DIR,
  STATE_FILE,
  BASE_DIR,
  BACKUP_DIR,
  LOCK_FILE,
  CUSTOM_DIR,
  RESOLUTIONS_DIR,
  SKILLS_SCHEMA_VERSION,
} from '../constants.js';

describe('constants', () => {
  const allConstants = {
    NANOCLAW_DIR,
    STATE_FILE,
    BASE_DIR,
    BACKUP_DIR,
    LOCK_FILE,
    CUSTOM_DIR,
    RESOLUTIONS_DIR,
    SKILLS_SCHEMA_VERSION,
  };

  it('all constants are non-empty strings', () => {
    for (const [name, value] of Object.entries(allConstants)) {
      expect(value, `${name} should be a non-empty string`).toBeTruthy();
      expect(typeof value, `${name} should be a string`).toBe('string');
    }
  });

  it('path constants use forward slashes and .nanoclaw prefix', () => {
    const pathConstants = [BASE_DIR, BACKUP_DIR, LOCK_FILE, CUSTOM_DIR, RESOLUTIONS_DIR];
    for (const p of pathConstants) {
      expect(p).not.toContain('\\');
      expect(p).toMatch(/^\.nanoclaw\//);
    }
  });

  it('NANOCLAW_DIR is .nanoclaw', () => {
    expect(NANOCLAW_DIR).toBe('.nanoclaw');
  });
});
skills-engine/__tests__/customize.test.ts (new file, 136 lines)
@@ -0,0 +1,136 @@
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import fs from 'fs';
import path from 'path';
import {
  isCustomizeActive,
  startCustomize,
  commitCustomize,
  abortCustomize,
} from '../customize.js';
import { CUSTOM_DIR } from '../constants.js';
import {
  createTempDir,
  setupNanoclawDir,
  createMinimalState,
  cleanup,
  writeState,
} from './test-helpers.js';
import { readState, recordSkillApplication, computeFileHash } from '../state.js';

describe('customize', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    setupNanoclawDir(tmpDir);
    createMinimalState(tmpDir);
    fs.mkdirSync(path.join(tmpDir, CUSTOM_DIR), { recursive: true });
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  it('startCustomize creates pending.yaml and isCustomizeActive returns true', () => {
    // Need at least one applied skill with file_hashes for snapshot
    const trackedFile = path.join(tmpDir, 'src', 'app.ts');
    fs.mkdirSync(path.dirname(trackedFile), { recursive: true });
    fs.writeFileSync(trackedFile, 'export const x = 1;');
    recordSkillApplication('test-skill', '1.0.0', {
      'src/app.ts': computeFileHash(trackedFile),
    });

    expect(isCustomizeActive()).toBe(false);
    startCustomize('test customization');
    expect(isCustomizeActive()).toBe(true);

    const pendingPath = path.join(tmpDir, CUSTOM_DIR, 'pending.yaml');
    expect(fs.existsSync(pendingPath)).toBe(true);
  });

  it('abortCustomize removes pending.yaml', () => {
    const trackedFile = path.join(tmpDir, 'src', 'app.ts');
    fs.mkdirSync(path.dirname(trackedFile), { recursive: true });
    fs.writeFileSync(trackedFile, 'export const x = 1;');
    recordSkillApplication('test-skill', '1.0.0', {
      'src/app.ts': computeFileHash(trackedFile),
    });

    startCustomize('test');
    expect(isCustomizeActive()).toBe(true);

    abortCustomize();
    expect(isCustomizeActive()).toBe(false);
  });

  it('commitCustomize with no changes clears pending', () => {
    const trackedFile = path.join(tmpDir, 'src', 'app.ts');
    fs.mkdirSync(path.dirname(trackedFile), { recursive: true });
    fs.writeFileSync(trackedFile, 'export const x = 1;');
    recordSkillApplication('test-skill', '1.0.0', {
      'src/app.ts': computeFileHash(trackedFile),
    });

    startCustomize('no-op');
    commitCustomize();

    expect(isCustomizeActive()).toBe(false);
  });

  it('commitCustomize with changes creates patch and records in state', () => {
    const trackedFile = path.join(tmpDir, 'src', 'app.ts');
    fs.mkdirSync(path.dirname(trackedFile), { recursive: true });
    fs.writeFileSync(trackedFile, 'export const x = 1;');
    recordSkillApplication('test-skill', '1.0.0', {
      'src/app.ts': computeFileHash(trackedFile),
    });

    startCustomize('add feature');

    // Modify the tracked file
    fs.writeFileSync(trackedFile, 'export const x = 2;\nexport const y = 3;');

    commitCustomize();

    expect(isCustomizeActive()).toBe(false);
    const state = readState();
    expect(state.custom_modifications).toBeDefined();
    expect(state.custom_modifications!.length).toBeGreaterThan(0);
    expect(state.custom_modifications![0].description).toBe('add feature');
  });

  it('commitCustomize throws descriptive error on diff failure', () => {
    const trackedFile = path.join(tmpDir, 'src', 'app.ts');
    fs.mkdirSync(path.dirname(trackedFile), { recursive: true });
    fs.writeFileSync(trackedFile, 'export const x = 1;');
    recordSkillApplication('test-skill', '1.0.0', {
      'src/app.ts': computeFileHash(trackedFile),
    });

    startCustomize('diff-error test');

    // Modify the tracked file
    fs.writeFileSync(trackedFile, 'export const x = 2;');

    // Make the base file a directory to cause diff to exit with code 2
    const baseFilePath = path.join(tmpDir, '.nanoclaw', 'base', 'src', 'app.ts');
    fs.mkdirSync(baseFilePath, { recursive: true });

    expect(() => commitCustomize()).toThrow(/diff error/i);
  });

  it('startCustomize while active throws', () => {
    const trackedFile = path.join(tmpDir, 'src', 'app.ts');
    fs.mkdirSync(path.dirname(trackedFile), { recursive: true });
    fs.writeFileSync(trackedFile, 'export const x = 1;');
    recordSkillApplication('test-skill', '1.0.0', {
      'src/app.ts': computeFileHash(trackedFile),
    });

    startCustomize('first');
    expect(() => startCustomize('second')).toThrow();
  });
});
skills-engine/__tests__/file-ops.test.ts (new file, 93 lines)
@@ -0,0 +1,93 @@
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import fs from 'fs';
import path from 'path';
import { executeFileOps } from '../file-ops.js';
import { createTempDir, cleanup } from './test-helpers.js';

describe('file-ops', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  it('rename success', () => {
    fs.writeFileSync(path.join(tmpDir, 'old.ts'), 'content');
    const result = executeFileOps([
      { type: 'rename', from: 'old.ts', to: 'new.ts' },
    ], tmpDir);
    expect(result.success).toBe(true);
    expect(fs.existsSync(path.join(tmpDir, 'new.ts'))).toBe(true);
    expect(fs.existsSync(path.join(tmpDir, 'old.ts'))).toBe(false);
  });

  it('move success', () => {
    fs.writeFileSync(path.join(tmpDir, 'file.ts'), 'content');
    const result = executeFileOps([
      { type: 'move', from: 'file.ts', to: 'sub/file.ts' },
    ], tmpDir);
    expect(result.success).toBe(true);
    expect(fs.existsSync(path.join(tmpDir, 'sub', 'file.ts'))).toBe(true);
    expect(fs.existsSync(path.join(tmpDir, 'file.ts'))).toBe(false);
  });

  it('delete success', () => {
    fs.writeFileSync(path.join(tmpDir, 'remove-me.ts'), 'content');
    const result = executeFileOps([
      { type: 'delete', path: 'remove-me.ts' },
    ], tmpDir);
    expect(result.success).toBe(true);
    expect(fs.existsSync(path.join(tmpDir, 'remove-me.ts'))).toBe(false);
  });

  it('rename target exists produces error', () => {
    fs.writeFileSync(path.join(tmpDir, 'a.ts'), 'a');
    fs.writeFileSync(path.join(tmpDir, 'b.ts'), 'b');
    const result = executeFileOps([
      { type: 'rename', from: 'a.ts', to: 'b.ts' },
    ], tmpDir);
    expect(result.success).toBe(false);
    expect(result.errors.length).toBeGreaterThan(0);
  });

  it('delete missing file produces warning not error', () => {
    const result = executeFileOps([
      { type: 'delete', path: 'nonexistent.ts' },
    ], tmpDir);
    expect(result.success).toBe(true);
    expect(result.warnings.length).toBeGreaterThan(0);
  });

  it('move creates destination directory', () => {
    fs.writeFileSync(path.join(tmpDir, 'src.ts'), 'content');
    const result = executeFileOps([
      { type: 'move', from: 'src.ts', to: 'deep/nested/dir/src.ts' },
    ], tmpDir);
    expect(result.success).toBe(true);
    expect(fs.existsSync(path.join(tmpDir, 'deep', 'nested', 'dir', 'src.ts'))).toBe(true);
  });

  it('path escape produces error', () => {
    fs.writeFileSync(path.join(tmpDir, 'file.ts'), 'content');
    const result = executeFileOps([
      { type: 'rename', from: 'file.ts', to: '../../escaped.ts' },
    ], tmpDir);
    expect(result.success).toBe(false);
    expect(result.errors.length).toBeGreaterThan(0);
  });

  it('source missing produces error for rename', () => {
    const result = executeFileOps([
      { type: 'rename', from: 'missing.ts', to: 'new.ts' },
    ], tmpDir);
    expect(result.success).toBe(false);
    expect(result.errors.length).toBeGreaterThan(0);
  });
});
skills-engine/__tests__/lock.test.ts (new file, 60 lines)
@@ -0,0 +1,60 @@
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import fs from 'fs';
import path from 'path';
import { acquireLock, releaseLock, isLocked } from '../lock.js';
import { LOCK_FILE } from '../constants.js';
import { createTempDir, cleanup } from './test-helpers.js';

describe('lock', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    fs.mkdirSync(path.join(tmpDir, '.nanoclaw'), { recursive: true });
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  it('acquireLock returns a release function', () => {
    const release = acquireLock();
    expect(typeof release).toBe('function');
    expect(fs.existsSync(path.join(tmpDir, LOCK_FILE))).toBe(true);
    release();
  });

  it('releaseLock removes the lock file', () => {
    acquireLock();
    expect(fs.existsSync(path.join(tmpDir, LOCK_FILE))).toBe(true);
    releaseLock();
    expect(fs.existsSync(path.join(tmpDir, LOCK_FILE))).toBe(false);
  });

  it('acquire after release succeeds', () => {
    const release1 = acquireLock();
    release1();
    const release2 = acquireLock();
    expect(typeof release2).toBe('function');
    release2();
  });

  it('isLocked returns true when locked', () => {
    const release = acquireLock();
    expect(isLocked()).toBe(true);
    release();
  });

  it('isLocked returns false when released', () => {
    const release = acquireLock();
    release();
    expect(isLocked()).toBe(false);
  });

  it('isLocked returns false when no lock exists', () => {
    expect(isLocked()).toBe(false);
  });
});
skills-engine/__tests__/manifest.test.ts (new file, 298 lines)
@@ -0,0 +1,298 @@
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import fs from 'fs';
import path from 'path';
import { stringify } from 'yaml';
import {
  readManifest,
  checkCoreVersion,
  checkDependencies,
  checkConflicts,
  checkSystemVersion,
} from '../manifest.js';
import {
  createTempDir,
  setupNanoclawDir,
  createMinimalState,
  createSkillPackage,
  cleanup,
  writeState,
} from './test-helpers.js';
import { recordSkillApplication } from '../state.js';

describe('manifest', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    setupNanoclawDir(tmpDir);
    createMinimalState(tmpDir);
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  it('parses a valid manifest', () => {
    const skillDir = createSkillPackage(tmpDir, {
      skill: 'telegram',
      version: '2.0.0',
      core_version: '1.0.0',
      adds: ['src/telegram.ts'],
      modifies: ['src/config.ts'],
    });
    const manifest = readManifest(skillDir);
    expect(manifest.skill).toBe('telegram');
    expect(manifest.version).toBe('2.0.0');
    expect(manifest.adds).toEqual(['src/telegram.ts']);
    expect(manifest.modifies).toEqual(['src/config.ts']);
  });

  it('throws on missing skill field', () => {
    const dir = path.join(tmpDir, 'bad-pkg');
    fs.mkdirSync(dir, { recursive: true });
    fs.writeFileSync(path.join(dir, 'manifest.yaml'), stringify({
      version: '1.0.0', core_version: '1.0.0', adds: [], modifies: [],
    }));
    expect(() => readManifest(dir)).toThrow();
  });

  it('throws on missing version field', () => {
    const dir = path.join(tmpDir, 'bad-pkg');
    fs.mkdirSync(dir, { recursive: true });
    fs.writeFileSync(path.join(dir, 'manifest.yaml'), stringify({
      skill: 'test', core_version: '1.0.0', adds: [], modifies: [],
    }));
    expect(() => readManifest(dir)).toThrow();
  });

  it('throws on missing core_version field', () => {
    const dir = path.join(tmpDir, 'bad-pkg');
    fs.mkdirSync(dir, { recursive: true });
    fs.writeFileSync(path.join(dir, 'manifest.yaml'), stringify({
      skill: 'test', version: '1.0.0', adds: [], modifies: [],
    }));
    expect(() => readManifest(dir)).toThrow();
  });

  it('throws on missing adds field', () => {
    const dir = path.join(tmpDir, 'bad-pkg');
    fs.mkdirSync(dir, { recursive: true });
    fs.writeFileSync(path.join(dir, 'manifest.yaml'), stringify({
      skill: 'test', version: '1.0.0', core_version: '1.0.0', modifies: [],
    }));
    expect(() => readManifest(dir)).toThrow();
  });

  it('throws on missing modifies field', () => {
    const dir = path.join(tmpDir, 'bad-pkg');
    fs.mkdirSync(dir, { recursive: true });
    fs.writeFileSync(path.join(dir, 'manifest.yaml'), stringify({
      skill: 'test', version: '1.0.0', core_version: '1.0.0', adds: [],
    }));
    expect(() => readManifest(dir)).toThrow();
  });

  it('throws on path traversal in adds', () => {
    const dir = path.join(tmpDir, 'bad-pkg');
    fs.mkdirSync(dir, { recursive: true });
    fs.writeFileSync(path.join(dir, 'manifest.yaml'), stringify({
      skill: 'test', version: '1.0.0', core_version: '1.0.0',
      adds: ['../etc/passwd'], modifies: [],
    }));
    expect(() => readManifest(dir)).toThrow('Invalid path');
  });

  it('throws on path traversal in modifies', () => {
    const dir = path.join(tmpDir, 'bad-pkg');
    fs.mkdirSync(dir, { recursive: true });
    fs.writeFileSync(path.join(dir, 'manifest.yaml'), stringify({
      skill: 'test', version: '1.0.0', core_version: '1.0.0',
      adds: [], modifies: ['../../secret.ts'],
    }));
    expect(() => readManifest(dir)).toThrow('Invalid path');
  });

  it('throws on absolute path in adds', () => {
    const dir = path.join(tmpDir, 'bad-pkg');
    fs.mkdirSync(dir, { recursive: true });
    fs.writeFileSync(path.join(dir, 'manifest.yaml'), stringify({
      skill: 'test', version: '1.0.0', core_version: '1.0.0',
      adds: ['/etc/passwd'], modifies: [],
    }));
    expect(() => readManifest(dir)).toThrow('Invalid path');
  });

  it('defaults conflicts and depends to empty arrays', () => {
    const skillDir = createSkillPackage(tmpDir, {
      skill: 'test',
      version: '1.0.0',
      core_version: '1.0.0',
      adds: [],
      modifies: [],
    });
    const manifest = readManifest(skillDir);
    expect(manifest.conflicts).toEqual([]);
    expect(manifest.depends).toEqual([]);
  });

  it('checkCoreVersion returns warning when manifest targets newer core', () => {
    const skillDir = createSkillPackage(tmpDir, {
      skill: 'test',
      version: '1.0.0',
      core_version: '2.0.0',
      adds: [],
      modifies: [],
    });
    const manifest = readManifest(skillDir);
    const result = checkCoreVersion(manifest);
    expect(result.warning).toBeTruthy();
  });

  it('checkCoreVersion returns no warning when versions match', () => {
    const skillDir = createSkillPackage(tmpDir, {
      skill: 'test',
      version: '1.0.0',
      core_version: '1.0.0',
      adds: [],
      modifies: [],
    });
    const manifest = readManifest(skillDir);
    const result = checkCoreVersion(manifest);
    expect(result.ok).toBe(true);
    expect(result.warning).toBeFalsy();
  });

  it('checkDependencies satisfied when deps present', () => {
    recordSkillApplication('dep-skill', '1.0.0', {});
    const skillDir = createSkillPackage(tmpDir, {
      skill: 'test',
      version: '1.0.0',
      core_version: '1.0.0',
      adds: [],
      modifies: [],
      depends: ['dep-skill'],
    });
    const manifest = readManifest(skillDir);
    const result = checkDependencies(manifest);
    expect(result.ok).toBe(true);
    expect(result.missing).toEqual([]);
  });

  it('checkDependencies missing when deps not present', () => {
    const skillDir = createSkillPackage(tmpDir, {
      skill: 'test',
      version: '1.0.0',
      core_version: '1.0.0',
      adds: [],
      modifies: [],
      depends: ['missing-skill'],
    });
    const manifest = readManifest(skillDir);
    const result = checkDependencies(manifest);
    expect(result.ok).toBe(false);
    expect(result.missing).toContain('missing-skill');
});
|
||||
|
||||
it('checkConflicts ok when no conflicts', () => {
|
||||
const skillDir = createSkillPackage(tmpDir, {
|
||||
skill: 'test',
|
||||
version: '1.0.0',
|
||||
core_version: '1.0.0',
|
||||
adds: [],
|
||||
modifies: [],
|
||||
conflicts: [],
|
||||
});
|
||||
const manifest = readManifest(skillDir);
|
||||
const result = checkConflicts(manifest);
|
||||
expect(result.ok).toBe(true);
|
||||
expect(result.conflicting).toEqual([]);
|
||||
});
|
||||
|
||||
it('checkConflicts detects conflicting skill', () => {
|
||||
recordSkillApplication('bad-skill', '1.0.0', {});
|
||||
const skillDir = createSkillPackage(tmpDir, {
|
||||
skill: 'test',
|
||||
version: '1.0.0',
|
||||
core_version: '1.0.0',
|
||||
adds: [],
|
||||
modifies: [],
|
||||
conflicts: ['bad-skill'],
|
||||
});
|
||||
const manifest = readManifest(skillDir);
|
||||
const result = checkConflicts(manifest);
|
||||
expect(result.ok).toBe(false);
|
||||
expect(result.conflicting).toContain('bad-skill');
|
||||
});
|
||||
|
||||
it('parses new optional fields (author, license, etc)', () => {
|
||||
const dir = path.join(tmpDir, 'full-pkg');
|
||||
fs.mkdirSync(dir, { recursive: true });
|
||||
fs.writeFileSync(path.join(dir, 'manifest.yaml'), stringify({
|
||||
skill: 'test',
|
||||
version: '1.0.0',
|
||||
core_version: '1.0.0',
|
||||
adds: [],
|
||||
modifies: [],
|
||||
author: 'tester',
|
||||
license: 'MIT',
|
||||
min_skills_system_version: '0.1.0',
|
||||
tested_with: ['telegram', 'discord'],
|
||||
post_apply: ['echo done'],
|
||||
}));
|
||||
const manifest = readManifest(dir);
|
||||
expect(manifest.author).toBe('tester');
|
||||
expect(manifest.license).toBe('MIT');
|
||||
expect(manifest.min_skills_system_version).toBe('0.1.0');
|
||||
expect(manifest.tested_with).toEqual(['telegram', 'discord']);
|
||||
expect(manifest.post_apply).toEqual(['echo done']);
|
||||
});
|
||||
|
||||
it('checkSystemVersion passes when not set', () => {
|
||||
const skillDir = createSkillPackage(tmpDir, {
|
||||
skill: 'test',
|
||||
version: '1.0.0',
|
||||
core_version: '1.0.0',
|
||||
adds: [],
|
||||
modifies: [],
|
||||
});
|
||||
const manifest = readManifest(skillDir);
|
||||
const result = checkSystemVersion(manifest);
|
||||
expect(result.ok).toBe(true);
|
||||
});
|
||||
|
||||
it('checkSystemVersion passes when engine is new enough', () => {
|
||||
const dir = path.join(tmpDir, 'sys-ok');
|
||||
fs.mkdirSync(dir, { recursive: true });
|
||||
fs.writeFileSync(path.join(dir, 'manifest.yaml'), stringify({
|
||||
skill: 'test',
|
||||
version: '1.0.0',
|
||||
core_version: '1.0.0',
|
||||
adds: [],
|
||||
modifies: [],
|
||||
min_skills_system_version: '0.1.0',
|
||||
}));
|
||||
const manifest = readManifest(dir);
|
||||
const result = checkSystemVersion(manifest);
|
||||
expect(result.ok).toBe(true);
|
||||
});
|
||||
|
||||
it('checkSystemVersion fails when engine is too old', () => {
|
||||
const dir = path.join(tmpDir, 'sys-fail');
|
||||
fs.mkdirSync(dir, { recursive: true });
|
||||
fs.writeFileSync(path.join(dir, 'manifest.yaml'), stringify({
|
||||
skill: 'test',
|
||||
version: '1.0.0',
|
||||
core_version: '1.0.0',
|
||||
adds: [],
|
||||
modifies: [],
|
||||
min_skills_system_version: '99.0.0',
|
||||
}));
|
||||
const manifest = readManifest(dir);
|
||||
const result = checkSystemVersion(manifest);
|
||||
expect(result.ok).toBe(false);
|
||||
expect(result.error).toContain('99.0.0');
|
||||
});
|
||||
});
|
||||
skills-engine/__tests__/merge.test.ts  (new file, 97 lines)
@@ -0,0 +1,97 @@
import { execSync } from 'child_process';
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import fs from 'fs';
import path from 'path';

import { isGitRepo, mergeFile, setupRerereAdapter } from '../merge.js';
import { createTempDir, initGitRepo, cleanup } from './test-helpers.js';

describe('merge', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  it('isGitRepo returns true in a git repo', () => {
    initGitRepo(tmpDir);
    expect(isGitRepo()).toBe(true);
  });

  it('isGitRepo returns false outside a git repo', () => {
    expect(isGitRepo()).toBe(false);
  });

  describe('mergeFile', () => {
    beforeEach(() => {
      initGitRepo(tmpDir);
    });

    it('clean merge with no overlapping changes', () => {
      const base = path.join(tmpDir, 'base.txt');
      const current = path.join(tmpDir, 'current.txt');
      const skill = path.join(tmpDir, 'skill.txt');

      fs.writeFileSync(base, 'line1\nline2\nline3\n');
      fs.writeFileSync(current, 'line1-modified\nline2\nline3\n');
      fs.writeFileSync(skill, 'line1\nline2\nline3-modified\n');

      const result = mergeFile(current, base, skill);
      expect(result.clean).toBe(true);
      expect(result.exitCode).toBe(0);

      const merged = fs.readFileSync(current, 'utf-8');
      expect(merged).toContain('line1-modified');
      expect(merged).toContain('line3-modified');
    });

    it('setupRerereAdapter cleans stale MERGE_HEAD before proceeding', () => {
      // Simulate a stale MERGE_HEAD from a previous crash
      const gitDir = execSync('git rev-parse --git-dir', {
        cwd: tmpDir,
        encoding: 'utf-8',
      }).trim();
      const headHash = execSync('git rev-parse HEAD', {
        cwd: tmpDir,
        encoding: 'utf-8',
      }).trim();
      fs.writeFileSync(path.join(gitDir, 'MERGE_HEAD'), headHash + '\n');
      fs.writeFileSync(path.join(gitDir, 'MERGE_MSG'), 'stale merge\n');

      // Write a file for the adapter to work with
      fs.writeFileSync(path.join(tmpDir, 'test.txt'), 'conflicted content');

      // setupRerereAdapter should not throw despite stale MERGE_HEAD
      expect(() =>
        setupRerereAdapter('test.txt', 'base', 'ours', 'theirs'),
      ).not.toThrow();

      // MERGE_HEAD should still exist (newly written by setupRerereAdapter)
      expect(fs.existsSync(path.join(gitDir, 'MERGE_HEAD'))).toBe(true);
    });

    it('conflict with overlapping changes', () => {
      const base = path.join(tmpDir, 'base.txt');
      const current = path.join(tmpDir, 'current.txt');
      const skill = path.join(tmpDir, 'skill.txt');

      fs.writeFileSync(base, 'line1\nline2\nline3\n');
      fs.writeFileSync(current, 'line1-ours\nline2\nline3\n');
      fs.writeFileSync(skill, 'line1-theirs\nline2\nline3\n');

      const result = mergeFile(current, base, skill);
      expect(result.clean).toBe(false);
      expect(result.exitCode).toBeGreaterThan(0);

      const merged = fs.readFileSync(current, 'utf-8');
      expect(merged).toContain('<<<<<<<');
      expect(merged).toContain('>>>>>>>');
    });
  });
});
skills-engine/__tests__/path-remap.test.ts  (new file, 77 lines)
@@ -0,0 +1,77 @@
import { afterEach, beforeEach, describe, expect, it } from 'vitest';

import { loadPathRemap, recordPathRemap, resolvePathRemap } from '../path-remap.js';
import {
  cleanup,
  createMinimalState,
  createTempDir,
  setupNanoclawDir,
} from './test-helpers.js';

describe('path-remap', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    setupNanoclawDir(tmpDir);
    createMinimalState(tmpDir);
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  describe('resolvePathRemap', () => {
    it('returns remapped path when entry exists', () => {
      const remap = { 'src/old.ts': 'src/new.ts' };
      expect(resolvePathRemap('src/old.ts', remap)).toBe('src/new.ts');
    });

    it('returns original path when no remap entry', () => {
      const remap = { 'src/old.ts': 'src/new.ts' };
      expect(resolvePathRemap('src/other.ts', remap)).toBe('src/other.ts');
    });

    it('returns original path when remap is empty', () => {
      expect(resolvePathRemap('src/file.ts', {})).toBe('src/file.ts');
    });
  });

  describe('loadPathRemap', () => {
    it('returns empty object when no remap in state', () => {
      const remap = loadPathRemap();
      expect(remap).toEqual({});
    });

    it('returns remap from state', () => {
      recordPathRemap({ 'src/a.ts': 'src/b.ts' });
      const remap = loadPathRemap();
      expect(remap).toEqual({ 'src/a.ts': 'src/b.ts' });
    });
  });

  describe('recordPathRemap', () => {
    it('records new remap entries', () => {
      recordPathRemap({ 'src/old.ts': 'src/new.ts' });
      expect(loadPathRemap()).toEqual({ 'src/old.ts': 'src/new.ts' });
    });

    it('merges with existing remap', () => {
      recordPathRemap({ 'src/a.ts': 'src/b.ts' });
      recordPathRemap({ 'src/c.ts': 'src/d.ts' });
      expect(loadPathRemap()).toEqual({
        'src/a.ts': 'src/b.ts',
        'src/c.ts': 'src/d.ts',
      });
    });

    it('overwrites existing key on conflict', () => {
      recordPathRemap({ 'src/a.ts': 'src/b.ts' });
      recordPathRemap({ 'src/a.ts': 'src/c.ts' });
      expect(loadPathRemap()).toEqual({ 'src/a.ts': 'src/c.ts' });
    });
  });
});
skills-engine/__tests__/rebase.test.ts  (new file, 434 lines)
@@ -0,0 +1,434 @@
import fs from 'fs';
import path from 'path';
import { afterEach, beforeEach, describe, expect, it } from 'vitest';
import { parse } from 'yaml';

import { rebase } from '../rebase.js';
import {
  cleanup,
  createMinimalState,
  createTempDir,
  initGitRepo,
  setupNanoclawDir,
  writeState,
} from './test-helpers.js';

describe('rebase', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    setupNanoclawDir(tmpDir);
    createMinimalState(tmpDir);
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  it('rebase with one skill: patch created, state updated, rebased_at set', async () => {
    // Set up base file
    const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src');
    fs.mkdirSync(baseDir, { recursive: true });
    fs.writeFileSync(path.join(baseDir, 'index.ts'), 'const x = 1;\n');

    // Set up working tree with skill modification
    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(tmpDir, 'src', 'index.ts'),
      'const x = 1;\nconst y = 2; // added by skill\n',
    );

    // Write state with applied skill
    writeState(tmpDir, {
      skills_system_version: '0.1.0',
      core_version: '1.0.0',
      applied_skills: [
        {
          name: 'test-skill',
          version: '1.0.0',
          applied_at: new Date().toISOString(),
          file_hashes: {
            'src/index.ts': 'abc123',
          },
        },
      ],
    });

    initGitRepo(tmpDir);

    const result = await rebase();

    expect(result.success).toBe(true);
    expect(result.filesInPatch).toBeGreaterThan(0);
    expect(result.rebased_at).toBeDefined();
    expect(result.patchFile).toBeDefined();

    // Verify patch file exists
    const patchPath = path.join(tmpDir, '.nanoclaw', 'combined.patch');
    expect(fs.existsSync(patchPath)).toBe(true);

    const patchContent = fs.readFileSync(patchPath, 'utf-8');
    expect(patchContent).toContain('added by skill');

    // Verify state was updated
    const stateContent = fs.readFileSync(
      path.join(tmpDir, '.nanoclaw', 'state.yaml'),
      'utf-8',
    );
    const state = parse(stateContent);
    expect(state.rebased_at).toBeDefined();
    expect(state.applied_skills).toHaveLength(1);
    expect(state.applied_skills[0].name).toBe('test-skill');

    // File hashes should be updated to actual current values
    const currentHash = state.applied_skills[0].file_hashes['src/index.ts'];
    expect(currentHash).toBeDefined();
    expect(currentHash).not.toBe('abc123'); // Should be recomputed

    // Working tree file should still have the skill's changes
    const workingContent = fs.readFileSync(
      path.join(tmpDir, 'src', 'index.ts'),
      'utf-8',
    );
    expect(workingContent).toContain('added by skill');
  });

  it('rebase flattens: base updated to match working tree', async () => {
    // Set up base file (clean core)
    const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src');
    fs.mkdirSync(baseDir, { recursive: true });
    fs.writeFileSync(path.join(baseDir, 'index.ts'), 'const x = 1;\n');

    // Working tree has skill modification
    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(tmpDir, 'src', 'index.ts'),
      'const x = 1;\nconst y = 2; // skill\n',
    );

    writeState(tmpDir, {
      skills_system_version: '0.1.0',
      core_version: '1.0.0',
      applied_skills: [
        {
          name: 'my-skill',
          version: '1.0.0',
          applied_at: new Date().toISOString(),
          file_hashes: {
            'src/index.ts': 'oldhash',
          },
        },
      ],
    });

    initGitRepo(tmpDir);

    const result = await rebase();
    expect(result.success).toBe(true);

    // Base should now include the skill's changes (flattened)
    const baseContent = fs.readFileSync(
      path.join(tmpDir, '.nanoclaw', 'base', 'src', 'index.ts'),
      'utf-8',
    );
    expect(baseContent).toContain('skill');
    expect(baseContent).toBe('const x = 1;\nconst y = 2; // skill\n');
  });

  it('rebase with multiple skills + custom mods: all collapsed into single patch', async () => {
    // Set up base files
    const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
    fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
    fs.writeFileSync(path.join(baseDir, 'src', 'index.ts'), 'const x = 1;\n');
    fs.writeFileSync(
      path.join(baseDir, 'src', 'config.ts'),
      'export const port = 3000;\n',
    );

    // Set up working tree with modifications from multiple skills
    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(tmpDir, 'src', 'index.ts'),
      'const x = 1;\nconst y = 2; // skill-a\n',
    );
    fs.writeFileSync(
      path.join(tmpDir, 'src', 'config.ts'),
      'export const port = 3000;\nexport const host = "0.0.0.0"; // skill-b\n',
    );
    // File added by skill
    fs.writeFileSync(
      path.join(tmpDir, 'src', 'plugin.ts'),
      'export const plugin = true;\n',
    );

    // Write state with multiple skills and custom modifications
    writeState(tmpDir, {
      skills_system_version: '0.1.0',
      core_version: '1.0.0',
      applied_skills: [
        {
          name: 'skill-a',
          version: '1.0.0',
          applied_at: new Date().toISOString(),
          file_hashes: {
            'src/index.ts': 'hash-a1',
          },
        },
        {
          name: 'skill-b',
          version: '2.0.0',
          applied_at: new Date().toISOString(),
          file_hashes: {
            'src/config.ts': 'hash-b1',
            'src/plugin.ts': 'hash-b2',
          },
        },
      ],
      custom_modifications: [
        {
          description: 'tweaked config',
          applied_at: new Date().toISOString(),
          files_modified: ['src/config.ts'],
          patch_file: '.nanoclaw/custom/001-tweaked-config.patch',
        },
      ],
    });

    initGitRepo(tmpDir);

    const result = await rebase();

    expect(result.success).toBe(true);
    expect(result.filesInPatch).toBeGreaterThanOrEqual(2);

    // Verify combined patch includes changes from both skills
    const patchContent = fs.readFileSync(
      path.join(tmpDir, '.nanoclaw', 'combined.patch'),
      'utf-8',
    );
    expect(patchContent).toContain('skill-a');
    expect(patchContent).toContain('skill-b');

    // Verify state: custom_modifications should be cleared
    const stateContent = fs.readFileSync(
      path.join(tmpDir, '.nanoclaw', 'state.yaml'),
      'utf-8',
    );
    const state = parse(stateContent);
    expect(state.custom_modifications).toBeUndefined();
    expect(state.rebased_at).toBeDefined();

    // applied_skills should still be present (informational)
    expect(state.applied_skills).toHaveLength(2);

    // Base should be flattened — include all skill changes
    const baseIndex = fs.readFileSync(
      path.join(tmpDir, '.nanoclaw', 'base', 'src', 'index.ts'),
      'utf-8',
    );
    expect(baseIndex).toContain('skill-a');

    const baseConfig = fs.readFileSync(
      path.join(tmpDir, '.nanoclaw', 'base', 'src', 'config.ts'),
      'utf-8',
    );
    expect(baseConfig).toContain('skill-b');
  });

  it('rebase clears resolution cache', async () => {
    // Set up base + working tree
    const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src');
    fs.mkdirSync(baseDir, { recursive: true });
    fs.writeFileSync(path.join(baseDir, 'index.ts'), 'const x = 1;\n');

    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(tmpDir, 'src', 'index.ts'),
      'const x = 1;\n// skill\n',
    );

    // Create a fake resolution cache entry
    const resDir = path.join(tmpDir, '.nanoclaw', 'resolutions', 'skill-a+skill-b');
    fs.mkdirSync(resDir, { recursive: true });
    fs.writeFileSync(path.join(resDir, 'meta.yaml'), 'skills: [skill-a, skill-b]\n');

    writeState(tmpDir, {
      skills_system_version: '0.1.0',
      core_version: '1.0.0',
      applied_skills: [
        {
          name: 'my-skill',
          version: '1.0.0',
          applied_at: new Date().toISOString(),
          file_hashes: { 'src/index.ts': 'hash' },
        },
      ],
    });

    initGitRepo(tmpDir);

    const result = await rebase();
    expect(result.success).toBe(true);

    // Resolution cache should be cleared
    const resolutions = fs.readdirSync(
      path.join(tmpDir, '.nanoclaw', 'resolutions'),
    );
    expect(resolutions).toHaveLength(0);
  });

  it('rebase with new base: base updated, changes merged', async () => {
    // Set up current base (multi-line so changes don't conflict)
    const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
    fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(baseDir, 'src', 'index.ts'),
      'line1\nline2\nline3\nline4\nline5\nline6\nline7\nline8\n',
    );

    // Working tree: skill adds at bottom
    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(tmpDir, 'src', 'index.ts'),
      'line1\nline2\nline3\nline4\nline5\nline6\nline7\nline8\nskill change\n',
    );

    writeState(tmpDir, {
      skills_system_version: '0.1.0',
      core_version: '1.0.0',
      applied_skills: [
        {
          name: 'my-skill',
          version: '1.0.0',
          applied_at: new Date().toISOString(),
          file_hashes: {
            'src/index.ts': 'oldhash',
          },
        },
      ],
    });

    initGitRepo(tmpDir);

    // New base: core update at top
    const newBase = path.join(tmpDir, 'new-core');
    fs.mkdirSync(path.join(newBase, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(newBase, 'src', 'index.ts'),
      'core v2 header\nline1\nline2\nline3\nline4\nline5\nline6\nline7\nline8\n',
    );

    const result = await rebase(newBase);

    expect(result.success).toBe(true);
    expect(result.patchFile).toBeDefined();

    // Verify base was updated to new core
    const baseContent = fs.readFileSync(
      path.join(tmpDir, '.nanoclaw', 'base', 'src', 'index.ts'),
      'utf-8',
    );
    expect(baseContent).toContain('core v2 header');

    // Working tree should have both core v2 and skill changes merged
    const workingContent = fs.readFileSync(
      path.join(tmpDir, 'src', 'index.ts'),
      'utf-8',
    );
    expect(workingContent).toContain('core v2 header');
    expect(workingContent).toContain('skill change');

    // State should reflect rebase
    const stateContent = fs.readFileSync(
      path.join(tmpDir, '.nanoclaw', 'state.yaml'),
      'utf-8',
    );
    const state = parse(stateContent);
    expect(state.rebased_at).toBeDefined();
  });

  it('rebase with new base: conflict returns backupPending', async () => {
    // Set up current base — short file so changes overlap
    const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
    fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(baseDir, 'src', 'index.ts'),
      'const x = 1;\n',
    );

    // Working tree: skill replaces the same line
    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(tmpDir, 'src', 'index.ts'),
      'const x = 42; // skill override\n',
    );

    writeState(tmpDir, {
      skills_system_version: '0.1.0',
      core_version: '1.0.0',
      applied_skills: [
        {
          name: 'my-skill',
          version: '1.0.0',
          applied_at: new Date().toISOString(),
          file_hashes: {
            'src/index.ts': 'oldhash',
          },
        },
      ],
    });

    initGitRepo(tmpDir);

    // New base: also changes the same line — guaranteed conflict
    const newBase = path.join(tmpDir, 'new-core');
    fs.mkdirSync(path.join(newBase, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(newBase, 'src', 'index.ts'),
      'const x = 999; // core v2\n',
    );

    const result = await rebase(newBase);

    expect(result.success).toBe(false);
    expect(result.mergeConflicts).toContain('src/index.ts');
    expect(result.backupPending).toBe(true);
    expect(result.error).toContain('Merge conflicts');

    // combined.patch should still exist
    expect(result.patchFile).toBeDefined();
    const patchPath = path.join(tmpDir, '.nanoclaw', 'combined.patch');
    expect(fs.existsSync(patchPath)).toBe(true);

    // Working tree should have conflict markers (not rolled back)
    const workingContent = fs.readFileSync(
      path.join(tmpDir, 'src', 'index.ts'),
      'utf-8',
    );
    expect(workingContent).toContain('<<<<<<<');
    expect(workingContent).toContain('>>>>>>>');

    // State should NOT be updated yet (conflicts pending)
    const stateContent = fs.readFileSync(
      path.join(tmpDir, '.nanoclaw', 'state.yaml'),
      'utf-8',
    );
    const state = parse(stateContent);
    expect(state.rebased_at).toBeUndefined();
  });

  it('error when no skills applied', async () => {
    // State has no applied skills (created by createMinimalState)
    initGitRepo(tmpDir);

    const result = await rebase();

    expect(result.success).toBe(false);
    expect(result.error).toContain('No skills applied');
    expect(result.filesInPatch).toBe(0);
  });
});
skills-engine/__tests__/replay.test.ts  (new file, 297 lines)
@@ -0,0 +1,297 @@
import fs from 'fs';
import path from 'path';
import { afterEach, beforeEach, describe, expect, it } from 'vitest';

import { findSkillDir, replaySkills } from '../replay.js';
import {
  cleanup,
  createMinimalState,
  createSkillPackage,
  createTempDir,
  initGitRepo,
  setupNanoclawDir,
} from './test-helpers.js';

describe('replay', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    setupNanoclawDir(tmpDir);
    createMinimalState(tmpDir);
    initGitRepo(tmpDir);
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  describe('findSkillDir', () => {
    it('finds skill directory by name', () => {
      const skillsRoot = path.join(tmpDir, '.claude', 'skills', 'telegram');
      fs.mkdirSync(skillsRoot, { recursive: true });
      const { stringify } = require('yaml');
      fs.writeFileSync(
        path.join(skillsRoot, 'manifest.yaml'),
        stringify({
          skill: 'telegram',
          version: '1.0.0',
          core_version: '1.0.0',
          adds: [],
          modifies: [],
        }),
      );

      const result = findSkillDir('telegram', tmpDir);
      expect(result).toBe(skillsRoot);
    });

    it('returns null for missing skill', () => {
      const result = findSkillDir('nonexistent', tmpDir);
      expect(result).toBeNull();
    });

    it('returns null when .claude/skills does not exist', () => {
      const result = findSkillDir('anything', tmpDir);
      expect(result).toBeNull();
    });
  });

  describe('replaySkills', () => {
    it('replays a single skill from base', async () => {
      // Set up base file
      const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src');
      fs.mkdirSync(baseDir, { recursive: true });
      fs.writeFileSync(path.join(baseDir, 'config.ts'), 'base content\n');

      // Set up current file (will be overwritten by replay)
      fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
      fs.writeFileSync(
        path.join(tmpDir, 'src', 'config.ts'),
        'modified content\n',
      );

      // Create skill package
      const skillDir = createSkillPackage(tmpDir, {
        skill: 'telegram',
        version: '1.0.0',
        core_version: '1.0.0',
        adds: ['src/telegram.ts'],
        modifies: ['src/config.ts'],
        addFiles: { 'src/telegram.ts': 'telegram code\n' },
        modifyFiles: { 'src/config.ts': 'base content\ntelegram config\n' },
      });

      const result = await replaySkills({
        skills: ['telegram'],
        skillDirs: { telegram: skillDir },
        projectRoot: tmpDir,
      });

      expect(result.success).toBe(true);
      expect(result.perSkill.telegram.success).toBe(true);

      // Added file should exist
      expect(fs.existsSync(path.join(tmpDir, 'src', 'telegram.ts'))).toBe(
        true,
      );
      expect(
        fs.readFileSync(path.join(tmpDir, 'src', 'telegram.ts'), 'utf-8'),
      ).toBe('telegram code\n');

      // Modified file should be merged from base
      const config = fs.readFileSync(
        path.join(tmpDir, 'src', 'config.ts'),
        'utf-8',
      );
      expect(config).toContain('telegram config');
    });

    it('replays two skills in order', async () => {
      // Set up base
      const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src');
      fs.mkdirSync(baseDir, { recursive: true });
      fs.writeFileSync(
        path.join(baseDir, 'config.ts'),
        'line1\nline2\nline3\nline4\nline5\n',
      );

      fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
      fs.writeFileSync(
        path.join(tmpDir, 'src', 'config.ts'),
        'line1\nline2\nline3\nline4\nline5\n',
      );

      // Skill 1 adds at top
      const skill1Dir = createSkillPackage(tmpDir, {
        skill: 'telegram',
        version: '1.0.0',
        core_version: '1.0.0',
        adds: ['src/telegram.ts'],
        modifies: ['src/config.ts'],
        addFiles: { 'src/telegram.ts': 'tg code' },
        modifyFiles: {
          'src/config.ts': 'telegram import\nline1\nline2\nline3\nline4\nline5\n',
        },
        dirName: 'skill-pkg-tg',
      });

      // Skill 2 adds at bottom
      const skill2Dir = createSkillPackage(tmpDir, {
        skill: 'discord',
        version: '1.0.0',
        core_version: '1.0.0',
        adds: ['src/discord.ts'],
        modifies: ['src/config.ts'],
        addFiles: { 'src/discord.ts': 'dc code' },
        modifyFiles: {
          'src/config.ts': 'line1\nline2\nline3\nline4\nline5\ndiscord import\n',
        },
        dirName: 'skill-pkg-dc',
      });

      const result = await replaySkills({
        skills: ['telegram', 'discord'],
        skillDirs: { telegram: skill1Dir, discord: skill2Dir },
        projectRoot: tmpDir,
      });

      expect(result.success).toBe(true);
      expect(result.perSkill.telegram.success).toBe(true);
      expect(result.perSkill.discord.success).toBe(true);

      // Both added files should exist
      expect(fs.existsSync(path.join(tmpDir, 'src', 'telegram.ts'))).toBe(
        true,
      );
      expect(fs.existsSync(path.join(tmpDir, 'src', 'discord.ts'))).toBe(
        true,
      );

      // Config should have both changes
      const config = fs.readFileSync(
        path.join(tmpDir, 'src', 'config.ts'),
        'utf-8',
      );
      expect(config).toContain('telegram import');
      expect(config).toContain('discord import');
    });

    it('stops on first conflict and does not process later skills', async () => {
      // After reset, current=base. Skill 1 merges cleanly (changes line 1).
      // Skill 2 also changes line 1 differently → conflict with skill 1's result.
      // Skill 3 should NOT be processed due to break-on-conflict.
      const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src');
      fs.mkdirSync(baseDir, { recursive: true });
      fs.writeFileSync(path.join(baseDir, 'config.ts'), 'line1\n');

      fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
      fs.writeFileSync(path.join(tmpDir, 'src', 'config.ts'), 'line1\n');

      // Skill 1: changes line 1 — merges cleanly since current=base after reset
      const skill1Dir = createSkillPackage(tmpDir, {
        skill: 'skill-a',
        version: '1.0.0',
        core_version: '1.0.0',
        adds: [],
        modifies: ['src/config.ts'],
        modifyFiles: { 'src/config.ts': 'line1-from-skill-a\n' },
        dirName: 'skill-pkg-a',
      });

      // Skill 2: also changes line 1 differently → conflict with skill-a's result
      const skill2Dir = createSkillPackage(tmpDir, {
        skill: 'skill-b',
        version: '1.0.0',
        core_version: '1.0.0',
        adds: [],
        modifies: ['src/config.ts'],
        modifyFiles: { 'src/config.ts': 'line1-from-skill-b\n' },
        dirName: 'skill-pkg-b',
      });

      // Skill 3: adds a new file — should be skipped
      const skill3Dir = createSkillPackage(tmpDir, {
        skill: 'skill-c',
        version: '1.0.0',
        core_version: '1.0.0',
|
||||
adds: ['src/newfile.ts'],
|
||||
modifies: [],
|
||||
addFiles: { 'src/newfile.ts': 'should not appear' },
|
||||
dirName: 'skill-pkg-c',
|
||||
});
|
||||
|
||||
const result = await replaySkills({
|
||||
skills: ['skill-a', 'skill-b', 'skill-c'],
|
||||
skillDirs: { 'skill-a': skill1Dir, 'skill-b': skill2Dir, 'skill-c': skill3Dir },
|
||||
projectRoot: tmpDir,
|
||||
});
|
||||
|
||||
expect(result.success).toBe(false);
|
||||
expect(result.mergeConflicts).toBeDefined();
|
||||
expect(result.mergeConflicts!.length).toBeGreaterThan(0);
|
||||
// Skill B caused the conflict
|
||||
expect(result.perSkill['skill-b']?.success).toBe(false);
|
||||
// Skill C should NOT have been processed
|
||||
expect(result.perSkill['skill-c']).toBeUndefined();
|
||||
});
|
||||
|
||||
it('returns error for missing skill dir', async () => {
|
||||
const result = await replaySkills({
|
||||
skills: ['missing'],
|
||||
skillDirs: {},
|
||||
projectRoot: tmpDir,
|
||||
});
|
||||
|
||||
expect(result.success).toBe(false);
|
||||
expect(result.error).toContain('missing');
|
||||
expect(result.perSkill.missing.success).toBe(false);
|
||||
});
|
||||
|
||||
it('resets files to base before replay', async () => {
|
||||
// Set up base
|
||||
const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src');
|
||||
fs.mkdirSync(baseDir, { recursive: true });
|
||||
fs.writeFileSync(path.join(baseDir, 'config.ts'), 'base content\n');
|
||||
|
||||
// Current has drift
|
||||
fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
|
||||
fs.writeFileSync(
|
||||
path.join(tmpDir, 'src', 'config.ts'),
|
||||
'drifted content\n',
|
||||
);
|
||||
|
||||
// Also a stale added file
|
||||
fs.writeFileSync(
|
||||
path.join(tmpDir, 'src', 'stale-add.ts'),
|
||||
'should be removed',
|
||||
);
|
||||
|
||||
const skillDir = createSkillPackage(tmpDir, {
|
||||
skill: 'skill1',
|
||||
version: '1.0.0',
|
||||
core_version: '1.0.0',
|
||||
adds: ['src/stale-add.ts'],
|
||||
modifies: ['src/config.ts'],
|
||||
addFiles: { 'src/stale-add.ts': 'fresh add' },
|
||||
modifyFiles: { 'src/config.ts': 'base content\nskill addition\n' },
|
||||
});
|
||||
|
||||
const result = await replaySkills({
|
||||
skills: ['skill1'],
|
||||
skillDirs: { skill1: skillDir },
|
||||
projectRoot: tmpDir,
|
||||
});
|
||||
|
||||
expect(result.success).toBe(true);
|
||||
|
||||
// The added file should have the fresh content (not stale)
|
||||
expect(
|
||||
fs.readFileSync(path.join(tmpDir, 'src', 'stale-add.ts'), 'utf-8'),
|
||||
).toBe('fresh add');
|
||||
});
|
||||
});
|
||||
});
|
||||
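The replay tests above all exercise the same loop shape: reset to base, apply each skill in order, record a per-skill result, and stop at the first conflict so later skills are never processed. A minimal sketch of that control flow, with `replayOrder` and `applySkill` as hypothetical stand-ins for the real per-skill three-way merge:

```typescript
// Sketch of the break-on-conflict replay loop the tests above encode.
// replayOrder and applySkill are illustrative names, not the engine's API.
type ApplyResult = { success: boolean };

function replayOrder(
  skills: string[],
  applySkill: (name: string) => ApplyResult,
): { success: boolean; perSkill: Record<string, ApplyResult> } {
  const perSkill: Record<string, ApplyResult> = {};
  for (const name of skills) {
    const res = applySkill(name);
    perSkill[name] = res;
    if (!res.success) {
      // Break on conflict: later skills get no perSkill entry at all.
      return { success: false, perSkill };
    }
  }
  return { success: true, perSkill };
}
```

This is why the conflict test asserts `perSkill['skill-c']` is `undefined` rather than a failure record: the loop exits before skill-c is ever considered.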
skills-engine/__tests__/resolution-cache.test.ts (new file, 283 lines)
@@ -0,0 +1,283 @@
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import crypto from 'crypto';
import fs from 'fs';
import path from 'path';
import { parse, stringify } from 'yaml';
import {
  findResolutionDir,
  loadResolutions,
  saveResolution,
} from '../resolution-cache.js';
import { createTempDir, setupNanoclawDir, initGitRepo, cleanup } from './test-helpers.js';

function sha256(content: string): string {
  return crypto.createHash('sha256').update(content).digest('hex');
}

const dummyHashes = { base: 'aaa', current: 'bbb', skill: 'ccc' };

describe('resolution-cache', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    setupNanoclawDir(tmpDir);
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  it('findResolutionDir returns null when not found', () => {
    const result = findResolutionDir(['skill-a', 'skill-b'], tmpDir);
    expect(result).toBeNull();
  });

  it('saveResolution creates directory structure with files and meta', () => {
    saveResolution(
      ['skill-b', 'skill-a'],
      [{ relPath: 'src/config.ts', preimage: 'conflict content', resolution: 'resolved content', inputHashes: dummyHashes }],
      { core_version: '1.0.0' },
      tmpDir,
    );

    // Skills are sorted, so key is "skill-a+skill-b"
    const resDir = path.join(tmpDir, '.nanoclaw', 'resolutions', 'skill-a+skill-b');
    expect(fs.existsSync(resDir)).toBe(true);

    // Check preimage and resolution files exist
    expect(fs.existsSync(path.join(resDir, 'src/config.ts.preimage'))).toBe(true);
    expect(fs.existsSync(path.join(resDir, 'src/config.ts.resolution'))).toBe(true);

    // Check meta.yaml exists and has expected fields
    const metaPath = path.join(resDir, 'meta.yaml');
    expect(fs.existsSync(metaPath)).toBe(true);
    const meta = parse(fs.readFileSync(metaPath, 'utf-8'));
    expect(meta.core_version).toBe('1.0.0');
    expect(meta.skills).toEqual(['skill-a', 'skill-b']);
  });

  it('saveResolution writes file_hashes to meta.yaml', () => {
    const hashes = {
      base: sha256('base content'),
      current: sha256('current content'),
      skill: sha256('skill content'),
    };

    saveResolution(
      ['alpha', 'beta'],
      [{ relPath: 'src/config.ts', preimage: 'pre', resolution: 'post', inputHashes: hashes }],
      {},
      tmpDir,
    );

    const resDir = path.join(tmpDir, '.nanoclaw', 'resolutions', 'alpha+beta');
    const meta = parse(fs.readFileSync(path.join(resDir, 'meta.yaml'), 'utf-8'));
    expect(meta.file_hashes).toBeDefined();
    expect(meta.file_hashes['src/config.ts']).toEqual(hashes);
  });

  it('findResolutionDir returns path after save', () => {
    saveResolution(
      ['alpha', 'beta'],
      [{ relPath: 'file.ts', preimage: 'pre', resolution: 'post', inputHashes: dummyHashes }],
      {},
      tmpDir,
    );

    const result = findResolutionDir(['alpha', 'beta'], tmpDir);
    expect(result).not.toBeNull();
    expect(result).toContain('alpha+beta');
  });

  it('findResolutionDir finds shipped resolutions in .claude/resolutions', () => {
    const shippedDir = path.join(tmpDir, '.claude', 'resolutions', 'alpha+beta');
    fs.mkdirSync(shippedDir, { recursive: true });
    fs.writeFileSync(path.join(shippedDir, 'meta.yaml'), 'skills: [alpha, beta]\n');

    const result = findResolutionDir(['alpha', 'beta'], tmpDir);
    expect(result).not.toBeNull();
    expect(result).toContain('.claude/resolutions/alpha+beta');
  });

  it('findResolutionDir prefers shipped over project-level', () => {
    // Create both shipped and project-level
    const shippedDir = path.join(tmpDir, '.claude', 'resolutions', 'a+b');
    fs.mkdirSync(shippedDir, { recursive: true });
    fs.writeFileSync(path.join(shippedDir, 'meta.yaml'), 'skills: [a, b]\n');

    saveResolution(
      ['a', 'b'],
      [{ relPath: 'f.ts', preimage: 'x', resolution: 'project', inputHashes: dummyHashes }],
      {},
      tmpDir,
    );

    const result = findResolutionDir(['a', 'b'], tmpDir);
    expect(result).toContain('.claude/resolutions/a+b');
  });

  it('skills are sorted so order does not matter', () => {
    saveResolution(
      ['zeta', 'alpha'],
      [{ relPath: 'f.ts', preimage: 'a', resolution: 'b', inputHashes: dummyHashes }],
      {},
      tmpDir,
    );

    // Find with reversed order should still work
    const result = findResolutionDir(['alpha', 'zeta'], tmpDir);
    expect(result).not.toBeNull();

    // Also works with original order
    const result2 = findResolutionDir(['zeta', 'alpha'], tmpDir);
    expect(result2).not.toBeNull();
    expect(result).toBe(result2);
  });

  describe('loadResolutions hash verification', () => {
    const baseContent = 'base file content';
    const currentContent = 'current file content';
    const skillContent = 'skill file content';
    const preimageContent = 'preimage with conflict markers';
    const resolutionContent = 'resolved content';
    const rerereHash = 'abc123def456';

    function setupResolutionDir(fileHashes: Record<string, any>) {
      // Create a shipped resolution directory
      const resDir = path.join(tmpDir, '.claude', 'resolutions', 'alpha+beta');
      fs.mkdirSync(path.join(resDir, 'src'), { recursive: true });

      // Write preimage, resolution, and hash sidecar
      fs.writeFileSync(path.join(resDir, 'src/config.ts.preimage'), preimageContent);
      fs.writeFileSync(path.join(resDir, 'src/config.ts.resolution'), resolutionContent);
      fs.writeFileSync(path.join(resDir, 'src/config.ts.preimage.hash'), rerereHash);

      // Write meta.yaml
      const meta: any = {
        skills: ['alpha', 'beta'],
        apply_order: ['alpha', 'beta'],
        core_version: '1.0.0',
        resolved_at: new Date().toISOString(),
        tested: true,
        test_passed: true,
        resolution_source: 'maintainer',
        input_hashes: {},
        output_hash: '',
        file_hashes: fileHashes,
      };
      fs.writeFileSync(path.join(resDir, 'meta.yaml'), stringify(meta));

      return resDir;
    }

    function setupInputFiles() {
      // Create base file
      fs.mkdirSync(path.join(tmpDir, '.nanoclaw', 'base', 'src'), { recursive: true });
      fs.writeFileSync(path.join(tmpDir, '.nanoclaw', 'base', 'src', 'config.ts'), baseContent);

      // Create current file
      fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
      fs.writeFileSync(path.join(tmpDir, 'src', 'config.ts'), currentContent);
    }

    function createSkillDir() {
      const skillDir = path.join(tmpDir, 'skill-pkg');
      fs.mkdirSync(path.join(skillDir, 'modify', 'src'), { recursive: true });
      fs.writeFileSync(path.join(skillDir, 'modify', 'src', 'config.ts'), skillContent);
      return skillDir;
    }

    beforeEach(() => {
      initGitRepo(tmpDir);
    });

    it('loads with matching file_hashes', () => {
      setupInputFiles();
      const skillDir = createSkillDir();

      setupResolutionDir({
        'src/config.ts': {
          base: sha256(baseContent),
          current: sha256(currentContent),
          skill: sha256(skillContent),
        },
      });

      const result = loadResolutions(['alpha', 'beta'], tmpDir, skillDir);
      expect(result).toBe(true);

      // Verify rr-cache entry was created
      const gitDir = path.join(tmpDir, '.git');
      const cacheEntry = path.join(gitDir, 'rr-cache', rerereHash);
      expect(fs.existsSync(path.join(cacheEntry, 'preimage'))).toBe(true);
      expect(fs.existsSync(path.join(cacheEntry, 'postimage'))).toBe(true);
    });

    it('skips pair with mismatched base hash', () => {
      setupInputFiles();
      const skillDir = createSkillDir();

      setupResolutionDir({
        'src/config.ts': {
          base: 'wrong_hash',
          current: sha256(currentContent),
          skill: sha256(skillContent),
        },
      });

      const result = loadResolutions(['alpha', 'beta'], tmpDir, skillDir);
      expect(result).toBe(false);

      // rr-cache entry should NOT be created
      const gitDir = path.join(tmpDir, '.git');
      expect(fs.existsSync(path.join(gitDir, 'rr-cache', rerereHash))).toBe(false);
    });

    it('skips pair with mismatched current hash', () => {
      setupInputFiles();
      const skillDir = createSkillDir();

      setupResolutionDir({
        'src/config.ts': {
          base: sha256(baseContent),
          current: 'wrong_hash',
          skill: sha256(skillContent),
        },
      });

      const result = loadResolutions(['alpha', 'beta'], tmpDir, skillDir);
      expect(result).toBe(false);
    });

    it('skips pair with mismatched skill hash', () => {
      setupInputFiles();
      const skillDir = createSkillDir();

      setupResolutionDir({
        'src/config.ts': {
          base: sha256(baseContent),
          current: sha256(currentContent),
          skill: 'wrong_hash',
        },
      });

      const result = loadResolutions(['alpha', 'beta'], tmpDir, skillDir);
      expect(result).toBe(false);
    });

    it('skips pair with no file_hashes entry for that file', () => {
      setupInputFiles();
      const skillDir = createSkillDir();

      // file_hashes exists but doesn't include src/config.ts
      setupResolutionDir({});

      const result = loadResolutions(['alpha', 'beta'], tmpDir, skillDir);
      expect(result).toBe(false);
    });
  });
});
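The hash-verification tests above pin down one invariant: a cached resolution is only trusted when the recorded base/current/skill hashes all match the actual inputs, and a missing `file_hashes` entry counts as a mismatch. A self-contained sketch of that gate, assuming SHA-256 over file contents (`hashesMatch` is an illustrative name, not the `resolution-cache.ts` API):

```typescript
import crypto from 'crypto';

const sha256 = (s: string): string =>
  crypto.createHash('sha256').update(s).digest('hex');

// All three recorded hashes must match the actual inputs before the
// pre-computed rerere resolution for this file is loaded into rr-cache.
function hashesMatch(
  recorded: { base: string; current: string; skill: string } | undefined,
  inputs: { base: string; current: string; skill: string },
): boolean {
  if (!recorded) return false; // no file_hashes entry → skip the pair
  return (
    recorded.base === sha256(inputs.base) &&
    recorded.current === sha256(inputs.current) &&
    recorded.skill === sha256(inputs.skill)
  );
}
```

Gating on all three inputs means a resolution recorded against one merge situation can never be silently replayed against a drifted base, current, or skill file.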
skills-engine/__tests__/state.test.ts (new file, 120 lines)
@@ -0,0 +1,120 @@
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import fs from 'fs';
import path from 'path';
import {
  readState,
  writeState,
  recordSkillApplication,
  computeFileHash,
  compareSemver,
  recordCustomModification,
  getCustomModifications,
} from '../state.js';
import {
  createTempDir,
  setupNanoclawDir,
  createMinimalState,
  writeState as writeStateHelper,
  cleanup,
} from './test-helpers.js';

describe('state', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    setupNanoclawDir(tmpDir);
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  it('readState/writeState roundtrip', () => {
    const state = {
      skills_system_version: '0.1.0',
      core_version: '1.0.0',
      applied_skills: [],
    };
    writeState(state);
    const result = readState();
    expect(result.skills_system_version).toBe('0.1.0');
    expect(result.core_version).toBe('1.0.0');
    expect(result.applied_skills).toEqual([]);
  });

  it('readState throws when no state file exists', () => {
    expect(() => readState()).toThrow();
  });

  it('readState throws when version is newer than current', () => {
    writeStateHelper(tmpDir, {
      skills_system_version: '99.0.0',
      core_version: '1.0.0',
      applied_skills: [],
    });
    expect(() => readState()).toThrow();
  });

  it('recordSkillApplication adds a skill', () => {
    createMinimalState(tmpDir);
    recordSkillApplication('my-skill', '1.0.0', { 'src/foo.ts': 'abc123' });
    const state = readState();
    expect(state.applied_skills).toHaveLength(1);
    expect(state.applied_skills[0].name).toBe('my-skill');
    expect(state.applied_skills[0].version).toBe('1.0.0');
    expect(state.applied_skills[0].file_hashes).toEqual({ 'src/foo.ts': 'abc123' });
  });

  it('re-applying same skill replaces it', () => {
    createMinimalState(tmpDir);
    recordSkillApplication('my-skill', '1.0.0', { 'a.ts': 'hash1' });
    recordSkillApplication('my-skill', '2.0.0', { 'a.ts': 'hash2' });
    const state = readState();
    expect(state.applied_skills).toHaveLength(1);
    expect(state.applied_skills[0].version).toBe('2.0.0');
    expect(state.applied_skills[0].file_hashes).toEqual({ 'a.ts': 'hash2' });
  });

  it('computeFileHash produces consistent sha256', () => {
    const filePath = path.join(tmpDir, 'hashtest.txt');
    fs.writeFileSync(filePath, 'hello world');
    const hash1 = computeFileHash(filePath);
    const hash2 = computeFileHash(filePath);
    expect(hash1).toBe(hash2);
    expect(hash1).toMatch(/^[a-f0-9]{64}$/);
  });

  describe('compareSemver', () => {
    it('1.0.0 < 1.1.0', () => {
      expect(compareSemver('1.0.0', '1.1.0')).toBeLessThan(0);
    });

    it('0.9.0 < 0.10.0', () => {
      expect(compareSemver('0.9.0', '0.10.0')).toBeLessThan(0);
    });

    it('1.0.0 = 1.0.0', () => {
      expect(compareSemver('1.0.0', '1.0.0')).toBe(0);
    });
  });

  it('recordCustomModification adds to array', () => {
    createMinimalState(tmpDir);
    recordCustomModification('tweak', ['src/a.ts'], 'custom/001-tweak.patch');
    const mods = getCustomModifications();
    expect(mods).toHaveLength(1);
    expect(mods[0].description).toBe('tweak');
    expect(mods[0].files_modified).toEqual(['src/a.ts']);
    expect(mods[0].patch_file).toBe('custom/001-tweak.patch');
  });

  it('getCustomModifications returns empty when none recorded', () => {
    createMinimalState(tmpDir);
    const mods = getCustomModifications();
    expect(mods).toEqual([]);
  });
});
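The `compareSemver` cases above pin down numeric rather than lexicographic comparison, which is why `0.9.0 < 0.10.0` gets its own test (as strings, "9" sorts after "10"). A minimal sketch of such a comparator (`compareSemverSketch` is a hypothetical name, not the `state.ts` export):

```typescript
// Numeric three-part semver comparison: negative if a < b, zero if equal,
// positive if a > b. Each dotted part is compared as a number, not a string.
function compareSemverSketch(a: string, b: string): number {
  const pa = a.split('.').map(Number);
  const pb = b.split('.').map(Number);
  for (let i = 0; i < 3; i++) {
    const diff = (pa[i] ?? 0) - (pb[i] ?? 0);
    if (diff !== 0) return diff;
  }
  return 0;
}
```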
skills-engine/__tests__/structured.test.ts (new file, 204 lines)
@@ -0,0 +1,204 @@
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import fs from 'fs';
import path from 'path';
import {
  areRangesCompatible,
  mergeNpmDependencies,
  mergeEnvAdditions,
  mergeDockerComposeServices,
} from '../structured.js';
import { createTempDir, cleanup } from './test-helpers.js';

describe('structured', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  describe('areRangesCompatible', () => {
    it('identical versions are compatible', () => {
      const result = areRangesCompatible('^1.0.0', '^1.0.0');
      expect(result.compatible).toBe(true);
    });

    it('compatible ^ ranges resolve to higher', () => {
      const result = areRangesCompatible('^1.0.0', '^1.1.0');
      expect(result.compatible).toBe(true);
      expect(result.resolved).toBe('^1.1.0');
    });

    it('incompatible major ^ ranges', () => {
      const result = areRangesCompatible('^1.0.0', '^2.0.0');
      expect(result.compatible).toBe(false);
    });

    it('compatible ~ ranges', () => {
      const result = areRangesCompatible('~1.0.0', '~1.0.3');
      expect(result.compatible).toBe(true);
      expect(result.resolved).toBe('~1.0.3');
    });

    it('mismatched prefixes are incompatible', () => {
      const result = areRangesCompatible('^1.0.0', '~1.0.0');
      expect(result.compatible).toBe(false);
    });

    it('handles double-digit version parts numerically', () => {
      // ^1.9.0 vs ^1.10.0 — 10 > 9 numerically, but "9" > "10" as strings
      const result = areRangesCompatible('^1.9.0', '^1.10.0');
      expect(result.compatible).toBe(true);
      expect(result.resolved).toBe('^1.10.0');
    });

    it('handles double-digit patch versions', () => {
      const result = areRangesCompatible('~1.0.9', '~1.0.10');
      expect(result.compatible).toBe(true);
      expect(result.resolved).toBe('~1.0.10');
    });
  });

  describe('mergeNpmDependencies', () => {
    it('adds new dependencies', () => {
      const pkgPath = path.join(tmpDir, 'package.json');
      fs.writeFileSync(pkgPath, JSON.stringify({
        name: 'test',
        dependencies: { existing: '^1.0.0' },
      }, null, 2));

      mergeNpmDependencies(pkgPath, { newdep: '^2.0.0' });

      const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf-8'));
      expect(pkg.dependencies.newdep).toBe('^2.0.0');
      expect(pkg.dependencies.existing).toBe('^1.0.0');
    });

    it('resolves compatible ^ ranges', () => {
      const pkgPath = path.join(tmpDir, 'package.json');
      fs.writeFileSync(pkgPath, JSON.stringify({
        name: 'test',
        dependencies: { dep: '^1.0.0' },
      }, null, 2));

      mergeNpmDependencies(pkgPath, { dep: '^1.1.0' });

      const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf-8'));
      expect(pkg.dependencies.dep).toBe('^1.1.0');
    });

    it('sorts devDependencies after merge', () => {
      const pkgPath = path.join(tmpDir, 'package.json');
      fs.writeFileSync(pkgPath, JSON.stringify({
        name: 'test',
        dependencies: {},
        devDependencies: { zlib: '^1.0.0', acorn: '^2.0.0' },
      }, null, 2));

      mergeNpmDependencies(pkgPath, { middle: '^1.0.0' });

      const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf-8'));
      const devKeys = Object.keys(pkg.devDependencies);
      expect(devKeys).toEqual(['acorn', 'zlib']);
    });

    it('throws on incompatible major versions', () => {
      const pkgPath = path.join(tmpDir, 'package.json');
      fs.writeFileSync(pkgPath, JSON.stringify({
        name: 'test',
        dependencies: { dep: '^1.0.0' },
      }, null, 2));

      expect(() => mergeNpmDependencies(pkgPath, { dep: '^2.0.0' })).toThrow();
    });
  });

  describe('mergeEnvAdditions', () => {
    it('adds new variables', () => {
      const envPath = path.join(tmpDir, '.env.example');
      fs.writeFileSync(envPath, 'EXISTING_VAR=value\n');

      mergeEnvAdditions(envPath, ['NEW_VAR']);

      const content = fs.readFileSync(envPath, 'utf-8');
      expect(content).toContain('NEW_VAR=');
      expect(content).toContain('EXISTING_VAR=value');
    });

    it('skips existing variables', () => {
      const envPath = path.join(tmpDir, '.env.example');
      fs.writeFileSync(envPath, 'MY_VAR=original\n');

      mergeEnvAdditions(envPath, ['MY_VAR']);

      const content = fs.readFileSync(envPath, 'utf-8');
      // Should not add duplicate - only 1 occurrence of MY_VAR=
      const matches = content.match(/MY_VAR=/g);
      expect(matches).toHaveLength(1);
    });

    it('recognizes lowercase and mixed-case env vars as existing', () => {
      const envPath = path.join(tmpDir, '.env.example');
      fs.writeFileSync(envPath, 'my_lower_var=value\nMixed_Case=abc\n');

      mergeEnvAdditions(envPath, ['my_lower_var', 'Mixed_Case']);

      const content = fs.readFileSync(envPath, 'utf-8');
      // Should not add duplicates
      const lowerMatches = content.match(/my_lower_var=/g);
      expect(lowerMatches).toHaveLength(1);
      const mixedMatches = content.match(/Mixed_Case=/g);
      expect(mixedMatches).toHaveLength(1);
    });

    it('creates file if it does not exist', () => {
      const envPath = path.join(tmpDir, '.env.example');
      mergeEnvAdditions(envPath, ['NEW_VAR']);

      expect(fs.existsSync(envPath)).toBe(true);
      const content = fs.readFileSync(envPath, 'utf-8');
      expect(content).toContain('NEW_VAR=');
    });
  });

  describe('mergeDockerComposeServices', () => {
    it('adds new services', () => {
      const composePath = path.join(tmpDir, 'docker-compose.yaml');
      fs.writeFileSync(composePath, 'version: "3"\nservices:\n  web:\n    image: nginx\n');

      mergeDockerComposeServices(composePath, {
        redis: { image: 'redis:7' },
      });

      const content = fs.readFileSync(composePath, 'utf-8');
      expect(content).toContain('redis');
    });

    it('skips existing services', () => {
      const composePath = path.join(tmpDir, 'docker-compose.yaml');
      fs.writeFileSync(composePath, 'version: "3"\nservices:\n  web:\n    image: nginx\n');

      mergeDockerComposeServices(composePath, {
        web: { image: 'apache' },
      });

      const content = fs.readFileSync(composePath, 'utf-8');
      expect(content).toContain('nginx');
    });

    it('throws on port collision', () => {
      const composePath = path.join(tmpDir, 'docker-compose.yaml');
      fs.writeFileSync(composePath, 'version: "3"\nservices:\n  web:\n    image: nginx\n    ports:\n      - "8080:80"\n');

      expect(() => mergeDockerComposeServices(composePath, {
        api: { image: 'node', ports: ['8080:3000'] },
      })).toThrow();
    });
  });
});
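The `areRangesCompatible` cases above imply three rules: mixed `^`/`~` prefixes are incompatible, `^` requires the same major (and `~` the same major.minor), and compatible ranges resolve to the numerically higher one. A sketch of a comparator satisfying exactly those cases (`rangesCompatible` is illustrative, not the `structured.ts` implementation, which may handle more range syntax):

```typescript
// Sketch of npm-range compatibility limited to plain ^/~/exact ranges.
type RangeResult = { compatible: boolean; resolved?: string };

function rangesCompatible(a: string, b: string): RangeResult {
  const parse = (r: string) => {
    const prefix = r[0] === '^' || r[0] === '~' ? r[0] : '';
    const nums = r.slice(prefix.length).split('.').map(Number);
    return { prefix, nums };
  };
  const pa = parse(a);
  const pb = parse(b);
  // Mixed prefixes (^ vs ~) are treated as incompatible.
  if (pa.prefix !== pb.prefix) return { compatible: false };
  // ^ allows the same major; ~ additionally requires the same minor.
  if (pa.nums[0] !== pb.nums[0]) return { compatible: false };
  if (pa.prefix === '~' && pa.nums[1] !== pb.nums[1]) return { compatible: false };
  // Resolve to the numerically higher range, so ^1.10.0 beats ^1.9.0.
  const cmp =
    pa.nums[0] - pb.nums[0] || pa.nums[1] - pb.nums[1] || pa.nums[2] - pb.nums[2];
  return { compatible: true, resolved: cmp >= 0 ? a : b };
}
```

Comparing parts with `map(Number)` is what makes the double-digit tests pass: string comparison would rank "9" above "10".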
skills-engine/__tests__/test-helpers.ts (new file, 99 lines)
@@ -0,0 +1,99 @@
import { execSync } from 'child_process';
import fs from 'fs';
import os from 'os';
import path from 'path';
import { stringify } from 'yaml';

export function createTempDir(): string {
  return fs.mkdtempSync(path.join(os.tmpdir(), 'nanoclaw-test-'));
}

export function setupNanoclawDir(tmpDir: string): void {
  fs.mkdirSync(path.join(tmpDir, '.nanoclaw', 'base', 'src'), { recursive: true });
  fs.mkdirSync(path.join(tmpDir, '.nanoclaw', 'backup'), { recursive: true });
}

export function writeState(tmpDir: string, state: any): void {
  const statePath = path.join(tmpDir, '.nanoclaw', 'state.yaml');
  fs.writeFileSync(statePath, stringify(state), 'utf-8');
}

export function createMinimalState(tmpDir: string): void {
  writeState(tmpDir, {
    skills_system_version: '0.1.0',
    core_version: '1.0.0',
    applied_skills: [],
  });
}

export function createSkillPackage(tmpDir: string, opts: {
  skill?: string;
  version?: string;
  core_version?: string;
  adds?: string[];
  modifies?: string[];
  addFiles?: Record<string, string>;
  modifyFiles?: Record<string, string>;
  conflicts?: string[];
  depends?: string[];
  test?: string;
  structured?: any;
  file_ops?: any[];
  post_apply?: string[];
  min_skills_system_version?: string;
  dirName?: string;
}): string {
  const skillDir = path.join(tmpDir, opts.dirName ?? 'skill-pkg');
  fs.mkdirSync(skillDir, { recursive: true });

  const manifest: Record<string, unknown> = {
    skill: opts.skill ?? 'test-skill',
    version: opts.version ?? '1.0.0',
    description: 'Test skill',
    core_version: opts.core_version ?? '1.0.0',
    adds: opts.adds ?? [],
    modifies: opts.modifies ?? [],
    conflicts: opts.conflicts ?? [],
    depends: opts.depends ?? [],
    test: opts.test,
    structured: opts.structured,
    file_ops: opts.file_ops,
  };
  if (opts.post_apply) manifest.post_apply = opts.post_apply;
  if (opts.min_skills_system_version) manifest.min_skills_system_version = opts.min_skills_system_version;

  fs.writeFileSync(path.join(skillDir, 'manifest.yaml'), stringify(manifest));

  if (opts.addFiles) {
    const addDir = path.join(skillDir, 'add');
    for (const [relPath, content] of Object.entries(opts.addFiles)) {
      const fullPath = path.join(addDir, relPath);
      fs.mkdirSync(path.dirname(fullPath), { recursive: true });
      fs.writeFileSync(fullPath, content);
    }
  }

  if (opts.modifyFiles) {
    const modDir = path.join(skillDir, 'modify');
    for (const [relPath, content] of Object.entries(opts.modifyFiles)) {
      const fullPath = path.join(modDir, relPath);
      fs.mkdirSync(path.dirname(fullPath), { recursive: true });
      fs.writeFileSync(fullPath, content);
    }
  }

  return skillDir;
}

export function initGitRepo(dir: string): void {
  execSync('git init', { cwd: dir, stdio: 'pipe' });
  execSync('git config user.email "test@test.com"', { cwd: dir, stdio: 'pipe' });
  execSync('git config user.name "Test"', { cwd: dir, stdio: 'pipe' });
  execSync('git config rerere.enabled true', { cwd: dir, stdio: 'pipe' });
  fs.writeFileSync(path.join(dir, '.gitignore'), 'node_modules\n');
  execSync('git add -A && git commit -m "init"', { cwd: dir, stdio: 'pipe' });
}

export function cleanup(dir: string): void {
  fs.rmSync(dir, { recursive: true, force: true });
}
skills-engine/__tests__/uninstall.test.ts (new file, 261 lines)
@@ -0,0 +1,261 @@
|
||||
import fs from 'fs';
import path from 'path';
import { afterEach, beforeEach, describe, expect, it } from 'vitest';
import { stringify } from 'yaml';

import { uninstallSkill } from '../uninstall.js';
import {
  cleanup,
  createTempDir,
  initGitRepo,
  setupNanoclawDir,
  writeState,
} from './test-helpers.js';

describe('uninstall', () => {
  let tmpDir: string;
  const originalCwd = process.cwd();

  beforeEach(() => {
    tmpDir = createTempDir();
    setupNanoclawDir(tmpDir);
    initGitRepo(tmpDir);
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(originalCwd);
    cleanup(tmpDir);
  });

  function setupSkillPackage(
    name: string,
    opts: {
      adds?: Record<string, string>;
      modifies?: Record<string, string>;
      modifiesBase?: Record<string, string>;
    } = {},
  ): void {
    const skillDir = path.join(tmpDir, '.claude', 'skills', name);
    fs.mkdirSync(skillDir, { recursive: true });

    const addsList = Object.keys(opts.adds ?? {});
    const modifiesList = Object.keys(opts.modifies ?? {});

    fs.writeFileSync(
      path.join(skillDir, 'manifest.yaml'),
      stringify({
        skill: name,
        version: '1.0.0',
        core_version: '1.0.0',
        adds: addsList,
        modifies: modifiesList,
      }),
    );

    if (opts.adds) {
      const addDir = path.join(skillDir, 'add');
      for (const [relPath, content] of Object.entries(opts.adds)) {
        const fullPath = path.join(addDir, relPath);
        fs.mkdirSync(path.dirname(fullPath), { recursive: true });
        fs.writeFileSync(fullPath, content);
      }
    }

    if (opts.modifies) {
      const modDir = path.join(skillDir, 'modify');
      for (const [relPath, content] of Object.entries(opts.modifies)) {
        const fullPath = path.join(modDir, relPath);
        fs.mkdirSync(path.dirname(fullPath), { recursive: true });
        fs.writeFileSync(fullPath, content);
      }
    }
  }

  it('returns error for non-applied skill', async () => {
    writeState(tmpDir, {
      skills_system_version: '0.1.0',
      core_version: '1.0.0',
      applied_skills: [],
    });

    const result = await uninstallSkill('nonexistent');
    expect(result.success).toBe(false);
    expect(result.error).toContain('not applied');
  });

  it('blocks uninstall after rebase', async () => {
    writeState(tmpDir, {
      skills_system_version: '0.1.0',
      core_version: '1.0.0',
      rebased_at: new Date().toISOString(),
      applied_skills: [
        {
          name: 'telegram',
          version: '1.0.0',
          applied_at: new Date().toISOString(),
          file_hashes: { 'src/config.ts': 'abc' },
        },
      ],
    });

    const result = await uninstallSkill('telegram');
    expect(result.success).toBe(false);
    expect(result.error).toContain('Cannot uninstall');
    expect(result.error).toContain('after rebase');
  });

  it('returns custom patch warning', async () => {
    writeState(tmpDir, {
      skills_system_version: '0.1.0',
      core_version: '1.0.0',
      applied_skills: [
        {
          name: 'telegram',
          version: '1.0.0',
          applied_at: new Date().toISOString(),
          file_hashes: {},
          custom_patch: '.nanoclaw/custom/001.patch',
          custom_patch_description: 'My tweak',
        },
      ],
    });

    const result = await uninstallSkill('telegram');
    expect(result.success).toBe(false);
    expect(result.customPatchWarning).toContain('custom patch');
    expect(result.customPatchWarning).toContain('My tweak');
  });

  it('uninstalls only skill → files reset to base', async () => {
    // Set up base
    const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src');
    fs.mkdirSync(baseDir, { recursive: true });
    fs.writeFileSync(path.join(baseDir, 'config.ts'), 'base config\n');

    // Set up current files (as if skill was applied)
    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(tmpDir, 'src', 'config.ts'),
      'base config\ntelegram config\n',
    );
    fs.writeFileSync(
      path.join(tmpDir, 'src', 'telegram.ts'),
      'telegram code\n',
    );

    // Set up skill package in .claude/skills/
    setupSkillPackage('telegram', {
      adds: { 'src/telegram.ts': 'telegram code\n' },
      modifies: {
        'src/config.ts': 'base config\ntelegram config\n',
      },
    });

    writeState(tmpDir, {
      skills_system_version: '0.1.0',
      core_version: '1.0.0',
      applied_skills: [
        {
          name: 'telegram',
          version: '1.0.0',
          applied_at: new Date().toISOString(),
          file_hashes: {
            'src/config.ts': 'abc',
            'src/telegram.ts': 'def',
          },
        },
      ],
    });

    const result = await uninstallSkill('telegram');
    expect(result.success).toBe(true);
    expect(result.skill).toBe('telegram');

    // config.ts should be reset to base
    expect(
      fs.readFileSync(path.join(tmpDir, 'src', 'config.ts'), 'utf-8'),
    ).toBe('base config\n');

    // telegram.ts (add-only) should be removed
    expect(fs.existsSync(path.join(tmpDir, 'src', 'telegram.ts'))).toBe(false);
  });

  it('uninstalls one of two → other preserved', async () => {
    // Set up base
    const baseDir = path.join(tmpDir, '.nanoclaw', 'base', 'src');
    fs.mkdirSync(baseDir, { recursive: true });
    fs.writeFileSync(
      path.join(baseDir, 'config.ts'),
      'line1\nline2\nline3\nline4\nline5\n',
    );

    // Current has both skills applied
    fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
    fs.writeFileSync(
      path.join(tmpDir, 'src', 'config.ts'),
      'telegram import\nline1\nline2\nline3\nline4\nline5\ndiscord import\n',
    );
    fs.writeFileSync(path.join(tmpDir, 'src', 'telegram.ts'), 'tg code\n');
    fs.writeFileSync(path.join(tmpDir, 'src', 'discord.ts'), 'dc code\n');

    // Set up both skill packages
    setupSkillPackage('telegram', {
      adds: { 'src/telegram.ts': 'tg code\n' },
      modifies: {
        'src/config.ts':
          'telegram import\nline1\nline2\nline3\nline4\nline5\n',
      },
    });

    setupSkillPackage('discord', {
      adds: { 'src/discord.ts': 'dc code\n' },
      modifies: {
        'src/config.ts':
          'line1\nline2\nline3\nline4\nline5\ndiscord import\n',
      },
    });

    writeState(tmpDir, {
      skills_system_version: '0.1.0',
      core_version: '1.0.0',
      applied_skills: [
        {
          name: 'telegram',
          version: '1.0.0',
          applied_at: new Date().toISOString(),
          file_hashes: {
            'src/config.ts': 'abc',
            'src/telegram.ts': 'def',
          },
        },
        {
          name: 'discord',
          version: '1.0.0',
          applied_at: new Date().toISOString(),
          file_hashes: {
            'src/config.ts': 'ghi',
            'src/discord.ts': 'jkl',
          },
        },
      ],
    });

    const result = await uninstallSkill('telegram');
    expect(result.success).toBe(true);

    // discord.ts should still exist
    expect(fs.existsSync(path.join(tmpDir, 'src', 'discord.ts'))).toBe(true);

    // telegram.ts should be gone
    expect(fs.existsSync(path.join(tmpDir, 'src', 'telegram.ts'))).toBe(false);

    // config should have discord import but not telegram
    const config = fs.readFileSync(
      path.join(tmpDir, 'src', 'config.ts'),
      'utf-8',
    );
    expect(config).toContain('discord import');
    expect(config).not.toContain('telegram import');
  });
});
413  skills-engine/__tests__/update.test.ts  Normal file
@@ -0,0 +1,413 @@
import fs from 'fs';
import path from 'path';
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';
import { stringify } from 'yaml';

import { cleanup, createTempDir, initGitRepo, setupNanoclawDir } from './test-helpers.js';

// We need to mock process.cwd() since update.ts uses it
let tmpDir: string;

describe('update', () => {
  beforeEach(() => {
    tmpDir = createTempDir();
    setupNanoclawDir(tmpDir);
    initGitRepo(tmpDir);
    vi.spyOn(process, 'cwd').mockReturnValue(tmpDir);
  });

  afterEach(() => {
    vi.restoreAllMocks();
    cleanup(tmpDir);
  });

  function writeStateFile(state: Record<string, unknown>): void {
    const statePath = path.join(tmpDir, '.nanoclaw', 'state.yaml');
    fs.writeFileSync(statePath, stringify(state), 'utf-8');
  }

  function createNewCoreDir(files: Record<string, string>): string {
    const newCoreDir = path.join(tmpDir, 'new-core');
    fs.mkdirSync(newCoreDir, { recursive: true });

    for (const [relPath, content] of Object.entries(files)) {
      const fullPath = path.join(newCoreDir, relPath);
      fs.mkdirSync(path.dirname(fullPath), { recursive: true });
      fs.writeFileSync(fullPath, content);
    }

    return newCoreDir;
  }

  describe('previewUpdate', () => {
    it('detects new files in update', async () => {
      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      });

      const newCoreDir = createNewCoreDir({
        'src/new-file.ts': 'export const x = 1;',
      });

      const { previewUpdate } = await import('../update.js');
      const preview = previewUpdate(newCoreDir);

      expect(preview.filesChanged).toContain('src/new-file.ts');
      expect(preview.currentVersion).toBe('1.0.0');
    });

    it('detects changed files vs base', async () => {
      const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
      fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
      fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'original');

      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      });

      const newCoreDir = createNewCoreDir({
        'src/index.ts': 'modified',
      });

      const { previewUpdate } = await import('../update.js');
      const preview = previewUpdate(newCoreDir);

      expect(preview.filesChanged).toContain('src/index.ts');
    });

    it('does not list unchanged files', async () => {
      const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
      fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
      fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'same content');

      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      });

      const newCoreDir = createNewCoreDir({
        'src/index.ts': 'same content',
      });

      const { previewUpdate } = await import('../update.js');
      const preview = previewUpdate(newCoreDir);

      expect(preview.filesChanged).not.toContain('src/index.ts');
    });

    it('identifies conflict risk with applied skills', async () => {
      const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
      fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
      fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'original');

      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [
          {
            name: 'telegram',
            version: '1.0.0',
            applied_at: new Date().toISOString(),
            file_hashes: { 'src/index.ts': 'abc123' },
          },
        ],
      });

      const newCoreDir = createNewCoreDir({
        'src/index.ts': 'updated core',
      });

      const { previewUpdate } = await import('../update.js');
      const preview = previewUpdate(newCoreDir);

      expect(preview.conflictRisk).toContain('src/index.ts');
    });

    it('identifies custom patches at risk', async () => {
      const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
      fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
      fs.writeFileSync(path.join(baseDir, 'src/config.ts'), 'original');

      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
        custom_modifications: [
          {
            description: 'custom tweak',
            applied_at: new Date().toISOString(),
            files_modified: ['src/config.ts'],
            patch_file: '.nanoclaw/custom/001-tweak.patch',
          },
        ],
      });

      const newCoreDir = createNewCoreDir({
        'src/config.ts': 'updated core config',
      });

      const { previewUpdate } = await import('../update.js');
      const preview = previewUpdate(newCoreDir);

      expect(preview.customPatchesAtRisk).toContain('src/config.ts');
    });

    it('reads version from package.json in new core', async () => {
      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      });

      const newCoreDir = createNewCoreDir({
        'package.json': JSON.stringify({ version: '2.0.0' }),
      });

      const { previewUpdate } = await import('../update.js');
      const preview = previewUpdate(newCoreDir);

      expect(preview.newVersion).toBe('2.0.0');
    });

    it('detects files deleted in new core', async () => {
      const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
      fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
      fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'keep this');
      fs.writeFileSync(path.join(baseDir, 'src/removed.ts'), 'delete this');

      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      });

      // New core only has index.ts — removed.ts is gone
      const newCoreDir = createNewCoreDir({
        'src/index.ts': 'keep this',
      });

      const { previewUpdate } = await import('../update.js');
      const preview = previewUpdate(newCoreDir);

      expect(preview.filesDeleted).toContain('src/removed.ts');
      expect(preview.filesChanged).not.toContain('src/removed.ts');
    });
  });

  describe('applyUpdate', () => {
    it('rejects when customize session is active', async () => {
      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      });

      // Create the pending.yaml that indicates active customize
      const customDir = path.join(tmpDir, '.nanoclaw', 'custom');
      fs.mkdirSync(customDir, { recursive: true });
      fs.writeFileSync(path.join(customDir, 'pending.yaml'), 'active: true');

      const newCoreDir = createNewCoreDir({
        'src/index.ts': 'new content',
      });

      const { applyUpdate } = await import('../update.js');
      const result = await applyUpdate(newCoreDir);

      expect(result.success).toBe(false);
      expect(result.error).toContain('customize session');
    });

    it('copies new files that do not exist yet', async () => {
      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      });

      const newCoreDir = createNewCoreDir({
        'src/brand-new.ts': 'export const fresh = true;',
      });

      const { applyUpdate } = await import('../update.js');
      const result = await applyUpdate(newCoreDir);

      expect(result.error).toBeUndefined();
      expect(result.success).toBe(true);
      expect(
        fs.readFileSync(path.join(tmpDir, 'src/brand-new.ts'), 'utf-8'),
      ).toBe('export const fresh = true;');
    });

    it('performs clean three-way merge', async () => {
      // Set up base
      const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
      fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
      fs.writeFileSync(
        path.join(baseDir, 'src/index.ts'),
        'line 1\nline 2\nline 3\n',
      );

      // Current has user changes at the bottom
      fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
      fs.writeFileSync(
        path.join(tmpDir, 'src/index.ts'),
        'line 1\nline 2\nline 3\nuser addition\n',
      );

      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      });

      // New core changes at the top
      const newCoreDir = createNewCoreDir({
        'src/index.ts': 'core update\nline 1\nline 2\nline 3\n',
        'package.json': JSON.stringify({ version: '2.0.0' }),
      });

      const { applyUpdate } = await import('../update.js');
      const result = await applyUpdate(newCoreDir);

      expect(result.success).toBe(true);
      expect(result.newVersion).toBe('2.0.0');

      const merged = fs.readFileSync(
        path.join(tmpDir, 'src/index.ts'),
        'utf-8',
      );
      expect(merged).toContain('core update');
      expect(merged).toContain('user addition');
    });

    it('updates base directory after successful merge', async () => {
      const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
      fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
      fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'old base');

      fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
      fs.writeFileSync(path.join(tmpDir, 'src/index.ts'), 'old base');

      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      });

      const newCoreDir = createNewCoreDir({
        'src/index.ts': 'new base content',
      });

      const { applyUpdate } = await import('../update.js');
      await applyUpdate(newCoreDir);

      const newBase = fs.readFileSync(
        path.join(tmpDir, '.nanoclaw', 'base', 'src/index.ts'),
        'utf-8',
      );
      expect(newBase).toBe('new base content');
    });

    it('updates core_version in state after success', async () => {
      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      });

      const newCoreDir = createNewCoreDir({
        'package.json': JSON.stringify({ version: '2.0.0' }),
      });

      const { applyUpdate } = await import('../update.js');
      const result = await applyUpdate(newCoreDir);

      expect(result.success).toBe(true);
      expect(result.previousVersion).toBe('1.0.0');
      expect(result.newVersion).toBe('2.0.0');

      // Verify state file was updated
      const { readState } = await import('../state.js');
      const state = readState();
      expect(state.core_version).toBe('2.0.0');
    });

    it('restores backup on merge conflict', async () => {
      const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
      fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
      fs.writeFileSync(
        path.join(baseDir, 'src/index.ts'),
        'line 1\nline 2\nline 3\n',
      );

      // Current has conflicting change on same line
      fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
      fs.writeFileSync(
        path.join(tmpDir, 'src/index.ts'),
        'line 1\nuser changed line 2\nline 3\n',
      );

      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      });

      // New core also changes line 2 — guaranteed conflict
      const newCoreDir = createNewCoreDir({
        'src/index.ts': 'line 1\ncore changed line 2\nline 3\n',
      });

      const { applyUpdate } = await import('../update.js');
      const result = await applyUpdate(newCoreDir);

      expect(result.success).toBe(false);
      expect(result.mergeConflicts).toContain('src/index.ts');
      expect(result.backupPending).toBe(true);

      // File should have conflict markers (backup preserved, not restored)
      const content = fs.readFileSync(
        path.join(tmpDir, 'src/index.ts'),
        'utf-8',
      );
      expect(content).toContain('<<<<<<<');
      expect(content).toContain('>>>>>>>');
    });

    it('removes files deleted in new core', async () => {
      const baseDir = path.join(tmpDir, '.nanoclaw', 'base');
      fs.mkdirSync(path.join(baseDir, 'src'), { recursive: true });
      fs.writeFileSync(path.join(baseDir, 'src/index.ts'), 'keep');
      fs.writeFileSync(path.join(baseDir, 'src/removed.ts'), 'old content');

      // Working tree has both files
      fs.mkdirSync(path.join(tmpDir, 'src'), { recursive: true });
      fs.writeFileSync(path.join(tmpDir, 'src/index.ts'), 'keep');
      fs.writeFileSync(path.join(tmpDir, 'src/removed.ts'), 'old content');

      writeStateFile({
        skills_system_version: '0.1.0',
        core_version: '1.0.0',
        applied_skills: [],
      });

      // New core only has index.ts
      const newCoreDir = createNewCoreDir({
        'src/index.ts': 'keep',
      });

      const { applyUpdate } = await import('../update.js');
      const result = await applyUpdate(newCoreDir);

      expect(result.success).toBe(true);
      expect(fs.existsSync(path.join(tmpDir, 'src/index.ts'))).toBe(true);
      expect(fs.existsSync(path.join(tmpDir, 'src/removed.ts'))).toBe(false);
    });
  });
});
397  skills-engine/apply.ts  Normal file
@@ -0,0 +1,397 @@
import { execFileSync, execSync } from 'child_process';
import crypto from 'crypto';
import fs from 'fs';
import os from 'os';
import path from 'path';

import { clearBackup, createBackup, restoreBackup } from './backup.js';
import { NANOCLAW_DIR } from './constants.js';
import { copyDir } from './fs-utils.js';
import { isCustomizeActive } from './customize.js';
import { executeFileOps } from './file-ops.js';
import { acquireLock } from './lock.js';
import {
  checkConflicts,
  checkCoreVersion,
  checkDependencies,
  checkSystemVersion,
  readManifest,
} from './manifest.js';
import { loadPathRemap, resolvePathRemap } from './path-remap.js';
import {
  cleanupMergeState,
  isGitRepo,
  mergeFile,
  runRerere,
  setupRerereAdapter,
} from './merge.js';
import { loadResolutions } from './resolution-cache.js';
import { computeFileHash, readState, recordSkillApplication, writeState } from './state.js';
import {
  mergeDockerComposeServices,
  mergeEnvAdditions,
  mergeNpmDependencies,
  runNpmInstall,
} from './structured.js';
import { ApplyResult } from './types.js';

export async function applySkill(skillDir: string): Promise<ApplyResult> {
  const projectRoot = process.cwd();
  const manifest = readManifest(skillDir);

  // --- Pre-flight checks ---
  const currentState = readState(); // Validates state exists and version is compatible

  // Check skills system version compatibility
  const sysCheck = checkSystemVersion(manifest);
  if (!sysCheck.ok) {
    return {
      success: false,
      skill: manifest.skill,
      version: manifest.version,
      error: sysCheck.error,
    };
  }

  // Check core version compatibility
  const coreCheck = checkCoreVersion(manifest);
  if (coreCheck.warning) {
    console.log(`Warning: ${coreCheck.warning}`);
  }

  // Block if customize session is active
  if (isCustomizeActive()) {
    return {
      success: false,
      skill: manifest.skill,
      version: manifest.version,
      error:
        'A customize session is active. Run commitCustomize() or abortCustomize() first.',
    };
  }

  const deps = checkDependencies(manifest);
  if (!deps.ok) {
    return {
      success: false,
      skill: manifest.skill,
      version: manifest.version,
      error: `Missing dependencies: ${deps.missing.join(', ')}`,
    };
  }

  const conflicts = checkConflicts(manifest);
  if (!conflicts.ok) {
    return {
      success: false,
      skill: manifest.skill,
      version: manifest.version,
      error: `Conflicting skills: ${conflicts.conflicting.join(', ')}`,
    };
  }

  // Load path remap for renamed core files
  const pathRemap = loadPathRemap();

  // Detect drift for modified files
  const driftFiles: string[] = [];
  for (const relPath of manifest.modifies) {
    const resolvedPath = resolvePathRemap(relPath, pathRemap);
    const currentPath = path.join(projectRoot, resolvedPath);
    const basePath = path.join(projectRoot, NANOCLAW_DIR, 'base', resolvedPath);

    if (fs.existsSync(currentPath) && fs.existsSync(basePath)) {
      const currentHash = computeFileHash(currentPath);
      const baseHash = computeFileHash(basePath);
      if (currentHash !== baseHash) {
        driftFiles.push(relPath);
      }
    }
  }

  if (driftFiles.length > 0) {
    console.log(`Drift detected in: ${driftFiles.join(', ')}`);
    console.log('Three-way merge will be used to reconcile changes.');
  }

  // --- Acquire lock ---
  const releaseLock = acquireLock();

  // Track added files so we can remove them on rollback
  const addedFiles: string[] = [];

  try {
    // --- Backup ---
    const filesToBackup = [
      ...manifest.modifies.map((f) => path.join(projectRoot, resolvePathRemap(f, pathRemap))),
      ...manifest.adds.map((f) => path.join(projectRoot, resolvePathRemap(f, pathRemap))),
      ...(manifest.file_ops || [])
        .filter((op) => op.from)
        .map((op) => path.join(projectRoot, resolvePathRemap(op.from!, pathRemap))),
      path.join(projectRoot, 'package.json'),
      path.join(projectRoot, 'package-lock.json'),
      path.join(projectRoot, '.env.example'),
      path.join(projectRoot, 'docker-compose.yml'),
    ];
    createBackup(filesToBackup);

    // --- File operations (before copy adds, per architecture doc) ---
    if (manifest.file_ops && manifest.file_ops.length > 0) {
      const fileOpsResult = executeFileOps(manifest.file_ops, projectRoot);
      if (!fileOpsResult.success) {
        restoreBackup();
        clearBackup();
        return {
          success: false,
          skill: manifest.skill,
          version: manifest.version,
          error: `File operations failed: ${fileOpsResult.errors.join('; ')}`,
        };
      }
    }

    // --- Copy new files from add/ ---
    const addDir = path.join(skillDir, 'add');
    if (fs.existsSync(addDir)) {
      for (const relPath of manifest.adds) {
        const resolvedDest = resolvePathRemap(relPath, pathRemap);
        const destPath = path.join(projectRoot, resolvedDest);
        if (!fs.existsSync(destPath)) {
          addedFiles.push(destPath);
        }
        // Copy individual file with remap (can't use copyDir when paths differ)
        const srcPath = path.join(addDir, relPath);
        if (fs.existsSync(srcPath)) {
          fs.mkdirSync(path.dirname(destPath), { recursive: true });
          fs.copyFileSync(srcPath, destPath);
        }
      }
    }

    // --- Merge modified files ---
    const mergeConflicts: string[] = [];

    // Load pre-computed resolutions into git's rr-cache before merging
    const appliedSkillNames = currentState.applied_skills.map((s) => s.name);
    loadResolutions([...appliedSkillNames, manifest.skill], projectRoot, skillDir);

    for (const relPath of manifest.modifies) {
      const resolvedPath = resolvePathRemap(relPath, pathRemap);
      const currentPath = path.join(projectRoot, resolvedPath);
      const basePath = path.join(projectRoot, NANOCLAW_DIR, 'base', resolvedPath);
      // skillPath uses original relPath — skill packages are never mutated
      const skillPath = path.join(skillDir, 'modify', relPath);

      if (!fs.existsSync(skillPath)) {
        throw new Error(`Skill modified file not found: ${skillPath}`);
      }

      if (!fs.existsSync(currentPath)) {
        // File doesn't exist yet — just copy from skill
        fs.mkdirSync(path.dirname(currentPath), { recursive: true });
        fs.copyFileSync(skillPath, currentPath);
        continue;
      }

      if (!fs.existsSync(basePath)) {
        // No base — use current as base (first-time apply)
        fs.mkdirSync(path.dirname(basePath), { recursive: true });
        fs.copyFileSync(currentPath, basePath);
      }

      // Three-way merge: current ← base → skill
      // Save current content before merge overwrites it (needed for rerere stage 2 = "ours")
      const oursContent = fs.readFileSync(currentPath, 'utf-8');
      // git merge-file modifies the first argument in-place, so use a temp copy
      const tmpCurrent = path.join(
        os.tmpdir(),
        `nanoclaw-merge-${crypto.randomUUID()}-${path.basename(relPath)}`,
      );
      fs.copyFileSync(currentPath, tmpCurrent);

      const result = mergeFile(tmpCurrent, basePath, skillPath);

      if (result.clean) {
        fs.copyFileSync(tmpCurrent, currentPath);
        fs.unlinkSync(tmpCurrent);
      } else {
        // Copy conflict markers to working tree path BEFORE rerere
        // rerere looks at the working tree file at relPath, not at tmpCurrent
        fs.copyFileSync(tmpCurrent, currentPath);
        fs.unlinkSync(tmpCurrent);

        if (isGitRepo()) {
          const baseContent = fs.readFileSync(basePath, 'utf-8');
          const theirsContent = fs.readFileSync(skillPath, 'utf-8');

          setupRerereAdapter(resolvedPath, baseContent, oursContent, theirsContent);
          const autoResolved = runRerere(currentPath);

          if (autoResolved) {
            // rerere resolved the conflict — currentPath now has resolved content
            // Record the resolution: git add + git rerere
            execFileSync('git', ['add', resolvedPath], { stdio: 'pipe' });
            execSync('git rerere', { stdio: 'pipe' });
            cleanupMergeState(resolvedPath);
            // Unstage the file — cleanupMergeState clears unmerged entries
            // but the git add above leaves the file staged at stage 0
            try {
              execFileSync('git', ['restore', '--staged', resolvedPath], { stdio: 'pipe' });
            } catch { /* may fail if file is new or not tracked */ }
            continue;
          }

          cleanupMergeState(resolvedPath);
        }

        // Unresolved conflict — currentPath already has conflict markers
        mergeConflicts.push(relPath);
      }
    }

    if (mergeConflicts.length > 0) {
      // Bug 4 fix: Preserve backup when returning with conflicts
      return {
        success: false,
        skill: manifest.skill,
        version: manifest.version,
        mergeConflicts,
        backupPending: true,
        untrackedChanges: driftFiles.length > 0 ? driftFiles : undefined,
        error: `Merge conflicts in: ${mergeConflicts.join(', ')}. Resolve manually then run recordSkillApplication(). Call clearBackup() after resolution or restoreBackup() + clearBackup() to abort.`,
      };
    }

    // --- Structured operations ---
    if (manifest.structured?.npm_dependencies) {
      const pkgPath = path.join(projectRoot, 'package.json');
      mergeNpmDependencies(pkgPath, manifest.structured.npm_dependencies);
    }

    if (manifest.structured?.env_additions) {
      const envPath = path.join(projectRoot, '.env.example');
      mergeEnvAdditions(envPath, manifest.structured.env_additions);
    }

    if (manifest.structured?.docker_compose_services) {
      const composePath = path.join(projectRoot, 'docker-compose.yml');
      mergeDockerComposeServices(
        composePath,
        manifest.structured.docker_compose_services,
      );
    }

    // Run npm install if dependencies were added
    if (
      manifest.structured?.npm_dependencies &&
      Object.keys(manifest.structured.npm_dependencies).length > 0
    ) {
      runNpmInstall();
    }

    // --- Post-apply commands ---
    if (manifest.post_apply && manifest.post_apply.length > 0) {
      for (const cmd of manifest.post_apply) {
        try {
          execSync(cmd, { stdio: 'pipe', cwd: projectRoot, timeout: 120_000 });
        } catch (postErr: any) {
          // Rollback on post_apply failure
          for (const f of addedFiles) {
            try {
              if (fs.existsSync(f)) fs.unlinkSync(f);
            } catch { /* best effort */ }
          }
          restoreBackup();
          clearBackup();
          return {
            success: false,
            skill: manifest.skill,
            version: manifest.version,
            error: `post_apply command failed: ${cmd} — ${postErr.message}`,
          };
        }
      }
    }

    // --- Update state ---
    const fileHashes: Record<string, string> = {};
    for (const relPath of [...manifest.adds, ...manifest.modifies]) {
      const resolvedPath = resolvePathRemap(relPath, pathRemap);
      const absPath = path.join(projectRoot, resolvedPath);
      if (fs.existsSync(absPath)) {
        fileHashes[resolvedPath] = computeFileHash(absPath);
      }
    }

    // Store structured outcomes including the test command so applyUpdate() can run them
    const outcomes: Record<string, unknown> = manifest.structured
      ? { ...manifest.structured }
      : {};
    if (manifest.test) {
      outcomes.test = manifest.test;
    }

    recordSkillApplication(
      manifest.skill,
      manifest.version,
      fileHashes,
      Object.keys(outcomes).length > 0 ? outcomes : undefined,
    );

    // --- Bug 3 fix: Execute test command if defined ---
    if (manifest.test) {
      try {
        execSync(manifest.test, {
stdio: 'pipe',
|
||||
cwd: projectRoot,
|
||||
timeout: 120_000,
|
||||
});
|
||||
} catch (testErr: any) {
|
||||
// Tests failed — remove added files, restore backup and undo state
|
||||
for (const f of addedFiles) {
|
||||
try {
|
||||
if (fs.existsSync(f)) fs.unlinkSync(f);
|
||||
} catch { /* best effort */ }
|
||||
}
|
||||
restoreBackup();
|
||||
// Re-read state and remove the skill we just recorded
|
||||
const state = readState();
|
||||
state.applied_skills = state.applied_skills.filter(
|
||||
(s) => s.name !== manifest.skill,
|
||||
);
|
||||
writeState(state);
|
||||
|
||||
clearBackup();
|
||||
return {
|
||||
success: false,
|
||||
skill: manifest.skill,
|
||||
version: manifest.version,
|
||||
error: `Tests failed: ${testErr.message}`,
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
// --- Cleanup ---
|
||||
clearBackup();
|
||||
|
||||
return {
|
||||
success: true,
|
||||
skill: manifest.skill,
|
||||
version: manifest.version,
|
||||
untrackedChanges: driftFiles.length > 0 ? driftFiles : undefined,
|
||||
};
|
||||
} catch (err) {
|
||||
// Remove newly added files before restoring backup
|
||||
for (const f of addedFiles) {
|
||||
try {
|
||||
if (fs.existsSync(f)) fs.unlinkSync(f);
|
||||
} catch { /* best effort */ }
|
||||
}
|
||||
restoreBackup();
|
||||
clearBackup();
|
||||
throw err;
|
||||
} finally {
|
||||
releaseLock();
|
||||
}
|
||||
}
|
||||
|
||||
skills-engine/backup.ts (new file, 65 lines)
@@ -0,0 +1,65 @@

import fs from 'fs';
import path from 'path';

import { BACKUP_DIR } from './constants.js';

const TOMBSTONE_SUFFIX = '.tombstone';

function getBackupDir(): string {
  return path.join(process.cwd(), BACKUP_DIR);
}

export function createBackup(filePaths: string[]): void {
  const backupDir = getBackupDir();
  fs.mkdirSync(backupDir, { recursive: true });

  for (const filePath of filePaths) {
    const absPath = path.resolve(filePath);
    const relativePath = path.relative(process.cwd(), absPath);
    const backupPath = path.join(backupDir, relativePath);
    fs.mkdirSync(path.dirname(backupPath), { recursive: true });

    if (fs.existsSync(absPath)) {
      fs.copyFileSync(absPath, backupPath);
    } else {
      // File doesn't exist yet — write a tombstone so restore can delete it
      fs.writeFileSync(backupPath + TOMBSTONE_SUFFIX, '', 'utf-8');
    }
  }
}

export function restoreBackup(): void {
  const backupDir = getBackupDir();
  if (!fs.existsSync(backupDir)) return;

  const walk = (dir: string) => {
    for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
      const fullPath = path.join(dir, entry.name);
      if (entry.isDirectory()) {
        walk(fullPath);
      } else if (entry.name.endsWith(TOMBSTONE_SUFFIX)) {
        // Tombstone: delete the corresponding project file
        const tombRelPath = path.relative(backupDir, fullPath);
        const originalRelPath = tombRelPath.slice(0, -TOMBSTONE_SUFFIX.length);
        const originalPath = path.join(process.cwd(), originalRelPath);
        if (fs.existsSync(originalPath)) {
          fs.unlinkSync(originalPath);
        }
      } else {
        const relativePath = path.relative(backupDir, fullPath);
        const originalPath = path.join(process.cwd(), relativePath);
        fs.mkdirSync(path.dirname(originalPath), { recursive: true });
        fs.copyFileSync(fullPath, originalPath);
      }
    }
  };

  walk(backupDir);
}

export function clearBackup(): void {
  const backupDir = getBackupDir();
  if (fs.existsSync(backupDir)) {
    fs.rmSync(backupDir, { recursive: true, force: true });
  }
}
skills-engine/constants.ts (new file, 9 lines)
@@ -0,0 +1,9 @@

export const NANOCLAW_DIR = '.nanoclaw';
export const STATE_FILE = 'state.yaml';
export const BASE_DIR = '.nanoclaw/base';
export const BACKUP_DIR = '.nanoclaw/backup';
export const LOCK_FILE = '.nanoclaw/lock';
export const CUSTOM_DIR = '.nanoclaw/custom';
export const RESOLUTIONS_DIR = '.nanoclaw/resolutions';
export const SHIPPED_RESOLUTIONS_DIR = '.claude/resolutions';
export const SKILLS_SCHEMA_VERSION = '0.1.0';
skills-engine/customize.ts (new file, 144 lines)
@@ -0,0 +1,144 @@

import { execFileSync, execSync } from 'child_process';
import fs from 'fs';
import path from 'path';

import { parse, stringify } from 'yaml';

import { BASE_DIR, CUSTOM_DIR } from './constants.js';
import { computeFileHash, readState, recordCustomModification } from './state.js';

interface PendingCustomize {
  description: string;
  started_at: string;
  file_hashes: Record<string, string>;
}

function getPendingPath(): string {
  return path.join(process.cwd(), CUSTOM_DIR, 'pending.yaml');
}

export function isCustomizeActive(): boolean {
  return fs.existsSync(getPendingPath());
}

export function startCustomize(description: string): void {
  if (isCustomizeActive()) {
    throw new Error(
      'A customize session is already active. Commit or abort it first.',
    );
  }

  const state = readState();

  // Collect all file hashes from applied skills
  const fileHashes: Record<string, string> = {};
  for (const skill of state.applied_skills) {
    for (const [relativePath, hash] of Object.entries(skill.file_hashes)) {
      fileHashes[relativePath] = hash;
    }
  }

  const pending: PendingCustomize = {
    description,
    started_at: new Date().toISOString(),
    file_hashes: fileHashes,
  };

  const customDir = path.join(process.cwd(), CUSTOM_DIR);
  fs.mkdirSync(customDir, { recursive: true });
  fs.writeFileSync(getPendingPath(), stringify(pending), 'utf-8');
}

export function commitCustomize(): void {
  const pendingPath = getPendingPath();
  if (!fs.existsSync(pendingPath)) {
    throw new Error('No active customize session. Run startCustomize() first.');
  }

  const pending = parse(
    fs.readFileSync(pendingPath, 'utf-8'),
  ) as PendingCustomize;
  const cwd = process.cwd();

  // Find files that changed
  const changedFiles: string[] = [];
  for (const relativePath of Object.keys(pending.file_hashes)) {
    const fullPath = path.join(cwd, relativePath);
    if (!fs.existsSync(fullPath)) {
      // File was deleted — counts as changed
      changedFiles.push(relativePath);
      continue;
    }
    const currentHash = computeFileHash(fullPath);
    if (currentHash !== pending.file_hashes[relativePath]) {
      changedFiles.push(relativePath);
    }
  }

  if (changedFiles.length === 0) {
    console.log('No files changed during customize session. Nothing to commit.');
    fs.unlinkSync(pendingPath);
    return;
  }

  // Generate a unified diff for each changed file
  const baseDir = path.join(cwd, BASE_DIR);
  let combinedPatch = '';

  for (const relativePath of changedFiles) {
    const basePath = path.join(baseDir, relativePath);
    const currentPath = path.join(cwd, relativePath);

    // Use /dev/null if either side doesn't exist
    const oldPath = fs.existsSync(basePath) ? basePath : '/dev/null';
    const newPath = fs.existsSync(currentPath) ? currentPath : '/dev/null';

    try {
      const diff = execFileSync('diff', ['-ruN', oldPath, newPath], {
        encoding: 'utf-8',
      });
      combinedPatch += diff;
    } catch (err: unknown) {
      const execErr = err as { status?: number; stdout?: string };
      if (execErr.status === 1 && execErr.stdout) {
        // diff exits 1 when files differ — that's expected
        combinedPatch += execErr.stdout;
      } else if (execErr.status === 2) {
        throw new Error(`diff error for ${relativePath}: diff exited with status 2 (check file permissions or encoding)`);
      } else {
        throw err;
      }
    }
  }

  if (!combinedPatch.trim()) {
    console.log('Diff was empty despite hash changes. Nothing to commit.');
    fs.unlinkSync(pendingPath);
    return;
  }

  // Determine sequence number
  const state = readState();
  const existingCount = state.custom_modifications?.length ?? 0;
  const seqNum = String(existingCount + 1).padStart(3, '0');

  // Sanitize description for filename
  const sanitized = pending.description
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')
    .replace(/^-|-$/g, '');
  const patchFilename = `${seqNum}-${sanitized}.patch`;
  const patchRelPath = path.join(CUSTOM_DIR, patchFilename);
  const patchFullPath = path.join(cwd, patchRelPath);

  fs.writeFileSync(patchFullPath, combinedPatch, 'utf-8');
  recordCustomModification(pending.description, changedFiles, patchRelPath);
  fs.unlinkSync(pendingPath);
}

export function abortCustomize(): void {
  const pendingPath = getPendingPath();
  if (fs.existsSync(pendingPath)) {
    fs.unlinkSync(pendingPath);
  }
}
skills-engine/file-ops.ts (new file, 126 lines)
@@ -0,0 +1,126 @@

import fs from 'fs';
import path from 'path';
import type { FileOperation, FileOpsResult } from './types.js';

function safePath(projectRoot: string, relativePath: string): string | null {
  const resolved = path.resolve(projectRoot, relativePath);
  if (!resolved.startsWith(projectRoot + path.sep) && resolved !== projectRoot) {
    return null;
  }
  return resolved;
}

export function executeFileOps(ops: FileOperation[], projectRoot: string): FileOpsResult {
  const result: FileOpsResult = {
    success: true,
    executed: [],
    warnings: [],
    errors: [],
  };

  const root = path.resolve(projectRoot);

  for (const op of ops) {
    switch (op.type) {
      case 'rename': {
        if (!op.from || !op.to) {
          result.errors.push(`rename: requires 'from' and 'to'`);
          result.success = false;
          return result;
        }
        const fromPath = safePath(root, op.from);
        const toPath = safePath(root, op.to);
        if (!fromPath) {
          result.errors.push(`rename: path escapes project root: ${op.from}`);
          result.success = false;
          return result;
        }
        if (!toPath) {
          result.errors.push(`rename: path escapes project root: ${op.to}`);
          result.success = false;
          return result;
        }
        if (!fs.existsSync(fromPath)) {
          result.errors.push(`rename: source does not exist: ${op.from}`);
          result.success = false;
          return result;
        }
        if (fs.existsSync(toPath)) {
          result.errors.push(`rename: target already exists: ${op.to}`);
          result.success = false;
          return result;
        }
        fs.renameSync(fromPath, toPath);
        result.executed.push(op);
        break;
      }

      case 'delete': {
        if (!op.path) {
          result.errors.push(`delete: requires 'path'`);
          result.success = false;
          return result;
        }
        const delPath = safePath(root, op.path);
        if (!delPath) {
          result.errors.push(`delete: path escapes project root: ${op.path}`);
          result.success = false;
          return result;
        }
        if (!fs.existsSync(delPath)) {
          result.warnings.push(`delete: file does not exist (skipped): ${op.path}`);
          result.executed.push(op);
          break;
        }
        fs.unlinkSync(delPath);
        result.executed.push(op);
        break;
      }

      case 'move': {
        if (!op.from || !op.to) {
          result.errors.push(`move: requires 'from' and 'to'`);
          result.success = false;
          return result;
        }
        const srcPath = safePath(root, op.from);
        const dstPath = safePath(root, op.to);
        if (!srcPath) {
          result.errors.push(`move: path escapes project root: ${op.from}`);
          result.success = false;
          return result;
        }
        if (!dstPath) {
          result.errors.push(`move: path escapes project root: ${op.to}`);
          result.success = false;
          return result;
        }
        if (!fs.existsSync(srcPath)) {
          result.errors.push(`move: source does not exist: ${op.from}`);
          result.success = false;
          return result;
        }
        if (fs.existsSync(dstPath)) {
          result.errors.push(`move: target already exists: ${op.to}`);
          result.success = false;
          return result;
        }
        const dstDir = path.dirname(dstPath);
        if (!fs.existsSync(dstDir)) {
          fs.mkdirSync(dstDir, { recursive: true });
        }
        fs.renameSync(srcPath, dstPath);
        result.executed.push(op);
        break;
      }

      default: {
        result.errors.push(`unknown operation type: ${(op as FileOperation).type}`);
        result.success = false;
        return result;
      }
    }
  }

  return result;
}
skills-engine/fs-utils.ts (new file, 21 lines)
@@ -0,0 +1,21 @@

import fs from 'fs';
import path from 'path';

/**
 * Recursively copy a directory tree from src to dest.
 * Creates destination directories as needed.
 */
export function copyDir(src: string, dest: string): void {
  for (const entry of fs.readdirSync(src, { withFileTypes: true })) {
    const srcPath = path.join(src, entry.name);
    const destPath = path.join(dest, entry.name);

    if (entry.isDirectory()) {
      fs.mkdirSync(destPath, { recursive: true });
      copyDir(srcPath, destPath);
    } else {
      fs.mkdirSync(path.dirname(destPath), { recursive: true });
      fs.copyFileSync(srcPath, destPath);
    }
  }
}
skills-engine/index.ts (new file, 85 lines)
@@ -0,0 +1,85 @@

export { applySkill } from './apply.js';
export { clearBackup, createBackup, restoreBackup } from './backup.js';
export {
  BACKUP_DIR,
  BASE_DIR,
  SKILLS_SCHEMA_VERSION,
  CUSTOM_DIR,
  LOCK_FILE,
  NANOCLAW_DIR,
  RESOLUTIONS_DIR,
  SHIPPED_RESOLUTIONS_DIR,
  STATE_FILE,
} from './constants.js';
export {
  abortCustomize,
  commitCustomize,
  isCustomizeActive,
  startCustomize,
} from './customize.js';
export { executeFileOps } from './file-ops.js';
export { initNanoclawDir } from './init.js';
export { acquireLock, isLocked, releaseLock } from './lock.js';
export {
  checkConflicts,
  checkCoreVersion,
  checkDependencies,
  checkSystemVersion,
  readManifest,
} from './manifest.js';
export {
  cleanupMergeState,
  isGitRepo,
  mergeFile,
  runRerere,
  setupRerereAdapter,
} from './merge.js';
export {
  loadPathRemap,
  recordPathRemap,
  resolvePathRemap,
} from './path-remap.js';
export { rebase } from './rebase.js';
export { findSkillDir, replaySkills } from './replay.js';
export type { ReplayOptions, ReplayResult } from './replay.js';
export { uninstallSkill } from './uninstall.js';
export { initSkillsSystem, migrateExisting } from './migrate.js';
export {
  clearAllResolutions,
  findResolutionDir,
  loadResolutions,
  saveResolution,
} from './resolution-cache.js';
export { applyUpdate, previewUpdate } from './update.js';
export {
  compareSemver,
  computeFileHash,
  getAppliedSkills,
  getCustomModifications,
  readState,
  recordCustomModification,
  recordSkillApplication,
  writeState,
} from './state.js';
export {
  areRangesCompatible,
  mergeDockerComposeServices,
  mergeEnvAdditions,
  mergeNpmDependencies,
  runNpmInstall,
} from './structured.js';
export type {
  AppliedSkill,
  ApplyResult,
  CustomModification,
  FileOpsResult,
  FileOperation,
  MergeResult,
  RebaseResult,
  ResolutionMeta,
  SkillManifest,
  SkillState,
  UninstallResult,
  UpdatePreview,
  UpdateResult,
} from './types.js';
skills-engine/init.ts (new file, 103 lines)
@@ -0,0 +1,103 @@

import { execSync } from 'child_process';
import fs from 'fs';
import path from 'path';

import { BACKUP_DIR, BASE_DIR, NANOCLAW_DIR } from './constants.js';
import { isGitRepo } from './merge.js';
import { writeState } from './state.js';
import { SkillState } from './types.js';

// Top-level paths to include in base snapshot
const BASE_INCLUDES = ['src/', 'package.json', '.env.example', 'container/'];

// Directories/files to always exclude from base snapshot
const BASE_EXCLUDES = [
  'node_modules',
  '.nanoclaw',
  '.git',
  'dist',
  'data',
  'groups',
  'store',
  'logs',
];

export function initNanoclawDir(): void {
  const projectRoot = process.cwd();
  const nanoclawDir = path.join(projectRoot, NANOCLAW_DIR);
  const baseDir = path.join(projectRoot, BASE_DIR);

  // Create structure
  fs.mkdirSync(path.join(projectRoot, BACKUP_DIR), { recursive: true });

  // Clean existing base
  if (fs.existsSync(baseDir)) {
    fs.rmSync(baseDir, { recursive: true, force: true });
  }
  fs.mkdirSync(baseDir, { recursive: true });

  // Snapshot all included paths
  for (const include of BASE_INCLUDES) {
    const srcPath = path.join(projectRoot, include);
    if (!fs.existsSync(srcPath)) continue;

    const destPath = path.join(baseDir, include);
    const stat = fs.statSync(srcPath);

    if (stat.isDirectory()) {
      copyDirFiltered(srcPath, destPath, BASE_EXCLUDES);
    } else {
      fs.mkdirSync(path.dirname(destPath), { recursive: true });
      fs.copyFileSync(srcPath, destPath);
    }
  }

  // Create initial state
  const coreVersion = getCoreVersion(projectRoot);
  const initialState: SkillState = {
    skills_system_version: '0.1.0',
    core_version: coreVersion,
    applied_skills: [],
  };
  writeState(initialState);

  // Enable git rerere if in a git repo
  if (isGitRepo()) {
    try {
      execSync('git config --local rerere.enabled true', { stdio: 'pipe' });
    } catch {
      // Non-fatal
    }
  }
}

function copyDirFiltered(
  src: string,
  dest: string,
  excludes: string[],
): void {
  fs.mkdirSync(dest, { recursive: true });

  for (const entry of fs.readdirSync(src, { withFileTypes: true })) {
    if (excludes.includes(entry.name)) continue;

    const srcPath = path.join(src, entry.name);
    const destPath = path.join(dest, entry.name);

    if (entry.isDirectory()) {
      copyDirFiltered(srcPath, destPath, excludes);
    } else {
      fs.copyFileSync(srcPath, destPath);
    }
  }
}

function getCoreVersion(projectRoot: string): string {
  try {
    const pkgPath = path.join(projectRoot, 'package.json');
    const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf-8'));
    return pkg.version || '0.0.0';
  } catch {
    return '0.0.0';
  }
}
skills-engine/lock.ts (new file, 102 lines)
@@ -0,0 +1,102 @@

import fs from 'fs';
import path from 'path';

import { LOCK_FILE } from './constants.js';

const STALE_TIMEOUT_MS = 5 * 60 * 1000; // 5 minutes

interface LockInfo {
  pid: number;
  timestamp: number;
}

function getLockPath(): string {
  return path.join(process.cwd(), LOCK_FILE);
}

function isStale(lock: LockInfo): boolean {
  return Date.now() - lock.timestamp > STALE_TIMEOUT_MS;
}

function isProcessAlive(pid: number): boolean {
  try {
    process.kill(pid, 0);
    return true;
  } catch {
    return false;
  }
}

export function acquireLock(): () => void {
  const lockPath = getLockPath();
  fs.mkdirSync(path.dirname(lockPath), { recursive: true });

  const lockInfo: LockInfo = { pid: process.pid, timestamp: Date.now() };

  try {
    // Atomic creation — fails if file already exists
    fs.writeFileSync(lockPath, JSON.stringify(lockInfo), { flag: 'wx' });
    return () => releaseLock();
  } catch {
    // Lock file exists — check if it's stale or from a dead process
    try {
      const existing: LockInfo = JSON.parse(
        fs.readFileSync(lockPath, 'utf-8'),
      );
      if (!isStale(existing) && isProcessAlive(existing.pid)) {
        throw new Error(
          `Operation in progress (pid ${existing.pid}, started ${new Date(existing.timestamp).toISOString()}). If this is stale, delete ${LOCK_FILE}`,
        );
      }
      // Stale or dead process — overwrite
    } catch (err) {
      if (
        err instanceof Error &&
        err.message.startsWith('Operation in progress')
      ) {
        throw err;
      }
      // Corrupt or unreadable — overwrite
    }

    try { fs.unlinkSync(lockPath); } catch { /* already gone */ }
    try {
      fs.writeFileSync(lockPath, JSON.stringify(lockInfo), { flag: 'wx' });
    } catch {
      throw new Error('Lock contention: another process acquired the lock. Retry.');
    }
    return () => releaseLock();
  }
}

export function releaseLock(): void {
  const lockPath = getLockPath();
  if (fs.existsSync(lockPath)) {
    try {
      const lock: LockInfo = JSON.parse(fs.readFileSync(lockPath, 'utf-8'));
      // Only release our own lock
      if (lock.pid === process.pid) {
        fs.unlinkSync(lockPath);
      }
    } catch {
      // Corrupt or missing — safe to remove
      try {
        fs.unlinkSync(lockPath);
      } catch {
        // Already gone
      }
    }
  }
}

export function isLocked(): boolean {
  const lockPath = getLockPath();
  if (!fs.existsSync(lockPath)) return false;

  try {
    const lock: LockInfo = JSON.parse(fs.readFileSync(lockPath, 'utf-8'));
    return !isStale(lock) && isProcessAlive(lock.pid);
  } catch {
    return false;
  }
}
skills-engine/manifest.ts (new file, 99 lines)
@@ -0,0 +1,99 @@

import fs from 'fs';
import path from 'path';

import { parse } from 'yaml';

import { SKILLS_SCHEMA_VERSION } from './constants.js';
import { getAppliedSkills, readState, compareSemver } from './state.js';
import { SkillManifest } from './types.js';

export function readManifest(skillDir: string): SkillManifest {
  const manifestPath = path.join(skillDir, 'manifest.yaml');
  if (!fs.existsSync(manifestPath)) {
    throw new Error(`Manifest not found: ${manifestPath}`);
  }

  const content = fs.readFileSync(manifestPath, 'utf-8');
  const manifest = parse(content) as SkillManifest;

  // Validate required fields
  const required = [
    'skill',
    'version',
    'core_version',
    'adds',
    'modifies',
  ] as const;
  for (const field of required) {
    if (manifest[field] === undefined) {
      throw new Error(`Manifest missing required field: ${field}`);
    }
  }

  // Defaults
  manifest.conflicts = manifest.conflicts || [];
  manifest.depends = manifest.depends || [];
  manifest.file_ops = manifest.file_ops || [];

  // Validate paths don't escape project root
  const allPaths = [...manifest.adds, ...manifest.modifies];
  for (const p of allPaths) {
    if (p.includes('..') || path.isAbsolute(p)) {
      throw new Error(`Invalid path in manifest: ${p} (must be relative without "..")`);
    }
  }

  return manifest;
}

export function checkCoreVersion(manifest: SkillManifest): {
  ok: boolean;
  warning?: string;
} {
  const state = readState();
  const cmp = compareSemver(manifest.core_version, state.core_version);
  if (cmp > 0) {
    return {
      ok: true,
      warning: `Skill targets core ${manifest.core_version} but current core is ${state.core_version}. The merge might still work but there's a compatibility risk.`,
    };
  }
  return { ok: true };
}

export function checkDependencies(manifest: SkillManifest): {
  ok: boolean;
  missing: string[];
} {
  const applied = getAppliedSkills();
  const appliedNames = new Set(applied.map((s) => s.name));
  const missing = manifest.depends.filter((dep) => !appliedNames.has(dep));
  return { ok: missing.length === 0, missing };
}

export function checkSystemVersion(manifest: SkillManifest): {
  ok: boolean;
  error?: string;
} {
  if (!manifest.min_skills_system_version) {
    return { ok: true };
  }
  const cmp = compareSemver(manifest.min_skills_system_version, SKILLS_SCHEMA_VERSION);
  if (cmp > 0) {
    return {
      ok: false,
      error: `Skill requires skills system version ${manifest.min_skills_system_version} but current is ${SKILLS_SCHEMA_VERSION}. Update your skills engine.`,
    };
  }
  return { ok: true };
}

export function checkConflicts(manifest: SkillManifest): {
  ok: boolean;
  conflicting: string[];
} {
  const applied = getAppliedSkills();
  const appliedNames = new Set(applied.map((s) => s.name));
  const conflicting = manifest.conflicts.filter((c) => appliedNames.has(c));
  return { ok: conflicting.length === 0, conflicting };
}
skills-engine/merge.ts (new file, 150 lines)
@@ -0,0 +1,150 @@

import { execFileSync, execSync } from 'child_process';
import fs from 'fs';
import path from 'path';

import { MergeResult } from './types.js';

export function isGitRepo(): boolean {
  try {
    execSync('git rev-parse --git-dir', { stdio: 'pipe' });
    return true;
  } catch {
    return false;
  }
}

/**
 * Run git merge-file to three-way merge files.
 * Modifies currentPath in-place.
 * Returns { clean: true, exitCode: 0 } on clean merge,
 * { clean: false, exitCode: N } on conflict (N = number of conflicts).
 */
export function mergeFile(
  currentPath: string,
  basePath: string,
  skillPath: string,
): MergeResult {
  try {
    execFileSync('git', ['merge-file', currentPath, basePath, skillPath], {
      stdio: 'pipe',
    });
    return { clean: true, exitCode: 0 };
  } catch (err: any) {
    const exitCode = err.status ?? 1;
    if (exitCode > 0) {
      // Positive exit code = number of conflicts
      return { clean: false, exitCode };
    }
    // Negative exit code = error
    throw new Error(`git merge-file failed: ${err.message}`);
  }
}

/**
 * Set up unmerged index entries for rerere adapter.
 * Creates stages 1/2/3 so git rerere can record/resolve conflicts.
 */
export function setupRerereAdapter(
  filePath: string,
  baseContent: string,
  oursContent: string,
  theirsContent: string,
): void {
  if (!isGitRepo()) return;

  const gitDir = execSync('git rev-parse --git-dir', {
    encoding: 'utf-8',
  }).trim();

  // Clean up stale MERGE_HEAD from a previous crash
  if (fs.existsSync(path.join(gitDir, 'MERGE_HEAD'))) {
    cleanupMergeState();
  }

  // Hash objects into git object store
  const baseHash = execSync('git hash-object -w --stdin', {
    input: baseContent,
    encoding: 'utf-8',
  }).trim();
  const oursHash = execSync('git hash-object -w --stdin', {
    input: oursContent,
    encoding: 'utf-8',
  }).trim();
  const theirsHash = execSync('git hash-object -w --stdin', {
    input: theirsContent,
    encoding: 'utf-8',
  }).trim();

  // Create unmerged index entries (stages 1/2/3)
  const indexInfo = [
    `100644 ${baseHash} 1\t${filePath}`,
    `100644 ${oursHash} 2\t${filePath}`,
    `100644 ${theirsHash} 3\t${filePath}`,
  ].join('\n');

  execSync('git update-index --index-info', {
    input: indexInfo,
    stdio: ['pipe', 'pipe', 'pipe'],
  });

  // Set MERGE_HEAD and MERGE_MSG (required for rerere)
  const headHash = execSync('git rev-parse HEAD', {
    encoding: 'utf-8',
  }).trim();
  fs.writeFileSync(path.join(gitDir, 'MERGE_HEAD'), headHash + '\n');
  fs.writeFileSync(
    path.join(gitDir, 'MERGE_MSG'),
    `Skill merge: ${filePath}\n`,
  );
}

/**
 * Run git rerere to record or auto-resolve conflicts.
 * When filePath is given, checks that specific file for remaining conflict markers.
 * Returns true if rerere auto-resolved the conflict.
 */
export function runRerere(filePath: string): boolean {
  if (!isGitRepo()) return false;

  try {
    execSync('git rerere', { stdio: 'pipe' });

    // Check if the specific working tree file still has conflict markers.
    // rerere resolves the working tree but does NOT update the index,
    // so checking unmerged index entries would give a false negative.
    const content = fs.readFileSync(filePath, 'utf-8');
    return !content.includes('<<<<<<<');
  } catch {
    return false;
  }
}

/**
 * Clean up merge state after rerere operations.
 * Pass filePath to only reset that file's index entries (preserving user's staged changes).
 */
export function cleanupMergeState(filePath?: string): void {
  if (!isGitRepo()) return;

  const gitDir = execSync('git rev-parse --git-dir', {
    encoding: 'utf-8',
  }).trim();

  // Remove merge markers
  const mergeHead = path.join(gitDir, 'MERGE_HEAD');
  const mergeMsg = path.join(gitDir, 'MERGE_MSG');
  if (fs.existsSync(mergeHead)) fs.unlinkSync(mergeHead);
  if (fs.existsSync(mergeMsg)) fs.unlinkSync(mergeMsg);

  // Reset only the specific file's unmerged index entries to avoid
  // dropping the user's pre-existing staged changes
  try {
    if (filePath) {
      execFileSync('git', ['reset', '--', filePath], { stdio: 'pipe' });
    } else {
      execSync('git reset', { stdio: 'pipe' });
||||
}
|
||||
} catch {
|
||||
// May fail if nothing staged
|
||||
}
|
||||
}
|
||||
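The catch block in `mergeFile` above leans on `git merge-file`'s exit-code convention: 0 for a clean merge, a positive count of conflicting hunks, and a negative value for a hard error. That mapping can be factored out as a pure function; `classifyMergeExit` is a hypothetical helper written for illustration, not part of the engine.

```typescript
// Hypothetical helper factoring out the exit-code mapping used above:
// 0 = clean merge, >0 = number of conflicts, <0 = hard error.
interface MergeExitResult {
  clean: boolean;
  exitCode: number;
}

function classifyMergeExit(exitCode: number): MergeExitResult {
  if (exitCode === 0) return { clean: true, exitCode: 0 };
  if (exitCode > 0) return { clean: false, exitCode }; // conflict count
  throw new Error(`git merge-file failed with exit code ${exitCode}`);
}

console.log(classifyMergeExit(0)); // clean merge
console.log(classifyMergeExit(3)); // three conflicting hunks
```

Keeping the classification pure makes the conflict-vs-error distinction unit-testable without spawning git.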
skills-engine/migrate.ts (new file, 74 lines)
@@ -0,0 +1,74 @@
import { execFileSync } from 'child_process';
import fs from 'fs';
import path from 'path';

import { BASE_DIR, CUSTOM_DIR, NANOCLAW_DIR } from './constants.js';
import { initNanoclawDir } from './init.js';
import { recordCustomModification } from './state.js';

export function initSkillsSystem(): void {
  initNanoclawDir();
  console.log('Skills system initialized. .nanoclaw/ directory created.');
}

export function migrateExisting(): void {
  const projectRoot = process.cwd();

  // First, do a fresh init
  initNanoclawDir();

  // Then, diff current files against base to capture modifications
  const baseSrcDir = path.join(projectRoot, BASE_DIR, 'src');
  const srcDir = path.join(projectRoot, 'src');
  const customDir = path.join(projectRoot, CUSTOM_DIR);
  const patchRelPath = path.join(CUSTOM_DIR, 'migration.patch');

  try {
    let diff: string;
    try {
      diff = execFileSync('diff', ['-ruN', baseSrcDir, srcDir], {
        encoding: 'utf-8',
        maxBuffer: 10 * 1024 * 1024,
      });
    } catch (err: unknown) {
      // diff exits 1 when files differ — that's expected
      const execErr = err as { status?: number; stdout?: string };
      if (execErr.status === 1 && execErr.stdout) {
        diff = execErr.stdout;
      } else {
        throw err;
      }
    }

    if (diff.trim()) {
      fs.mkdirSync(customDir, { recursive: true });
      fs.writeFileSync(
        path.join(projectRoot, patchRelPath),
        diff,
        'utf-8',
      );

      // Extract modified file paths from the diff
      const filesModified = [...diff.matchAll(/^diff -ruN .+ (.+)$/gm)]
        .map((m) => path.relative(projectRoot, m[1]))
        .filter((f) => !f.startsWith('.nanoclaw'));

      // Record in state so the patch is visible to the tracking system
      recordCustomModification(
        'Pre-skills migration',
        filesModified,
        patchRelPath,
      );

      console.log(
        'Custom modifications captured in .nanoclaw/custom/migration.patch',
      );
    } else {
      console.log('No custom modifications detected.');
    }
  } catch {
    console.log('Could not generate diff. Continuing with clean base.');
  }

  console.log('Migration complete. Skills system ready.');
}
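The path-extraction step in `migrateExisting` can be exercised in isolation. The sketch below reproduces just the `matchAll` regex against fabricated `diff -ruN` header lines (the paths in the sample are invented); the regex's greedy `.+ ` backtracks so the capture group grabs the last space-separated token, i.e. the new-side path.

```typescript
// Isolated sketch of the path-extraction step in migrateExisting above.
// The input mimics `diff -ruN` header lines; paths here are invented.
function extractModifiedPaths(diff: string): string[] {
  return [...diff.matchAll(/^diff -ruN .+ (.+)$/gm)].map((m) => m[1]);
}

const sample = [
  'diff -ruN .nanoclaw/base/src/index.ts src/index.ts',
  '--- .nanoclaw/base/src/index.ts',
  '+++ src/index.ts',
  'diff -ruN .nanoclaw/base/src/config.ts src/config.ts',
].join('\n');

console.log(extractModifiedPaths(sample)); // [ 'src/index.ts', 'src/config.ts' ]
```

Note this relies on paths containing no spaces; a path with a space would truncate the captured group at its last space.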
skills-engine/path-remap.ts (new file, 19 lines)
@@ -0,0 +1,19 @@
import { readState, writeState } from './state.js';

export function resolvePathRemap(
  relPath: string,
  remap: Record<string, string>,
): string {
  return remap[relPath] ?? relPath;
}

export function loadPathRemap(): Record<string, string> {
  const state = readState();
  return state.path_remap ?? {};
}

export function recordPathRemap(remap: Record<string, string>): void {
  const state = readState();
  state.path_remap = { ...state.path_remap, ...remap };
  writeState(state);
}
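The remap lookup is a pure pass-through: a path is rewritten only when an explicit mapping exists, otherwise it is returned unchanged. A standalone sketch (the remap table below is hypothetical example data, not a real project mapping):

```typescript
// Standalone sketch mirroring resolvePathRemap above: paths with an explicit
// mapping are rewritten; everything else passes through unchanged.
function resolvePathRemap(
  relPath: string,
  remap: Record<string, string>,
): string {
  return remap[relPath] ?? relPath;
}

// Hypothetical remap: a project that moved its entrypoint.
const remap: Record<string, string> = { 'src/index.ts': 'src/main.ts' };

console.log(resolvePathRemap('src/index.ts', remap));  // remapped
console.log(resolvePathRemap('src/config.ts', remap)); // unchanged
```

Using `??` rather than `||` matters here only if a mapping could be an empty string; with real paths the two behave identically.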
skills-engine/rebase.ts (new file, 293 lines)
@@ -0,0 +1,293 @@
import { execFileSync, execSync } from 'child_process';
import crypto from 'crypto';
import fs from 'fs';
import os from 'os';
import path from 'path';

import { clearBackup, createBackup, restoreBackup } from './backup.js';
import { BASE_DIR, NANOCLAW_DIR } from './constants.js';
import { copyDir } from './fs-utils.js';
import { acquireLock } from './lock.js';
import {
  cleanupMergeState,
  isGitRepo,
  mergeFile,
  runRerere,
  setupRerereAdapter,
} from './merge.js';
import { clearAllResolutions } from './resolution-cache.js';
import { computeFileHash, readState, writeState } from './state.js';
import type { RebaseResult } from './types.js';

function walkDir(dir: string, root: string): string[] {
  const results: string[] = [];
  if (!fs.existsSync(dir)) return results;

  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const fullPath = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      results.push(...walkDir(fullPath, root));
    } else {
      results.push(path.relative(root, fullPath));
    }
  }
  return results;
}
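`walkDir`'s contract is: depth-first recursion, files only (directories themselves are not listed), and every result is relative to `root`. The sketch below demonstrates that against a throwaway temp tree; the file names are made up for the demo.

```typescript
import fs from 'fs';
import os from 'os';
import path from 'path';

// Same shape as the walkDir above: depth-first, files only, root-relative paths.
function walkDir(dir: string, root: string): string[] {
  const results: string[] = [];
  if (!fs.existsSync(dir)) return results;
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const fullPath = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      results.push(...walkDir(fullPath, root));
    } else {
      results.push(path.relative(root, fullPath));
    }
  }
  return results;
}

// Build a throwaway tree: a.txt and sub/b.txt
const tmp = fs.mkdtempSync(path.join(os.tmpdir(), 'walkdir-demo-'));
fs.writeFileSync(path.join(tmp, 'a.txt'), '');
fs.mkdirSync(path.join(tmp, 'sub'));
fs.writeFileSync(path.join(tmp, 'sub', 'b.txt'), '');

const files = walkDir(tmp, tmp).sort();
console.log(files); // a.txt plus the nested file, both root-relative

fs.rmSync(tmp, { recursive: true, force: true });
```

The root-relative results are what lets rebase compare base-dir entries directly against working-tree paths.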
function collectTrackedFiles(
  state: ReturnType<typeof readState>,
): Set<string> {
  const tracked = new Set<string>();

  for (const skill of state.applied_skills) {
    for (const relPath of Object.keys(skill.file_hashes)) {
      tracked.add(relPath);
    }
  }

  if (state.custom_modifications) {
    for (const mod of state.custom_modifications) {
      for (const relPath of mod.files_modified) {
        tracked.add(relPath);
      }
    }
  }

  return tracked;
}
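The tracked set is the union of every path hashed by an applied skill and every path named by a custom modification; duplicates collapse via the `Set`. A minimal sketch with a hand-rolled state shape standing in for `readState()`'s return type (the skill and file names below are fabricated):

```typescript
// Simplified stand-in for the state shape consumed by collectTrackedFiles above.
interface StateSketch {
  applied_skills: { file_hashes: Record<string, string> }[];
  custom_modifications?: { files_modified: string[] }[];
}

// Union of every path tracked by applied skills and custom modifications.
function collectTracked(state: StateSketch): Set<string> {
  const tracked = new Set<string>();
  for (const skill of state.applied_skills) {
    for (const relPath of Object.keys(skill.file_hashes)) tracked.add(relPath);
  }
  for (const mod of state.custom_modifications ?? []) {
    for (const relPath of mod.files_modified) tracked.add(relPath);
  }
  return tracked;
}

const tracked = collectTracked({
  applied_skills: [
    { file_hashes: { 'src/index.ts': 'abc123', 'src/config.ts': 'def456' } },
  ],
  custom_modifications: [{ files_modified: ['src/index.ts', 'README.md'] }],
});
console.log([...tracked].sort()); // src/index.ts appears once despite two sources
```

The union semantics matter downstream: a file is backed up and diffed once even when both a skill and a custom patch touch it.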
export async function rebase(newBasePath?: string): Promise<RebaseResult> {
  const projectRoot = process.cwd();
  const state = readState();

  if (state.applied_skills.length === 0) {
    return {
      success: false,
      filesInPatch: 0,
      error: 'No skills applied. Nothing to rebase.',
    };
  }

  const releaseLock = acquireLock();

  try {
    const trackedFiles = collectTrackedFiles(state);
    const baseAbsDir = path.join(projectRoot, BASE_DIR);

    // Include base dir files
    const baseFiles = walkDir(baseAbsDir, baseAbsDir);
    for (const f of baseFiles) {
      trackedFiles.add(f);
    }

    // Backup
    const filesToBackup: string[] = [];
    for (const relPath of trackedFiles) {
      const absPath = path.join(projectRoot, relPath);
      if (fs.existsSync(absPath)) filesToBackup.push(absPath);
      const baseFilePath = path.join(baseAbsDir, relPath);
      if (fs.existsSync(baseFilePath)) filesToBackup.push(baseFilePath);
    }
    const stateFilePath = path.join(projectRoot, NANOCLAW_DIR, 'state.yaml');
    filesToBackup.push(stateFilePath);
    createBackup(filesToBackup);

    try {
      // Generate unified diff: base vs working tree (archival record)
      let combinedPatch = '';
      let filesInPatch = 0;

      for (const relPath of trackedFiles) {
        const basePath = path.join(baseAbsDir, relPath);
        const workingPath = path.join(projectRoot, relPath);

        const oldPath = fs.existsSync(basePath) ? basePath : '/dev/null';
        const newPath = fs.existsSync(workingPath) ? workingPath : '/dev/null';

        if (oldPath === '/dev/null' && newPath === '/dev/null') continue;

        try {
          const diff = execFileSync('diff', ['-ruN', oldPath, newPath], {
            encoding: 'utf-8',
          });
          if (diff.trim()) {
            combinedPatch += diff;
            filesInPatch++;
          }
        } catch (err: unknown) {
          const execErr = err as { status?: number; stdout?: string };
          if (execErr.status === 1 && execErr.stdout) {
            combinedPatch += execErr.stdout;
            filesInPatch++;
          } else {
            throw err;
          }
        }
      }

      // Save combined patch
      const patchPath = path.join(
        projectRoot,
        NANOCLAW_DIR,
        'combined.patch',
      );
      fs.writeFileSync(patchPath, combinedPatch, 'utf-8');

      if (newBasePath) {
        // --- Rebase with new base: three-way merge with resolution model ---

        // Save current working tree content before overwriting
        const savedContent: Record<string, string> = {};
        for (const relPath of trackedFiles) {
          const workingPath = path.join(projectRoot, relPath);
          if (fs.existsSync(workingPath)) {
            savedContent[relPath] = fs.readFileSync(workingPath, 'utf-8');
          }
        }

        const absNewBase = path.resolve(newBasePath);

        // Replace base
        if (fs.existsSync(baseAbsDir)) {
          fs.rmSync(baseAbsDir, { recursive: true, force: true });
        }
        fs.mkdirSync(baseAbsDir, { recursive: true });
        copyDir(absNewBase, baseAbsDir);

        // Copy new base to working tree
        copyDir(absNewBase, projectRoot);

        // Three-way merge per file: new-base ← old-base → saved-working-tree
        const mergeConflicts: string[] = [];

        for (const relPath of trackedFiles) {
          const newBaseSrc = path.join(absNewBase, relPath);
          const currentPath = path.join(projectRoot, relPath);
          const saved = savedContent[relPath];

          if (!saved) continue; // No working tree content to merge
          if (!fs.existsSync(newBaseSrc)) {
            // File only existed in working tree, not in new base — restore it
            fs.mkdirSync(path.dirname(currentPath), { recursive: true });
            fs.writeFileSync(currentPath, saved);
            continue;
          }

          const newBaseContent = fs.readFileSync(newBaseSrc, 'utf-8');
          if (newBaseContent === saved) continue; // No diff

          // Find old base content from backup
          const oldBasePath = path.join(
            projectRoot,
            '.nanoclaw',
            'backup',
            BASE_DIR,
            relPath,
          );
          if (!fs.existsSync(oldBasePath)) {
            // No old base — keep saved content
            fs.writeFileSync(currentPath, saved);
            continue;
          }

          // Save "ours" (new base content) before merge overwrites it
          const oursContent = newBaseContent;

          // Three-way merge: current(new base) ← old-base → saved(modifications)
          const tmpSaved = path.join(
            os.tmpdir(),
            `nanoclaw-rebase-${crypto.randomUUID()}-${path.basename(relPath)}`,
          );
          fs.writeFileSync(tmpSaved, saved);

          const result = mergeFile(currentPath, oldBasePath, tmpSaved);
          fs.unlinkSync(tmpSaved);

          if (!result.clean) {
            // Try rerere resolution (three-level model)
            if (isGitRepo()) {
              const baseContent = fs.readFileSync(oldBasePath, 'utf-8');
              setupRerereAdapter(relPath, baseContent, oursContent, saved);
              const autoResolved = runRerere(currentPath);

              if (autoResolved) {
                execFileSync('git', ['add', relPath], { stdio: 'pipe' });
                execSync('git rerere', { stdio: 'pipe' });
                cleanupMergeState(relPath);
                continue;
              }

              cleanupMergeState(relPath);
            }

            // Unresolved — conflict markers remain in working tree
            mergeConflicts.push(relPath);
          }
        }

        if (mergeConflicts.length > 0) {
          // Return with backup pending for Claude Code / user resolution
          return {
            success: false,
            patchFile: patchPath,
            filesInPatch,
            mergeConflicts,
            backupPending: true,
            error: `Merge conflicts in: ${mergeConflicts.join(', ')}. Resolve manually then call clearBackup(), or restoreBackup() + clearBackup() to abort.`,
          };
        }
      } else {
        // --- Rebase without new base: flatten into base ---
        // Update base to current working tree state (all skills baked in)
        for (const relPath of trackedFiles) {
          const workingPath = path.join(projectRoot, relPath);
          const basePath = path.join(baseAbsDir, relPath);

          if (fs.existsSync(workingPath)) {
            fs.mkdirSync(path.dirname(basePath), { recursive: true });
            fs.copyFileSync(workingPath, basePath);
          } else if (fs.existsSync(basePath)) {
            // File was removed by skills — remove from base too
            fs.unlinkSync(basePath);
          }
        }
      }

      // Update state
      const now = new Date().toISOString();

      for (const skill of state.applied_skills) {
        const updatedHashes: Record<string, string> = {};
        for (const relPath of Object.keys(skill.file_hashes)) {
          const absPath = path.join(projectRoot, relPath);
          if (fs.existsSync(absPath)) {
            updatedHashes[relPath] = computeFileHash(absPath);
          }
        }
        skill.file_hashes = updatedHashes;
      }

      delete state.custom_modifications;
      state.rebased_at = now;
      writeState(state);

      // Clear stale resolution cache (base has changed, old resolutions invalid)
      clearAllResolutions(projectRoot);

      clearBackup();

      return {
        success: true,
        patchFile: patchPath,
        filesInPatch,
        rebased_at: now,
      };
    } catch (err) {
      restoreBackup();
      clearBackup();
      throw err;
    }
  } finally {
    releaseLock();
  }
}
skills-engine/replay.ts (new file, 309 lines)
@@ -0,0 +1,309 @@
import { execFileSync, execSync } from 'child_process';
import crypto from 'crypto';
import fs from 'fs';
import os from 'os';
import path from 'path';

import { BASE_DIR, NANOCLAW_DIR } from './constants.js';
import { copyDir } from './fs-utils.js';
import { readManifest } from './manifest.js';
import {
  cleanupMergeState,
  isGitRepo,
  mergeFile,
  runRerere,
  setupRerereAdapter,
} from './merge.js';
import { loadPathRemap, resolvePathRemap } from './path-remap.js';
import { loadResolutions } from './resolution-cache.js';
import {
  mergeDockerComposeServices,
  mergeEnvAdditions,
  mergeNpmDependencies,
  runNpmInstall,
} from './structured.js';

export interface ReplayOptions {
  skills: string[];
  skillDirs: Record<string, string>;
  projectRoot?: string;
}

export interface ReplayResult {
  success: boolean;
  perSkill: Record<string, { success: boolean; error?: string }>;
  mergeConflicts?: string[];
  error?: string;
}

/**
 * Scan .claude/skills/ for a directory whose manifest.yaml has skill: <skillName>.
 */
export function findSkillDir(
  skillName: string,
  projectRoot?: string,
): string | null {
  const root = projectRoot ?? process.cwd();
  const skillsRoot = path.join(root, '.claude', 'skills');
  if (!fs.existsSync(skillsRoot)) return null;

  for (const entry of fs.readdirSync(skillsRoot, { withFileTypes: true })) {
    if (!entry.isDirectory()) continue;
    const dir = path.join(skillsRoot, entry.name);
    const manifestPath = path.join(dir, 'manifest.yaml');
    if (!fs.existsSync(manifestPath)) continue;

    try {
      const manifest = readManifest(dir);
      if (manifest.skill === skillName) return dir;
    } catch {
      // Skip invalid manifests
    }
  }

  return null;
}
/**
 * Replay a list of skills from clean base state.
 * Used by uninstall (replay-without) and rebase.
 */
export async function replaySkills(
  options: ReplayOptions,
): Promise<ReplayResult> {
  const projectRoot = options.projectRoot ?? process.cwd();
  const baseDir = path.join(projectRoot, BASE_DIR);
  const pathRemap = loadPathRemap();

  const perSkill: Record<string, { success: boolean; error?: string }> = {};
  const allMergeConflicts: string[] = [];

  // 1. Collect all files touched by any skill in the list
  const allTouchedFiles = new Set<string>();
  for (const skillName of options.skills) {
    const skillDir = options.skillDirs[skillName];
    if (!skillDir) {
      perSkill[skillName] = {
        success: false,
        error: `Skill directory not found for: ${skillName}`,
      };
      return {
        success: false,
        perSkill,
        error: `Missing skill directory for: ${skillName}`,
      };
    }

    const manifest = readManifest(skillDir);
    for (const f of manifest.adds) allTouchedFiles.add(f);
    for (const f of manifest.modifies) allTouchedFiles.add(f);
  }

  // 2. Reset touched files to clean base
  for (const relPath of allTouchedFiles) {
    const resolvedPath = resolvePathRemap(relPath, pathRemap);
    const currentPath = path.join(projectRoot, resolvedPath);
    const basePath = path.join(baseDir, resolvedPath);

    if (fs.existsSync(basePath)) {
      // Restore from base
      fs.mkdirSync(path.dirname(currentPath), { recursive: true });
      fs.copyFileSync(basePath, currentPath);
    } else if (fs.existsSync(currentPath)) {
      // Add-only file not in base — remove it
      fs.unlinkSync(currentPath);
    }
  }

  // 3. Load pre-computed resolutions into git's rr-cache before replaying
  // Pass the last skill's dir — it's the one applied on top, producing conflicts
  const lastSkillDir = options.skills.length > 0
    ? options.skillDirs[options.skills[options.skills.length - 1]]
    : undefined;
  loadResolutions(options.skills, projectRoot, lastSkillDir);

  // Replay each skill in order.
  // Collect structured ops for batch application.
  const allNpmDeps: Record<string, string> = {};
  const allEnvAdditions: string[] = [];
  const allDockerServices: Record<string, unknown> = {};
  let hasNpmDeps = false;

  for (const skillName of options.skills) {
    const skillDir = options.skillDirs[skillName];
    try {
      const manifest = readManifest(skillDir);

      // Execute file_ops
      if (manifest.file_ops && manifest.file_ops.length > 0) {
        const { executeFileOps } = await import('./file-ops.js');
        const fileOpsResult = executeFileOps(manifest.file_ops, projectRoot);
        if (!fileOpsResult.success) {
          perSkill[skillName] = {
            success: false,
            error: `File operations failed: ${fileOpsResult.errors.join('; ')}`,
          };
          return {
            success: false,
            perSkill,
            error: `File ops failed for ${skillName}`,
          };
        }
      }

      // Copy add/ files
      const addDir = path.join(skillDir, 'add');
      if (fs.existsSync(addDir)) {
        for (const relPath of manifest.adds) {
          const resolvedDest = resolvePathRemap(relPath, pathRemap);
          const destPath = path.join(projectRoot, resolvedDest);
          const srcPath = path.join(addDir, relPath);
          if (fs.existsSync(srcPath)) {
            fs.mkdirSync(path.dirname(destPath), { recursive: true });
            fs.copyFileSync(srcPath, destPath);
          }
        }
      }

      // Three-way merge modify/ files
      const skillConflicts: string[] = [];

      for (const relPath of manifest.modifies) {
        const resolvedPath = resolvePathRemap(relPath, pathRemap);
        const currentPath = path.join(projectRoot, resolvedPath);
        const basePath = path.join(baseDir, resolvedPath);
        const skillPath = path.join(skillDir, 'modify', relPath);

        if (!fs.existsSync(skillPath)) {
          skillConflicts.push(relPath);
          continue;
        }

        if (!fs.existsSync(currentPath)) {
          fs.mkdirSync(path.dirname(currentPath), { recursive: true });
          fs.copyFileSync(skillPath, currentPath);
          continue;
        }

        if (!fs.existsSync(basePath)) {
          fs.mkdirSync(path.dirname(basePath), { recursive: true });
          fs.copyFileSync(currentPath, basePath);
        }

        const oursContent = fs.readFileSync(currentPath, 'utf-8');
        const tmpCurrent = path.join(
          os.tmpdir(),
          `nanoclaw-replay-${crypto.randomUUID()}-${path.basename(relPath)}`,
        );
        fs.copyFileSync(currentPath, tmpCurrent);

        const result = mergeFile(tmpCurrent, basePath, skillPath);

        if (result.clean) {
          fs.copyFileSync(tmpCurrent, currentPath);
          fs.unlinkSync(tmpCurrent);
        } else {
          fs.copyFileSync(tmpCurrent, currentPath);
          fs.unlinkSync(tmpCurrent);

          if (isGitRepo()) {
            const baseContent = fs.readFileSync(basePath, 'utf-8');
            const theirsContent = fs.readFileSync(skillPath, 'utf-8');

            setupRerereAdapter(
              resolvedPath,
              baseContent,
              oursContent,
              theirsContent,
            );
            const autoResolved = runRerere(currentPath);

            if (autoResolved) {
              execFileSync('git', ['add', resolvedPath], { stdio: 'pipe' });
              execSync('git rerere', { stdio: 'pipe' });
              cleanupMergeState(resolvedPath);
              continue;
            }

            cleanupMergeState(resolvedPath);
          }

          skillConflicts.push(resolvedPath);
        }
      }

      if (skillConflicts.length > 0) {
        allMergeConflicts.push(...skillConflicts);
        perSkill[skillName] = {
          success: false,
          error: `Merge conflicts: ${skillConflicts.join(', ')}`,
        };
        // Stop on first conflict — later skills would merge against conflict markers
        break;
      } else {
        perSkill[skillName] = { success: true };
      }

      // Collect structured ops
      if (manifest.structured?.npm_dependencies) {
        Object.assign(allNpmDeps, manifest.structured.npm_dependencies);
        hasNpmDeps = true;
      }
      if (manifest.structured?.env_additions) {
        allEnvAdditions.push(...manifest.structured.env_additions);
      }
      if (manifest.structured?.docker_compose_services) {
        Object.assign(
          allDockerServices,
          manifest.structured.docker_compose_services,
        );
      }
    } catch (err) {
      perSkill[skillName] = {
        success: false,
        error: err instanceof Error ? err.message : String(err),
      };
      return {
        success: false,
        perSkill,
        error: `Replay failed for ${skillName}: ${err instanceof Error ? err.message : String(err)}`,
      };
    }
  }

  if (allMergeConflicts.length > 0) {
    return {
      success: false,
      perSkill,
      mergeConflicts: allMergeConflicts,
      error: `Unresolved merge conflicts: ${allMergeConflicts.join(', ')}`,
    };
  }

  // 4. Apply aggregated structured operations (only if no conflicts)
  if (hasNpmDeps) {
    const pkgPath = path.join(projectRoot, 'package.json');
    mergeNpmDependencies(pkgPath, allNpmDeps);
  }

  if (allEnvAdditions.length > 0) {
    const envPath = path.join(projectRoot, '.env.example');
    mergeEnvAdditions(envPath, allEnvAdditions);
  }

  if (Object.keys(allDockerServices).length > 0) {
    const composePath = path.join(projectRoot, 'docker-compose.yml');
    mergeDockerComposeServices(composePath, allDockerServices);
  }

  // 5. Run npm install if any deps
  if (hasNpmDeps) {
    try {
      runNpmInstall();
    } catch {
      // npm install failure is non-fatal for replay
    }
  }

  return { success: true, perSkill };
}
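The structured-ops aggregation in `replaySkills` is last-writer-wins for npm dependencies and docker services (via `Object.assign`) and append-only for env additions. A pure sketch of that strategy, with fabricated manifest fragments (the package names and versions are examples only):

```typescript
// Sketch of the aggregation strategy in replaySkills above: later skills win
// on duplicate npm deps, env additions simply accumulate in order.
// The manifest objects here are fabricated examples.
interface StructuredOps {
  npm_dependencies?: Record<string, string>;
  env_additions?: string[];
}

function aggregateStructured(manifests: StructuredOps[]) {
  const npmDeps: Record<string, string> = {};
  const envAdditions: string[] = [];
  for (const m of manifests) {
    if (m.npm_dependencies) Object.assign(npmDeps, m.npm_dependencies);
    if (m.env_additions) envAdditions.push(...m.env_additions);
  }
  return { npmDeps, envAdditions };
}

const { npmDeps, envAdditions } = aggregateStructured([
  { npm_dependencies: { 'discord.js': '^14.0.0' }, env_additions: ['DISCORD_TOKEN='] },
  { npm_dependencies: { 'discord.js': '^14.1.0', telegraf: '^4.0.0' }, env_additions: ['TELEGRAM_TOKEN='] },
]);
console.log(npmDeps); // the later version wins for discord.js
```

Batching like this means `npm install` runs once per replay rather than once per skill, and only when every merge came out clean.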
skills-engine/resolution-cache.ts (new file, 269 lines)
@@ -0,0 +1,269 @@
import { execSync } from 'child_process';
import fs from 'fs';
import path from 'path';
import { parse, stringify } from 'yaml';

import { NANOCLAW_DIR, RESOLUTIONS_DIR, SHIPPED_RESOLUTIONS_DIR } from './constants.js';
import { computeFileHash } from './state.js';
import { FileInputHashes, ResolutionMeta } from './types.js';

/**
 * Build the resolution directory key from a set of skill identifiers.
 * Skills are sorted alphabetically and joined with "+".
 */
function resolutionKey(skills: string[]): string {
  return [...skills].sort().join('+');
}
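Because the skills are sorted before joining, the key is order-insensitive: any permutation of the same skill set resolves to the same cache directory. A quick sketch (skill names below are the add-discord/add-telegram packages from this PR, used as example data):

```typescript
// Mirrors resolutionKey above: sort a copy, join with "+".
// Order-insensitive, so every permutation of one skill set hits one cache dir.
function resolutionKey(skills: string[]): string {
  return [...skills].sort().join('+');
}

console.log(resolutionKey(['add-telegram', 'add-discord'])); // add-discord+add-telegram
console.log(resolutionKey(['add-discord', 'add-telegram'])); // same key
```

The spread copy (`[...skills]`) also keeps the caller's array unmutated, since `Array.prototype.sort` sorts in place.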
/**
 * Find the resolution directory for a given skill combination.
 * Returns the absolute path if it exists, null otherwise.
 */
export function findResolutionDir(
  skills: string[],
  projectRoot: string,
): string | null {
  const key = resolutionKey(skills);

  // Check shipped resolutions (.claude/resolutions/) first, then project-level
  for (const baseDir of [SHIPPED_RESOLUTIONS_DIR, RESOLUTIONS_DIR]) {
    const dir = path.join(projectRoot, baseDir, key);
    if (fs.existsSync(dir)) {
      return dir;
    }
  }
  return null;
}

/**
 * Load cached resolutions into the local git rerere cache.
 * Verifies file_hashes from meta.yaml match before loading each pair.
 * Returns true if loaded successfully, false if not found or no pairs loaded.
 */
export function loadResolutions(
  skills: string[],
  projectRoot: string,
  skillDir: string,
): boolean {
  const resDir = findResolutionDir(skills, projectRoot);
  if (!resDir) return false;

  const metaPath = path.join(resDir, 'meta.yaml');
  if (!fs.existsSync(metaPath)) return false;

  let meta: ResolutionMeta;
  try {
    meta = parse(fs.readFileSync(metaPath, 'utf-8')) as ResolutionMeta;
  } catch {
    return false;
  }

  if (!meta.input_hashes) return false;

  // Find all preimage/resolution pairs
  const pairs = findPreimagePairs(resDir, resDir);
  if (pairs.length === 0) return false;

  // Get the git directory
  let gitDir: string;
  try {
    gitDir = execSync('git rev-parse --git-dir', {
      encoding: 'utf-8',
      cwd: projectRoot,
    }).trim();
    if (!path.isAbsolute(gitDir)) {
      gitDir = path.join(projectRoot, gitDir);
    }
  } catch {
    return false;
  }

  const rrCacheDir = path.join(gitDir, 'rr-cache');
  let loadedAny = false;

  for (const { relPath, preimage, resolution } of pairs) {
    // Verify file_hashes — skip pair if hashes don't match
    const expected = meta.file_hashes?.[relPath];
    if (!expected) {
      console.log(`resolution-cache: skipping ${relPath} — no file_hashes in meta`);
      continue;
    }

    const basePath = path.join(projectRoot, NANOCLAW_DIR, 'base', relPath);
    const currentPath = path.join(projectRoot, relPath);
    const skillModifyPath = path.join(skillDir, 'modify', relPath);

    if (!fs.existsSync(basePath) || !fs.existsSync(currentPath) || !fs.existsSync(skillModifyPath)) {
      console.log(`resolution-cache: skipping ${relPath} — input files not found`);
      continue;
    }

    const baseHash = computeFileHash(basePath);
    if (baseHash !== expected.base) {
      console.log(`resolution-cache: skipping ${relPath} — base hash mismatch`);
      continue;
    }

    const currentHash = computeFileHash(currentPath);
    if (currentHash !== expected.current) {
      console.log(`resolution-cache: skipping ${relPath} — current hash mismatch`);
      continue;
    }

    const skillHash = computeFileHash(skillModifyPath);
    if (skillHash !== expected.skill) {
      console.log(`resolution-cache: skipping ${relPath} — skill hash mismatch`);
      continue;
    }

    const preimageContent = fs.readFileSync(preimage, 'utf-8');
    const resolutionContent = fs.readFileSync(resolution, 'utf-8');

    // Git rerere uses its own internal hash format (not git hash-object).
    // We store the rerere hash in a .hash sidecar next to the preimage,
    // captured when saveResolution() reads the actual rr-cache after rerere records it.
    const hashSidecar = preimage + '.hash';
    if (!fs.existsSync(hashSidecar)) {
      // No hash recorded — skip this pair (legacy format)
      continue;
    }
    const hash = fs.readFileSync(hashSidecar, 'utf-8').trim();
    if (!hash) continue;

    // Create rr-cache entry
    const cacheDir = path.join(rrCacheDir, hash);
    fs.mkdirSync(cacheDir, { recursive: true });
    fs.writeFileSync(path.join(cacheDir, 'preimage'), preimageContent);
    fs.writeFileSync(path.join(cacheDir, 'postimage'), resolutionContent);
    loadedAny = true;
  }

  return loadedAny;
}

/**
 * Save conflict resolutions to the resolution cache.
 */
export function saveResolution(
  skills: string[],
  files: { relPath: string; preimage: string; resolution: string; inputHashes: FileInputHashes }[],
  meta: Partial<ResolutionMeta>,
  projectRoot: string,
): void {
  const key = resolutionKey(skills);
  const resDir = path.join(projectRoot, RESOLUTIONS_DIR, key);

  // Get the git rr-cache directory to find actual rerere hashes
  let rrCacheDir: string | null = null;
  try {
    let gitDir = execSync('git rev-parse --git-dir', {
      encoding: 'utf-8',
      cwd: projectRoot,
    }).trim();
    if (!path.isAbsolute(gitDir)) {
      gitDir = path.join(projectRoot, gitDir);
    }
    rrCacheDir = path.join(gitDir, 'rr-cache');
  } catch {
    // Not a git repo — skip hash capture
  }

  // Write preimage/resolution pairs
  for (const file of files) {
    const preimagePath = path.join(resDir, file.relPath + '.preimage');
    const resolutionPath = path.join(resDir, file.relPath + '.resolution');

    fs.mkdirSync(path.dirname(preimagePath), { recursive: true });
    fs.writeFileSync(preimagePath, file.preimage);
    fs.writeFileSync(resolutionPath, file.resolution);

    // Capture the actual rerere hash by finding the rr-cache entry
    // whose preimage matches ours
    if (rrCacheDir && fs.existsSync(rrCacheDir)) {
      const rerereHash = findRerereHash(rrCacheDir, file.preimage);
      if (rerereHash) {
        fs.writeFileSync(preimagePath + '.hash', rerereHash);
      }
    }
  }

  // Collect file_hashes from individual files
  const fileHashes: Record<string, FileInputHashes> = {};
  for (const file of files) {
    fileHashes[file.relPath] = file.inputHashes;
  }

  // Build full meta with defaults
  const fullMeta: ResolutionMeta = {
    skills: [...skills].sort(),
    apply_order: meta.apply_order ?? skills,
    core_version: meta.core_version ?? '',
    resolved_at: meta.resolved_at ?? new Date().toISOString(),
    tested: meta.tested ?? false,
    test_passed: meta.test_passed ?? false,
    resolution_source: meta.resolution_source ?? 'user',
    input_hashes: meta.input_hashes ?? {},
    output_hash: meta.output_hash ?? '',
    file_hashes: { ...fileHashes, ...meta.file_hashes },
  };

  fs.writeFileSync(path.join(resDir, 'meta.yaml'), stringify(fullMeta));
}

/**
 * Remove all resolution cache entries.
 * Called after rebase since the base has changed and old resolutions are invalid.
 */
export function clearAllResolutions(projectRoot: string): void {
  const resDir = path.join(projectRoot, RESOLUTIONS_DIR);
  if (fs.existsSync(resDir)) {
    fs.rmSync(resDir, { recursive: true, force: true });
    fs.mkdirSync(resDir, { recursive: true });
  }
}

/**
 * Recursively find preimage/resolution pairs in a directory.
 */
||||
function findPreimagePairs(
|
||||
dir: string,
|
||||
baseDir: string,
|
||||
): { relPath: string; preimage: string; resolution: string }[] {
|
||||
const pairs: { relPath: string; preimage: string; resolution: string }[] = [];
|
||||
|
||||
for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
|
||||
const fullPath = path.join(dir, entry.name);
|
||||
|
||||
if (entry.isDirectory()) {
|
||||
pairs.push(...findPreimagePairs(fullPath, baseDir));
|
||||
} else if (entry.name.endsWith('.preimage') && !entry.name.endsWith('.preimage.hash')) {
|
||||
const resolutionPath = fullPath.replace(/\.preimage$/, '.resolution');
|
||||
if (fs.existsSync(resolutionPath)) {
|
||||
const relPath = path.relative(baseDir, fullPath).replace(/\.preimage$/, '');
|
||||
pairs.push({ relPath, preimage: fullPath, resolution: resolutionPath });
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return pairs;
|
||||
}
|
||||
|
||||
/**
|
||||
* Find the rerere hash for a given preimage by scanning rr-cache entries.
|
||||
* Returns the directory name (hash) whose preimage matches the given content.
|
||||
*/
|
||||
function findRerereHash(rrCacheDir: string, preimageContent: string): string | null {
|
||||
if (!fs.existsSync(rrCacheDir)) return null;
|
||||
|
||||
for (const entry of fs.readdirSync(rrCacheDir, { withFileTypes: true })) {
|
||||
if (!entry.isDirectory()) continue;
|
||||
const preimagePath = path.join(rrCacheDir, entry.name, 'preimage');
|
||||
if (fs.existsSync(preimagePath)) {
|
||||
const content = fs.readFileSync(preimagePath, 'utf-8');
|
||||
if (content === preimageContent) {
|
||||
return entry.name;
|
||||
}
|
||||
}
|
||||
}
|
||||
return null;
|
||||
}
|
||||
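The rr-cache lookup relies on scanning entry directories and comparing preimage content byte-for-byte, since the rerere hash cannot be recomputed externally. A minimal runnable sketch of that scan-by-content strategy, using a throwaway temp directory and a hypothetical hash value:

```typescript
import fs from 'fs';
import os from 'os';
import path from 'path';

// Hypothetical rr-cache layout: one directory per rerere hash, each holding a
// `preimage` (conflict markers) and, once resolved, a `postimage`.
const rrCacheDir = fs.mkdtempSync(path.join(os.tmpdir(), 'rr-cache-'));
const conflict = '<<<<<<<\nours\n=======\ntheirs\n>>>>>>>\n';
fs.mkdirSync(path.join(rrCacheDir, '4b825dc6'));
fs.writeFileSync(path.join(rrCacheDir, '4b825dc6', 'preimage'), conflict);

// Same strategy as findRerereHash above: return the directory name whose
// preimage content matches exactly.
function lookupHash(cacheDir: string, preimageContent: string): string | null {
  for (const entry of fs.readdirSync(cacheDir, { withFileTypes: true })) {
    if (!entry.isDirectory()) continue;
    const p = path.join(cacheDir, entry.name, 'preimage');
    if (fs.existsSync(p) && fs.readFileSync(p, 'utf-8') === preimageContent) {
      return entry.name;
    }
  }
  return null;
}

console.log(lookupHash(rrCacheDir, conflict)); // 4b825dc6
```

This is a sketch, not the engine's code: content equality is exact, so even a trailing-newline difference in the preimage misses the cache.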
115
skills-engine/state.ts
Normal file
@@ -0,0 +1,115 @@
import crypto from 'crypto';
import fs from 'fs';
import path from 'path';

import { parse, stringify } from 'yaml';

import { SKILLS_SCHEMA_VERSION, NANOCLAW_DIR, STATE_FILE } from './constants.js';
import { AppliedSkill, CustomModification, SkillState } from './types.js';

function getStatePath(): string {
  return path.join(process.cwd(), NANOCLAW_DIR, STATE_FILE);
}

export function readState(): SkillState {
  const statePath = getStatePath();
  if (!fs.existsSync(statePath)) {
    throw new Error(
      '.nanoclaw/state.yaml not found. Run initSkillsSystem() first.',
    );
  }
  const content = fs.readFileSync(statePath, 'utf-8');
  const state = parse(content) as SkillState;

  if (compareSemver(state.skills_system_version, SKILLS_SCHEMA_VERSION) > 0) {
    throw new Error(
      `state.yaml version ${state.skills_system_version} is newer than tooling version ${SKILLS_SCHEMA_VERSION}. Update your skills engine.`,
    );
  }

  return state;
}

export function writeState(state: SkillState): void {
  const statePath = getStatePath();
  fs.mkdirSync(path.dirname(statePath), { recursive: true });
  const content = stringify(state, { sortMapEntries: true });
  // Write to temp file then atomic rename to prevent corruption on crash
  const tmpPath = statePath + '.tmp';
  fs.writeFileSync(tmpPath, content, 'utf-8');
  fs.renameSync(tmpPath, statePath);
}

export function recordSkillApplication(
  skillName: string,
  version: string,
  fileHashes: Record<string, string>,
  structuredOutcomes?: Record<string, unknown>,
): void {
  const state = readState();

  // Remove previous application of same skill if exists
  state.applied_skills = state.applied_skills.filter(
    (s) => s.name !== skillName,
  );

  state.applied_skills.push({
    name: skillName,
    version,
    applied_at: new Date().toISOString(),
    file_hashes: fileHashes,
    structured_outcomes: structuredOutcomes,
  });

  writeState(state);
}

export function getAppliedSkills(): AppliedSkill[] {
  const state = readState();
  return state.applied_skills;
}

export function recordCustomModification(
  description: string,
  filesModified: string[],
  patchFile: string,
): void {
  const state = readState();

  if (!state.custom_modifications) {
    state.custom_modifications = [];
  }

  const mod: CustomModification = {
    description,
    applied_at: new Date().toISOString(),
    files_modified: filesModified,
    patch_file: patchFile,
  };

  state.custom_modifications.push(mod);
  writeState(state);
}

export function getCustomModifications(): CustomModification[] {
  const state = readState();
  return state.custom_modifications || [];
}

export function computeFileHash(filePath: string): string {
  const content = fs.readFileSync(filePath);
  return crypto.createHash('sha256').update(content).digest('hex');
}

/**
 * Compare two semver strings. Returns negative if a < b, 0 if equal, positive if a > b.
 */
export function compareSemver(a: string, b: string): number {
  const partsA = a.split('.').map(Number);
  const partsB = b.split('.').map(Number);
  for (let i = 0; i < Math.max(partsA.length, partsB.length); i++) {
    const diff = (partsA[i] || 0) - (partsB[i] || 0);
    if (diff !== 0) return diff;
  }
  return 0;
}
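compareSemver does purely numeric, component-wise comparison — missing components count as 0, and non-numeric components (e.g. a `-beta` suffix, which `Number` turns into `NaN`) also collapse to 0. A standalone check of that behavior:

```typescript
// Same numeric comparison as compareSemver in state.ts: compares
// dot-separated components left to right, treating missing (or
// non-numeric) components as 0.
function compareSemver(a: string, b: string): number {
  const partsA = a.split('.').map(Number);
  const partsB = b.split('.').map(Number);
  for (let i = 0; i < Math.max(partsA.length, partsB.length); i++) {
    const diff = (partsA[i] || 0) - (partsB[i] || 0);
    if (diff !== 0) return diff;
  }
  return 0;
}

console.log(compareSemver('1.2.0', '1.10.0') < 0); // true: 2 < 10 numerically
console.log(compareSemver('1.2', '1.2.0'));        // 0: missing parts are 0
```

This is fine for the plain `x.y.z` versions the state file uses; it is not a full semver comparator (prerelease precedence is ignored).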
196
skills-engine/structured.ts
Normal file
@@ -0,0 +1,196 @@
import { execSync } from 'child_process';
import fs from 'fs';
import { parse, stringify } from 'yaml';

interface PackageJson {
  dependencies?: Record<string, string>;
  devDependencies?: Record<string, string>;
  [key: string]: unknown;
}

interface DockerComposeFile {
  version?: string;
  services?: Record<string, unknown>;
  [key: string]: unknown;
}

function compareVersionParts(a: string[], b: string[]): number {
  const len = Math.max(a.length, b.length);
  for (let i = 0; i < len; i++) {
    const aNum = parseInt(a[i] ?? '0', 10);
    const bNum = parseInt(b[i] ?? '0', 10);
    if (aNum !== bNum) return aNum - bNum;
  }
  return 0;
}

export function areRangesCompatible(
  existing: string,
  requested: string,
): { compatible: boolean; resolved: string } {
  if (existing === requested) {
    return { compatible: true, resolved: existing };
  }

  // Both start with ^
  if (existing.startsWith('^') && requested.startsWith('^')) {
    const eParts = existing.slice(1).split('.');
    const rParts = requested.slice(1).split('.');
    if (eParts[0] !== rParts[0]) {
      return { compatible: false, resolved: existing };
    }
    // Same major — take the higher version
    const resolved =
      compareVersionParts(eParts, rParts) >= 0 ? existing : requested;
    return { compatible: true, resolved };
  }

  // Both start with ~
  if (existing.startsWith('~') && requested.startsWith('~')) {
    const eParts = existing.slice(1).split('.');
    const rParts = requested.slice(1).split('.');
    if (eParts[0] !== rParts[0] || eParts[1] !== rParts[1]) {
      return { compatible: false, resolved: existing };
    }
    // Same major.minor — take higher patch
    const resolved =
      compareVersionParts(eParts, rParts) >= 0 ? existing : requested;
    return { compatible: true, resolved };
  }

  // Mismatched prefixes or anything else (exact, >=, *, etc.)
  return { compatible: false, resolved: existing };
}
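The caret rule above — same major means compatible, resolve to the higher range — can be condensed into a small standalone sketch. Note this is deliberately simpler than full npm semver (it does not special-case `^0.x`, where npm treats different minors as incompatible):

```typescript
// Condensed re-sketch of the caret-range rule from areRangesCompatible:
// two ^ranges merge iff they share a major version, resolving to the
// higher of the two; anything else is reported as incompatible (null).
function mergeCaretRanges(a: string, b: string): string | null {
  if (!a.startsWith('^') || !b.startsWith('^')) return null;
  const pa = a.slice(1).split('.').map((n) => parseInt(n, 10));
  const pb = b.slice(1).split('.').map((n) => parseInt(n, 10));
  if (pa[0] !== pb[0]) return null; // different majors: incompatible
  for (let i = 0; i < Math.max(pa.length, pb.length); i++) {
    const d = (pa[i] ?? 0) - (pb[i] ?? 0);
    if (d !== 0) return d > 0 ? a : b;
  }
  return a; // identical numerically
}

console.log(mergeCaretRanges('^4.17.0', '^4.18.2')); // ^4.18.2
console.log(mergeCaretRanges('^4.17.0', '^5.0.0'));  // null
```

Keeping the rule conservative means ambiguous cases (mixed `^`/`~`, exact pins, `>=`, `*`) surface as hard dependency conflicts instead of silently picking a version.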

export function mergeNpmDependencies(
  packageJsonPath: string,
  newDeps: Record<string, string>,
): void {
  const content = fs.readFileSync(packageJsonPath, 'utf-8');
  const pkg: PackageJson = JSON.parse(content);

  pkg.dependencies = pkg.dependencies || {};

  for (const [name, version] of Object.entries(newDeps)) {
    // Check both dependencies and devDependencies to avoid duplicates
    const existing = pkg.dependencies[name] ?? pkg.devDependencies?.[name];
    if (existing && existing !== version) {
      const result = areRangesCompatible(existing, version);
      if (!result.compatible) {
        throw new Error(
          `Dependency conflict: ${name} is already at ${existing}, skill wants ${version}`,
        );
      }
      pkg.dependencies[name] = result.resolved;
    } else {
      pkg.dependencies[name] = version;
    }
  }

  // Sort dependencies for deterministic output
  pkg.dependencies = Object.fromEntries(
    Object.entries(pkg.dependencies).sort(([a], [b]) => a.localeCompare(b)),
  );

  if (pkg.devDependencies) {
    pkg.devDependencies = Object.fromEntries(
      Object.entries(pkg.devDependencies).sort(([a], [b]) => a.localeCompare(b)),
    );
  }

  fs.writeFileSync(
    packageJsonPath,
    JSON.stringify(pkg, null, 2) + '\n',
    'utf-8',
  );
}

export function mergeEnvAdditions(
  envExamplePath: string,
  additions: string[],
): void {
  let content = '';
  if (fs.existsSync(envExamplePath)) {
    content = fs.readFileSync(envExamplePath, 'utf-8');
  }

  const existingVars = new Set<string>();
  for (const line of content.split('\n')) {
    const match = line.match(/^([A-Za-z_][A-Za-z0-9_]*)=/);
    if (match) existingVars.add(match[1]);
  }

  const newVars = additions.filter((v) => !existingVars.has(v));
  if (newVars.length === 0) return;

  if (content && !content.endsWith('\n')) content += '\n';
  content += '\n# Added by skill\n';
  for (const v of newVars) {
    content += `${v}=\n`;
  }

  fs.writeFileSync(envExamplePath, content, 'utf-8');
}

function extractHostPort(portMapping: string): string | null {
  const str = String(portMapping);
  const parts = str.split(':');
  if (parts.length >= 2) {
    return parts[0];
  }
  return null;
}

export function mergeDockerComposeServices(
  composePath: string,
  services: Record<string, unknown>,
): void {
  let compose: DockerComposeFile;

  if (fs.existsSync(composePath)) {
    const content = fs.readFileSync(composePath, 'utf-8');
    compose = (parse(content) as DockerComposeFile) || {};
  } else {
    compose = { version: '3' };
  }

  compose.services = compose.services || {};

  // Collect host ports from existing services
  const usedPorts = new Set<string>();
  for (const [, svc] of Object.entries(compose.services)) {
    const service = svc as Record<string, unknown>;
    if (Array.isArray(service.ports)) {
      for (const p of service.ports) {
        const host = extractHostPort(String(p));
        if (host) usedPorts.add(host);
      }
    }
  }

  // Add new services, checking for port collisions
  for (const [name, definition] of Object.entries(services)) {
    if (compose.services[name]) continue; // skip existing

    const svc = definition as Record<string, unknown>;
    if (Array.isArray(svc.ports)) {
      for (const p of svc.ports) {
        const host = extractHostPort(String(p));
        if (host && usedPorts.has(host)) {
          throw new Error(
            `Port collision: host port ${host} from service "${name}" is already in use`,
          );
        }
        if (host) usedPorts.add(host);
      }
    }

    compose.services[name] = definition;
  }

  fs.writeFileSync(composePath, stringify(compose), 'utf-8');
}

export function runNpmInstall(): void {
  execSync('npm install', { stdio: 'inherit', cwd: process.cwd() });
}
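mergeEnvAdditions treats a variable as present only if a line starts with `NAME=`, so commented-out entries do not block a skill from adding the variable. A self-contained sketch of that detection (file contents are illustrative):

```typescript
// Same KEY= detection used by mergeEnvAdditions in structured.ts: a
// variable is "present" iff some line starts with NAME= (values and
// commented lines are ignored).
const envContent = 'API_KEY=abc\n# DISCORD_TOKEN=commented out\nPORT=3000\n';

const existingVars = new Set<string>();
for (const line of envContent.split('\n')) {
  const match = line.match(/^([A-Za-z_][A-Za-z0-9_]*)=/);
  if (match) existingVars.add(match[1]);
}

const additions = ['DISCORD_TOKEN', 'PORT'];
const newVars = additions.filter((v) => !existingVars.has(v));
console.log(newVars); // [ 'DISCORD_TOKEN' ] — the commented line doesn't count
```

This makes env merging idempotent: re-applying a skill appends nothing once its variables exist.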
16
skills-engine/tsconfig.json
Normal file
@@ -0,0 +1,16 @@
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "lib": ["ES2022"],
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "resolveJsonModule": true,
    "noEmit": true
  },
  "include": ["**/*.ts"],
  "exclude": ["__tests__"]
}
134
skills-engine/types.ts
Normal file
@@ -0,0 +1,134 @@
export interface SkillManifest {
  skill: string;
  version: string;
  description: string;
  core_version: string;
  adds: string[];
  modifies: string[];
  structured?: {
    npm_dependencies?: Record<string, string>;
    env_additions?: string[];
    docker_compose_services?: Record<string, unknown>;
  };
  file_ops?: FileOperation[];
  conflicts: string[];
  depends: string[];
  test?: string;
  author?: string;
  license?: string;
  min_skills_system_version?: string;
  tested_with?: string[];
  post_apply?: string[];
}

export interface SkillState {
  skills_system_version: string;
  core_version: string;
  applied_skills: AppliedSkill[];
  custom_modifications?: CustomModification[];
  path_remap?: Record<string, string>;
  rebased_at?: string;
}

export interface AppliedSkill {
  name: string;
  version: string;
  applied_at: string;
  file_hashes: Record<string, string>;
  structured_outcomes?: Record<string, unknown>;
  custom_patch?: string;
  custom_patch_description?: string;
}

export interface ApplyResult {
  success: boolean;
  skill: string;
  version: string;
  mergeConflicts?: string[];
  backupPending?: boolean;
  untrackedChanges?: string[];
  error?: string;
}

export interface MergeResult {
  clean: boolean;
  exitCode: number;
}

export interface FileOperation {
  type: 'rename' | 'delete' | 'move';
  from?: string;
  to?: string;
  path?: string;
}

export interface FileOpsResult {
  success: boolean;
  executed: FileOperation[];
  warnings: string[];
  errors: string[];
}

export interface CustomModification {
  description: string;
  applied_at: string;
  files_modified: string[];
  patch_file: string;
}

export interface FileInputHashes {
  base: string; // SHA-256 of .nanoclaw/base/<relPath>
  current: string; // SHA-256 of working tree <relPath> before this merge
  skill: string; // SHA-256 of skill's modify/<relPath>
}

export interface ResolutionMeta {
  skills: string[];
  apply_order: string[];
  core_version: string;
  resolved_at: string;
  tested: boolean;
  test_passed: boolean;
  resolution_source: 'maintainer' | 'user' | 'claude';
  input_hashes: Record<string, string>;
  output_hash: string;
  file_hashes: Record<string, FileInputHashes>;
}

export interface UpdatePreview {
  currentVersion: string;
  newVersion: string;
  filesChanged: string[];
  filesDeleted: string[];
  conflictRisk: string[];
  customPatchesAtRisk: string[];
}

export interface UpdateResult {
  success: boolean;
  previousVersion: string;
  newVersion: string;
  mergeConflicts?: string[];
  backupPending?: boolean;
  customPatchFailures?: string[];
  skillReapplyResults?: Record<string, boolean>;
  error?: string;
}

export interface UninstallResult {
  success: boolean;
  skill: string;
  customPatchWarning?: string;
  replayResults?: Record<string, boolean>;
  error?: string;
}

export interface RebaseResult {
  success: boolean;
  patchFile?: string;
  filesInPatch: number;
  rebased_at?: string;
  mergeConflicts?: string[];
  backupPending?: boolean;
  error?: string;
}
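To make SkillManifest concrete, here is a hypothetical manifest for the add-discord package this PR describes. The field names follow the interface above (trimmed to required fields for self-containment); all values — paths, versions, the dependency — are illustrative, not taken from the actual package:

```typescript
// Trimmed local copy of the required SkillManifest fields, so this
// sketch compiles on its own.
interface SkillManifestLite {
  skill: string;
  version: string;
  core_version: string;
  adds: string[];
  modifies: string[];
  conflicts: string[];
  depends: string[];
}

// Hypothetical manifest for the add-discord skill package.
const manifest: SkillManifestLite = {
  skill: 'add-discord',
  version: '0.1.0',
  core_version: '0.1.0',
  adds: ['src/channels/discord.ts'],
  modifies: ['src/index.ts', 'src/config.ts', 'src/routing.test.ts'],
  conflicts: [],
  depends: [],
};

console.log(manifest.modifies.length); // three-way merge targets
```

The `adds`/`modifies` split is what lets the apply step copy new files directly while routing shared files through `git merge-file`.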
231
skills-engine/uninstall.ts
Normal file
@@ -0,0 +1,231 @@
import { execFileSync, execSync } from 'child_process';
import fs from 'fs';
import path from 'path';

import { clearBackup, createBackup, restoreBackup } from './backup.js';
import { BASE_DIR, NANOCLAW_DIR } from './constants.js';
import { acquireLock } from './lock.js';
import { loadPathRemap, resolvePathRemap } from './path-remap.js';
import { computeFileHash, readState, writeState } from './state.js';
import { findSkillDir, replaySkills } from './replay.js';
import type { UninstallResult } from './types.js';

export async function uninstallSkill(
  skillName: string,
): Promise<UninstallResult> {
  const projectRoot = process.cwd();
  const state = readState();

  // 1. Block after rebase — skills are baked into base
  if (state.rebased_at) {
    return {
      success: false,
      skill: skillName,
      error:
        'Cannot uninstall individual skills after rebase. The base includes all skill modifications. To remove a skill, start from a clean core and re-apply the skills you want.',
    };
  }

  // 2. Verify skill exists
  const skillEntry = state.applied_skills.find((s) => s.name === skillName);
  if (!skillEntry) {
    return {
      success: false,
      skill: skillName,
      error: `Skill "${skillName}" is not applied.`,
    };
  }

  // 3. Check for custom patch — stop and require explicit confirmation
  // before discarding customizations
  if (skillEntry.custom_patch) {
    return {
      success: false,
      skill: skillName,
      customPatchWarning: `Skill "${skillName}" has a custom patch (${skillEntry.custom_patch_description ?? 'no description'}). Uninstalling will lose these customizations. Re-run with confirmation to proceed.`,
    };
  }

  // 4. Acquire lock
  const releaseLock = acquireLock();

  try {
    // 5. Backup all files touched by any applied skill
    const allTouchedFiles = new Set<string>();
    for (const skill of state.applied_skills) {
      for (const filePath of Object.keys(skill.file_hashes)) {
        allTouchedFiles.add(filePath);
      }
    }
    if (state.custom_modifications) {
      for (const mod of state.custom_modifications) {
        for (const f of mod.files_modified) {
          allTouchedFiles.add(f);
        }
      }
    }

    const filesToBackup = [...allTouchedFiles].map((f) =>
      path.join(projectRoot, f),
    );
    createBackup(filesToBackup);

    // 6. Build remaining skill list (original order, minus removed)
    const remainingSkills = state.applied_skills
      .filter((s) => s.name !== skillName)
      .map((s) => s.name);

    // 7. Locate all skill dirs
    const skillDirs: Record<string, string> = {};
    for (const name of remainingSkills) {
      const dir = findSkillDir(name, projectRoot);
      if (!dir) {
        restoreBackup();
        clearBackup();
        return {
          success: false,
          skill: skillName,
          error: `Cannot find skill package for "${name}" in .claude/skills/. All remaining skills must be available for replay.`,
        };
      }
      skillDirs[name] = dir;
    }

    // 8. Reset files exclusive to the removed skill; replaySkills handles the rest
    const baseDir = path.join(projectRoot, BASE_DIR);
    const pathRemap = loadPathRemap();

    const remainingSkillFiles = new Set<string>();
    for (const skill of state.applied_skills) {
      if (skill.name === skillName) continue;
      for (const filePath of Object.keys(skill.file_hashes)) {
        remainingSkillFiles.add(filePath);
      }
    }

    const removedSkillFiles = Object.keys(skillEntry.file_hashes);
    for (const filePath of removedSkillFiles) {
      if (remainingSkillFiles.has(filePath)) continue; // replaySkills handles it
      const resolvedPath = resolvePathRemap(filePath, pathRemap);
      const currentPath = path.join(projectRoot, resolvedPath);
      const basePath = path.join(baseDir, resolvedPath);

      if (fs.existsSync(basePath)) {
        fs.mkdirSync(path.dirname(currentPath), { recursive: true });
        fs.copyFileSync(basePath, currentPath);
      } else if (fs.existsSync(currentPath)) {
        // Add-only file not in base — remove
        fs.unlinkSync(currentPath);
      }
    }

    // 9. Replay remaining skills on clean base
    const replayResult = await replaySkills({
      skills: remainingSkills,
      skillDirs,
      projectRoot,
    });

    // 10. Check replay result before proceeding
    if (!replayResult.success) {
      restoreBackup();
      clearBackup();
      return {
        success: false,
        skill: skillName,
        error: `Replay failed: ${replayResult.error}`,
      };
    }

    // 11. Re-apply standalone custom_modifications
    if (state.custom_modifications) {
      for (const mod of state.custom_modifications) {
        const patchPath = path.join(projectRoot, mod.patch_file);
        if (fs.existsSync(patchPath)) {
          try {
            execFileSync('git', ['apply', '--3way', patchPath], {
              stdio: 'pipe',
              cwd: projectRoot,
            });
          } catch {
            // Custom patch failure is non-fatal but noted
          }
        }
      }
    }

    // 12. Run skill tests
    const replayResults: Record<string, boolean> = {};
    for (const skill of state.applied_skills) {
      if (skill.name === skillName) continue;
      const outcomes = skill.structured_outcomes as
        | Record<string, unknown>
        | undefined;
      if (!outcomes?.test) continue;

      try {
        execSync(outcomes.test as string, {
          stdio: 'pipe',
          cwd: projectRoot,
          timeout: 120_000,
        });
        replayResults[skill.name] = true;
      } catch {
        replayResults[skill.name] = false;
      }
    }

    // Check for test failures
    const testFailures = Object.entries(replayResults).filter(
      ([, passed]) => !passed,
    );
    if (testFailures.length > 0) {
      restoreBackup();
      clearBackup();
      return {
        success: false,
        skill: skillName,
        replayResults,
        error: `Tests failed after uninstall: ${testFailures.map(([n]) => n).join(', ')}`,
      };
    }

    // 13. Update state
    state.applied_skills = state.applied_skills.filter(
      (s) => s.name !== skillName,
    );

    // Update file hashes for remaining skills
    for (const skill of state.applied_skills) {
      const newHashes: Record<string, string> = {};
      for (const filePath of Object.keys(skill.file_hashes)) {
        const absPath = path.join(projectRoot, filePath);
        if (fs.existsSync(absPath)) {
          newHashes[filePath] = computeFileHash(absPath);
        }
      }
      skill.file_hashes = newHashes;
    }

    writeState(state);

    // 14. Cleanup
    clearBackup();

    return {
      success: true,
      skill: skillName,
      replayResults:
        Object.keys(replayResults).length > 0 ? replayResults : undefined,
    };
  } catch (err) {
    restoreBackup();
    clearBackup();
    return {
      success: false,
      skill: skillName,
      error: err instanceof Error ? err.message : String(err),
    };
  } finally {
    releaseLock();
  }
}
368
skills-engine/update.ts
Normal file
@@ -0,0 +1,368 @@
import { execFileSync, execSync } from 'child_process';
import crypto from 'crypto';
import fs from 'fs';
import os from 'os';
import path from 'path';

import { parse as parseYaml } from 'yaml';

import { clearBackup, createBackup, restoreBackup } from './backup.js';
import { BASE_DIR, NANOCLAW_DIR } from './constants.js';
import { copyDir } from './fs-utils.js';
import { isCustomizeActive } from './customize.js';
import { acquireLock } from './lock.js';
import {
  cleanupMergeState,
  isGitRepo,
  mergeFile,
  runRerere,
  setupRerereAdapter,
} from './merge.js';
import { recordPathRemap } from './path-remap.js';
import { computeFileHash, readState, writeState } from './state.js';
import {
  mergeDockerComposeServices,
  mergeEnvAdditions,
  mergeNpmDependencies,
  runNpmInstall,
} from './structured.js';
import type { UpdatePreview, UpdateResult } from './types.js';

function walkDir(dir: string, root?: string): string[] {
  const rootDir = root ?? dir;
  const results: string[] = [];
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const fullPath = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      results.push(...walkDir(fullPath, rootDir));
    } else {
      results.push(path.relative(rootDir, fullPath));
    }
  }
  return results;
}

export function previewUpdate(newCorePath: string): UpdatePreview {
  const projectRoot = process.cwd();
  const state = readState();
  const baseDir = path.join(projectRoot, BASE_DIR);

  // Read new version from package.json in newCorePath
  const newPkgPath = path.join(newCorePath, 'package.json');
  let newVersion = 'unknown';
  if (fs.existsSync(newPkgPath)) {
    const pkg = JSON.parse(fs.readFileSync(newPkgPath, 'utf-8'));
    newVersion = pkg.version ?? 'unknown';
  }

  // Walk all files in newCorePath, compare against base to find changed files
  const newCoreFiles = walkDir(newCorePath);
  const filesChanged: string[] = [];
  const filesDeleted: string[] = [];

  for (const relPath of newCoreFiles) {
    const basePath = path.join(baseDir, relPath);
    const newPath = path.join(newCorePath, relPath);

    if (!fs.existsSync(basePath)) {
      filesChanged.push(relPath);
      continue;
    }

    const baseHash = computeFileHash(basePath);
    const newHash = computeFileHash(newPath);
    if (baseHash !== newHash) {
      filesChanged.push(relPath);
    }
  }

  // Detect files deleted in the new core (exist in base but not in newCorePath)
  if (fs.existsSync(baseDir)) {
    const baseFiles = walkDir(baseDir);
    const newCoreSet = new Set(newCoreFiles);
    for (const relPath of baseFiles) {
      if (!newCoreSet.has(relPath)) {
        filesDeleted.push(relPath);
      }
    }
  }

  // Check which changed files have skill overlaps
  const conflictRisk: string[] = [];
  const customPatchesAtRisk: string[] = [];

  for (const relPath of filesChanged) {
    // Check applied skills
    for (const skill of state.applied_skills) {
      if (skill.file_hashes[relPath]) {
        conflictRisk.push(relPath);
        break;
      }
    }

    // Check custom modifications
    if (state.custom_modifications) {
      for (const mod of state.custom_modifications) {
        if (mod.files_modified.includes(relPath)) {
          customPatchesAtRisk.push(relPath);
          break;
        }
      }
    }
  }

  return {
    currentVersion: state.core_version,
    newVersion,
    filesChanged,
    filesDeleted,
    conflictRisk,
    customPatchesAtRisk,
  };
}

export async function applyUpdate(newCorePath: string): Promise<UpdateResult> {
  const projectRoot = process.cwd();
  const state = readState();
  const baseDir = path.join(projectRoot, BASE_DIR);

  // --- Pre-flight ---
  if (isCustomizeActive()) {
    return {
      success: false,
      previousVersion: state.core_version,
      newVersion: 'unknown',
      error:
        'A customize session is active. Run commitCustomize() or abortCustomize() first.',
    };
  }

  const releaseLock = acquireLock();

  try {
    // --- Preview ---
    const preview = previewUpdate(newCorePath);

    // --- Backup ---
    const filesToBackup = [
      ...preview.filesChanged.map((f) => path.join(projectRoot, f)),
      ...preview.filesDeleted.map((f) => path.join(projectRoot, f)),
    ];
    createBackup(filesToBackup);

    // --- Three-way merge ---
    const mergeConflicts: string[] = [];

    for (const relPath of preview.filesChanged) {
      const currentPath = path.join(projectRoot, relPath);
      const basePath = path.join(baseDir, relPath);
      const newCoreSrcPath = path.join(newCorePath, relPath);

      if (!fs.existsSync(currentPath)) {
        // File doesn't exist yet — just copy from new core
        fs.mkdirSync(path.dirname(currentPath), { recursive: true });
        fs.copyFileSync(newCoreSrcPath, currentPath);
        continue;
      }

      if (!fs.existsSync(basePath)) {
        // No base — use current as base
        fs.mkdirSync(path.dirname(basePath), { recursive: true });
        fs.copyFileSync(currentPath, basePath);
      }

      // Three-way merge: current ← base → newCore
      // Save current content before merge overwrites it (needed for rerere stage 2 = "ours")
      const oursContent = fs.readFileSync(currentPath, 'utf-8');
      const tmpCurrent = path.join(
        os.tmpdir(),
        `nanoclaw-update-${crypto.randomUUID()}-${path.basename(relPath)}`,
      );
      fs.copyFileSync(currentPath, tmpCurrent);

      const result = mergeFile(tmpCurrent, basePath, newCoreSrcPath);

      if (result.clean) {
        fs.copyFileSync(tmpCurrent, currentPath);
        fs.unlinkSync(tmpCurrent);
      } else {
        // Copy conflict markers to working tree path before rerere
        fs.copyFileSync(tmpCurrent, currentPath);
        fs.unlinkSync(tmpCurrent);

        if (isGitRepo()) {
          const baseContent = fs.readFileSync(basePath, 'utf-8');
          const theirsContent = fs.readFileSync(newCoreSrcPath, 'utf-8');

          setupRerereAdapter(relPath, baseContent, oursContent, theirsContent);
          const autoResolved = runRerere(currentPath);

          if (autoResolved) {
            execFileSync('git', ['add', relPath], { stdio: 'pipe' });
            execSync('git rerere', { stdio: 'pipe' });
            cleanupMergeState(relPath);
            continue;
          }

          cleanupMergeState(relPath);
        }

        mergeConflicts.push(relPath);
      }
    }

    if (mergeConflicts.length > 0) {
      // Preserve backup so user can resolve conflicts manually, then continue
      // Call clearBackup() after resolution or restoreBackup() + clearBackup() to abort
      return {
        success: false,
        previousVersion: preview.currentVersion,
        newVersion: preview.newVersion,
|
||||
mergeConflicts,
|
||||
backupPending: true,
|
||||
error: `Unresolved merge conflicts in: ${mergeConflicts.join(', ')}. Resolve manually then call clearBackup(), or restoreBackup() + clearBackup() to abort.`,
|
||||
};
|
||||
}
|
||||
|
||||
// --- Remove deleted files ---
|
||||
for (const relPath of preview.filesDeleted) {
|
||||
const currentPath = path.join(projectRoot, relPath);
|
||||
if (fs.existsSync(currentPath)) {
|
||||
fs.unlinkSync(currentPath);
|
||||
}
|
||||
}
|
||||
|
||||
// --- Re-apply custom patches ---
|
||||
const customPatchFailures: string[] = [];
|
||||
if (state.custom_modifications) {
|
||||
for (const mod of state.custom_modifications) {
|
||||
const patchPath = path.join(projectRoot, mod.patch_file);
|
||||
if (!fs.existsSync(patchPath)) {
|
||||
customPatchFailures.push(
|
||||
`${mod.description}: patch file missing (${mod.patch_file})`,
|
||||
);
|
||||
continue;
|
||||
}
|
||||
try {
|
||||
execFileSync('git', ['apply', '--3way', patchPath], {
|
||||
stdio: 'pipe',
|
||||
cwd: projectRoot,
|
||||
});
|
||||
} catch {
|
||||
customPatchFailures.push(mod.description);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// --- Record path remaps from update metadata ---
|
||||
const remapFile = path.join(newCorePath, '.nanoclaw-meta', 'path_remap.yaml');
|
||||
if (fs.existsSync(remapFile)) {
|
||||
const remap = parseYaml(fs.readFileSync(remapFile, 'utf-8')) as Record<string, string>;
|
||||
if (remap && typeof remap === 'object') {
|
||||
recordPathRemap(remap);
|
||||
}
|
||||
}
|
||||
|
||||
// --- Update base ---
|
||||
if (fs.existsSync(baseDir)) {
|
||||
fs.rmSync(baseDir, { recursive: true, force: true });
|
||||
}
|
||||
fs.mkdirSync(baseDir, { recursive: true });
|
||||
copyDir(newCorePath, baseDir);
|
||||
|
||||
// --- Structured ops: re-apply from all skills ---
|
||||
const allNpmDeps: Record<string, string> = {};
|
||||
const allEnvAdditions: string[] = [];
|
||||
const allDockerServices: Record<string, unknown> = {};
|
||||
let hasNpmDeps = false;
|
||||
|
||||
for (const skill of state.applied_skills) {
|
||||
const outcomes = skill.structured_outcomes as Record<string, unknown> | undefined;
|
||||
if (!outcomes) continue;
|
||||
|
||||
if (outcomes.npm_dependencies) {
|
||||
Object.assign(allNpmDeps, outcomes.npm_dependencies as Record<string, string>);
|
||||
hasNpmDeps = true;
|
||||
}
|
||||
if (outcomes.env_additions) {
|
||||
allEnvAdditions.push(...(outcomes.env_additions as string[]));
|
||||
}
|
||||
if (outcomes.docker_compose_services) {
|
||||
Object.assign(
|
||||
allDockerServices,
|
||||
outcomes.docker_compose_services as Record<string, unknown>,
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
if (hasNpmDeps) {
|
||||
const pkgPath = path.join(projectRoot, 'package.json');
|
||||
mergeNpmDependencies(pkgPath, allNpmDeps);
|
||||
}
|
||||
|
||||
if (allEnvAdditions.length > 0) {
|
||||
const envPath = path.join(projectRoot, '.env.example');
|
||||
mergeEnvAdditions(envPath, allEnvAdditions);
|
||||
}
|
||||
|
||||
if (Object.keys(allDockerServices).length > 0) {
|
||||
const composePath = path.join(projectRoot, 'docker-compose.yml');
|
||||
mergeDockerComposeServices(composePath, allDockerServices);
|
||||
}
|
||||
|
||||
if (hasNpmDeps) {
|
||||
runNpmInstall();
|
||||
}
|
||||
|
||||
// --- Run tests for each applied skill ---
|
||||
const skillReapplyResults: Record<string, boolean> = {};
|
||||
|
||||
for (const skill of state.applied_skills) {
|
||||
const outcomes = skill.structured_outcomes as Record<string, unknown> | undefined;
|
||||
if (!outcomes?.test) continue;
|
||||
|
||||
const testCmd = outcomes.test as string;
|
||||
try {
|
||||
execSync(testCmd, {
|
||||
stdio: 'pipe',
|
||||
cwd: projectRoot,
|
||||
timeout: 120_000,
|
||||
});
|
||||
skillReapplyResults[skill.name] = true;
|
||||
} catch {
|
||||
skillReapplyResults[skill.name] = false;
|
||||
}
|
||||
}
|
||||
|
||||
// --- Update state ---
|
||||
state.core_version = preview.newVersion;
|
||||
writeState(state);
|
||||
|
||||
// --- Cleanup ---
|
||||
clearBackup();
|
||||
|
||||
return {
|
||||
success: true,
|
||||
previousVersion: preview.currentVersion,
|
||||
newVersion: preview.newVersion,
|
||||
customPatchFailures:
|
||||
customPatchFailures.length > 0 ? customPatchFailures : undefined,
|
||||
skillReapplyResults:
|
||||
Object.keys(skillReapplyResults).length > 0
|
||||
? skillReapplyResults
|
||||
: undefined,
|
||||
};
|
||||
} catch (err) {
|
||||
restoreBackup();
|
||||
clearBackup();
|
||||
return {
|
||||
success: false,
|
||||
previousVersion: state.core_version,
|
||||
newVersion: 'unknown',
|
||||
error: err instanceof Error ? err.message : String(err),
|
||||
};
|
||||
} finally {
|
||||
releaseLock();
|
||||
}
|
||||
}
|
||||
|
||||
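The merge loop above delegates to `git merge-file` via a `mergeFile()` helper whose body is not part of this diff. A minimal sketch consistent with the call site, assuming the helper simply shells out: `git merge-file ours base theirs` rewrites the "ours" file in place and exits non-zero when it leaves conflict markers behind.

```typescript
import { execFileSync } from 'node:child_process';
import * as fs from 'node:fs';
import * as os from 'node:os';
import * as path from 'node:path';

// Hypothetical minimal version of the engine's mergeFile() helper.
// git merge-file edits oursPath in place; its exit status is the number
// of conflicts, so execFileSync throws when the merge is not clean.
function mergeFile(oursPath: string, basePath: string, theirsPath: string): { clean: boolean } {
  try {
    execFileSync('git', ['merge-file', oursPath, basePath, theirsPath], { stdio: 'pipe' });
    return { clean: true };
  } catch {
    return { clean: false }; // conflict markers are now in oursPath
  }
}

// Example: non-overlapping edits on both sides merge cleanly.
const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'merge-demo-'));
const write = (name: string, text: string): string => {
  const p = path.join(dir, name);
  fs.writeFileSync(p, text);
  return p;
};
const base = write('base', 'a\nb\nc\n');
const ours = write('ours', 'A\nb\nc\n');     // local edit to line 1
const theirs = write('theirs', 'a\nb\nC\n'); // new-core edit to line 3

const result = mergeFile(ours, base, theirs);
console.log(result.clean, JSON.stringify(fs.readFileSync(ours, 'utf-8')));
```

When the two sides touch the same lines, `result.clean` is false and the caller falls through to the rerere path shown above.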
@@ -320,6 +320,9 @@ describe('WhatsAppChannel', () => {
     expect(opts.onChatMetadata).toHaveBeenCalledWith(
       'registered@g.us',
       expect.any(String),
+      undefined,
+      'whatsapp',
+      true,
     );
     expect(opts.onMessage).toHaveBeenCalledWith(
       'registered@g.us',
@@ -355,6 +358,9 @@ describe('WhatsAppChannel', () => {
     expect(opts.onChatMetadata).toHaveBeenCalledWith(
       'unregistered@g.us',
       expect.any(String),
+      undefined,
+      'whatsapp',
+      true,
     );
     expect(opts.onMessage).not.toHaveBeenCalled();
   });
@@ -579,6 +585,9 @@ describe('WhatsAppChannel', () => {
     expect(opts.onChatMetadata).toHaveBeenCalledWith(
       '1234567890@s.whatsapp.net',
       expect.any(String),
+      undefined,
+      'whatsapp',
+      false,
     );
   });
@@ -605,6 +614,9 @@ describe('WhatsAppChannel', () => {
     expect(opts.onChatMetadata).toHaveBeenCalledWith(
       'registered@g.us',
       expect.any(String),
+      undefined,
+      'whatsapp',
+      true,
     );
   });
@@ -631,6 +643,9 @@ describe('WhatsAppChannel', () => {
     expect(opts.onChatMetadata).toHaveBeenCalledWith(
       '0000000000@lid',
       expect.any(String),
+      undefined,
+      'whatsapp',
+      false,
     );
   });
 });
@@ -158,7 +158,8 @@ export class WhatsAppChannel implements Channel {
     ).toISOString();

     // Always notify about chat metadata for group discovery
-    this.opts.onChatMetadata(chatJid, timestamp);
+    const isGroup = chatJid.endsWith('@g.us');
+    this.opts.onChatMetadata(chatJid, timestamp, undefined, 'whatsapp', isGroup);

     // Only deliver full message for registered groups
     const groups = this.opts.registeredGroups();
@@ -5,7 +5,10 @@ import { readEnvFile } from './env.js';
 // Read config values from .env (falls back to process.env).
 // Secrets are NOT read here — they stay on disk and are loaded only
 // where needed (container-runner.ts) to avoid leaking to child processes.
-const envConfig = readEnvFile(['ASSISTANT_NAME', 'ASSISTANT_HAS_OWN_NUMBER']);
+const envConfig = readEnvFile([
+  'ASSISTANT_NAME',
+  'ASSISTANT_HAS_OWN_NUMBER',
+]);

 export const ASSISTANT_NAME =
   process.env.ASSISTANT_NAME || envConfig.ASSISTANT_NAME || 'Andy';
src/db.ts
@@ -12,7 +12,9 @@ function createSchema(database: Database.Database): void {
     CREATE TABLE IF NOT EXISTS chats (
       jid TEXT PRIMARY KEY,
       name TEXT,
-      last_message_time TEXT
+      last_message_time TEXT,
+      channel TEXT,
+      is_group INTEGER DEFAULT 0
     );
     CREATE TABLE IF NOT EXISTS messages (
       id TEXT,
@@ -96,6 +98,23 @@ function createSchema(database: Database.Database): void {
   } catch {
     /* column already exists */
   }

+  // Add channel and is_group columns if they don't exist (migration for existing DBs)
+  try {
+    database.exec(
+      `ALTER TABLE chats ADD COLUMN channel TEXT`,
+    );
+    database.exec(
+      `ALTER TABLE chats ADD COLUMN is_group INTEGER DEFAULT 0`,
+    );
+    // Backfill from JID patterns
+    database.exec(`UPDATE chats SET channel = 'whatsapp', is_group = 1 WHERE jid LIKE '%@g.us'`);
+    database.exec(`UPDATE chats SET channel = 'whatsapp', is_group = 0 WHERE jid LIKE '%@s.whatsapp.net'`);
+    database.exec(`UPDATE chats SET channel = 'discord', is_group = 1 WHERE jid LIKE 'dc:%'`);
+    database.exec(`UPDATE chats SET channel = 'telegram', is_group = 1 WHERE jid LIKE 'tg:%'`);
+  } catch {
+    /* columns already exist */
+  }
 }

 export function initDatabase(): void {
@@ -123,26 +142,35 @@ export function storeChatMetadata(
   chatJid: string,
   timestamp: string,
   name?: string,
+  channel?: string,
+  isGroup?: boolean,
 ): void {
+  const ch = channel ?? null;
+  const group = isGroup === undefined ? null : isGroup ? 1 : 0;
+
   if (name) {
     // Update with name, preserving existing timestamp if newer
     db.prepare(
       `
-      INSERT INTO chats (jid, name, last_message_time) VALUES (?, ?, ?)
+      INSERT INTO chats (jid, name, last_message_time, channel, is_group) VALUES (?, ?, ?, ?, ?)
       ON CONFLICT(jid) DO UPDATE SET
         name = excluded.name,
-        last_message_time = MAX(last_message_time, excluded.last_message_time)
+        last_message_time = MAX(last_message_time, excluded.last_message_time),
+        channel = COALESCE(excluded.channel, channel),
+        is_group = COALESCE(excluded.is_group, is_group)
       `,
-    ).run(chatJid, name, timestamp);
+    ).run(chatJid, name, timestamp, ch, group);
   } else {
     // Update timestamp only, preserve existing name if any
     db.prepare(
       `
-      INSERT INTO chats (jid, name, last_message_time) VALUES (?, ?, ?)
+      INSERT INTO chats (jid, name, last_message_time, channel, is_group) VALUES (?, ?, ?, ?, ?)
       ON CONFLICT(jid) DO UPDATE SET
-        last_message_time = MAX(last_message_time, excluded.last_message_time)
+        last_message_time = MAX(last_message_time, excluded.last_message_time),
+        channel = COALESCE(excluded.channel, channel),
+        is_group = COALESCE(excluded.is_group, is_group)
      `,
-    ).run(chatJid, chatJid, timestamp);
+    ).run(chatJid, chatJid, timestamp, ch, group);
   }
 }
@@ -164,6 +192,8 @@ export interface ChatInfo {
   jid: string;
   name: string;
   last_message_time: string;
+  channel: string;
+  is_group: number;
 }

 /**
@@ -173,7 +203,7 @@ export function getAllChats(): ChatInfo[] {
   return db
     .prepare(
       `
-      SELECT jid, name, last_message_time
+      SELECT jid, name, last_message_time, channel, is_group
       FROM chats
       ORDER BY last_message_time DESC
      `,
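The `COALESCE(excluded.channel, channel)` clauses in the upsert above are what stop a later metadata-only write (which passes NULL for channel/is_group) from clobbering values a channel already reported. A standalone sketch of that rule's semantics in plain TypeScript, with no SQLite dependency; the `ChatRow`/`upsert` names here are illustrative, not from the codebase.

```typescript
// Mimics: channel = COALESCE(excluded.channel, channel),
//         is_group = COALESCE(excluded.is_group, is_group)
type ChatRow = { jid: string; channel: string | null; is_group: number | null };

// COALESCE picks the first non-null argument: incoming wins only when known.
const coalesce = <T>(incoming: T | null, existing: T | null): T | null =>
  incoming ?? existing;

function upsert(table: Map<string, ChatRow>, row: ChatRow): void {
  const prev = table.get(row.jid);
  if (!prev) {
    table.set(row.jid, row);
    return;
  }
  table.set(row.jid, {
    jid: row.jid,
    channel: coalesce(row.channel, prev.channel),
    is_group: coalesce(row.is_group, prev.is_group),
  });
}

const chats = new Map<string, ChatRow>();
upsert(chats, { jid: 'g@g.us', channel: 'whatsapp', is_group: 1 });
// A later timestamp-only write with unknown channel/is_group...
upsert(chats, { jid: 'g@g.us', channel: null, is_group: null });
// ...leaves the original values intact: channel 'whatsapp', is_group 1.
console.log(chats.get('g@g.us'));
```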
src/index.ts
@@ -34,9 +34,9 @@ import {
 } from './db.js';
 import { GroupQueue } from './group-queue.js';
 import { startIpcWatcher } from './ipc.js';
-import { formatMessages, formatOutbound } from './router.js';
+import { findChannel, formatMessages, formatOutbound } from './router.js';
 import { startSchedulerLoop } from './task-scheduler.js';
-import { NewMessage, RegisteredGroup } from './types.js';
+import { Channel, NewMessage, RegisteredGroup } from './types.js';
 import { logger } from './logger.js';

 // Re-export for backwards compatibility during refactor
@@ -49,6 +49,7 @@ let lastAgentTimestamp: Record<string, string> = {};
 let messageLoopRunning = false;

 let whatsapp: WhatsAppChannel;
+const channels: Channel[] = [];
 const queue = new GroupQueue();

 function loadState(): void {
@@ -99,7 +100,7 @@ export function getAvailableGroups(): import('./container-runner.js').AvailableG
   const registeredJids = new Set(Object.keys(registeredGroups));

   return chats
-    .filter((c) => c.jid !== '__group_sync__' && c.jid.endsWith('@g.us'))
+    .filter((c) => c.jid !== '__group_sync__' && c.is_group)
     .map((c) => ({
       jid: c.jid,
       name: c.name,
@@ -121,6 +122,12 @@ async function processGroupMessages(chatJid: string): Promise<boolean> {
   const group = registeredGroups[chatJid];
   if (!group) return true;

+  const channel = findChannel(channels, chatJid);
+  if (!channel) {
+    console.log(`Warning: no channel owns JID ${chatJid}, skipping messages`);
+    return true;
+  }
+
   const isMainGroup = group.folder === MAIN_GROUP_FOLDER;

   const sinceTimestamp = lastAgentTimestamp[chatJid] || '';
@@ -161,7 +168,7 @@ async function processGroupMessages(chatJid: string): Promise<boolean> {
     }, IDLE_TIMEOUT);
   };

-  await whatsapp.setTyping(chatJid, true);
+  await channel.setTyping?.(chatJid, true);
   let hadError = false;
   let outputSentToUser = false;

@@ -173,7 +180,7 @@ async function processGroupMessages(chatJid: string): Promise<boolean> {
     const text = raw.replace(/<internal>[\s\S]*?<\/internal>/g, '').trim();
     logger.info({ group: group.name }, `Agent output: ${raw.slice(0, 200)}`);
     if (text) {
-      await whatsapp.sendMessage(chatJid, text);
+      await channel.sendMessage(chatJid, text);
       outputSentToUser = true;
     }
     // Only reset idle timer on actual results, not session-update markers (result: null)
@@ -185,7 +192,7 @@ async function processGroupMessages(chatJid: string): Promise<boolean> {
     }
   });

-  await whatsapp.setTyping(chatJid, false);
+  await channel.setTyping?.(chatJid, false);
   if (idleTimer) clearTimeout(idleTimer);

   if (output === 'error' || hadError) {
@@ -320,6 +327,12 @@ async function startMessageLoop(): Promise<void> {
       const group = registeredGroups[chatJid];
       if (!group) continue;

+      const channel = findChannel(channels, chatJid);
+      if (!channel) {
+        console.log(`Warning: no channel owns JID ${chatJid}, skipping messages`);
+        continue;
+      }
+
       const isMainGroup = group.folder === MAIN_GROUP_FOLDER;
       const needsTrigger = !isMainGroup && group.requiresTrigger !== false;

@@ -353,7 +366,7 @@ async function startMessageLoop(): Promise<void> {
           messagesToSend[messagesToSend.length - 1].timestamp;
         saveState();
         // Show typing indicator while the container processes the piped message
-        whatsapp.setTyping(chatJid, true);
+        channel.setTyping?.(chatJid, true);
       } else {
         // No active container — enqueue for a new one
         queue.enqueueMessageCheck(chatJid);
@@ -457,20 +470,23 @@ async function main(): Promise<void> {
   const shutdown = async (signal: string) => {
     logger.info({ signal }, 'Shutdown signal received');
     await queue.shutdown(10000);
-    await whatsapp.disconnect();
+    for (const ch of channels) await ch.disconnect();
     process.exit(0);
   };
   process.on('SIGTERM', () => shutdown('SIGTERM'));
   process.on('SIGINT', () => shutdown('SIGINT'));

-  // Create WhatsApp channel
-  whatsapp = new WhatsAppChannel({
-    onMessage: (chatJid, msg) => storeMessage(msg),
-    onChatMetadata: (chatJid, timestamp) => storeChatMetadata(chatJid, timestamp),
+  // Channel callbacks (shared by all channels)
+  const channelOpts = {
+    onMessage: (_chatJid: string, msg: NewMessage) => storeMessage(msg),
+    onChatMetadata: (chatJid: string, timestamp: string, name?: string, channel?: string, isGroup?: boolean) =>
+      storeChatMetadata(chatJid, timestamp, name, channel, isGroup),
     registeredGroups: () => registeredGroups,
-  });
+  };

-  // Connect — resolves when first connected
+  // Create and connect channels
+  whatsapp = new WhatsAppChannel(channelOpts);
+  channels.push(whatsapp);
   await whatsapp.connect();

   // Start subsystems (independently of connection handler)
@@ -480,15 +496,24 @@ async function main(): Promise<void> {
     queue,
     onProcess: (groupJid, proc, containerName, groupFolder) => queue.registerProcess(groupJid, proc, containerName, groupFolder),
     sendMessage: async (jid, rawText) => {
+      const channel = findChannel(channels, jid);
+      if (!channel) {
+        console.log(`Warning: no channel owns JID ${jid}, cannot send message`);
+        return;
+      }
       const text = formatOutbound(rawText);
-      if (text) await whatsapp.sendMessage(jid, text);
+      if (text) await channel.sendMessage(jid, text);
     },
   });
   startIpcWatcher({
-    sendMessage: (jid, text) => whatsapp.sendMessage(jid, text),
+    sendMessage: (jid, text) => {
+      const channel = findChannel(channels, jid);
+      if (!channel) throw new Error(`No channel for JID: ${jid}`);
+      return channel.sendMessage(jid, text);
+    },
     registeredGroups: () => registeredGroups,
     registerGroup,
-    syncGroupMetadata: (force) => whatsapp.syncGroupMetadata(force),
+    syncGroupMetadata: (force) => whatsapp?.syncGroupMetadata(force) ?? Promise.resolve(),
     getAvailableGroups,
     writeGroupsSnapshot: (gf, im, ag, rj) => writeGroupsSnapshot(gf, im, ag, rj),
   });
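`findChannel()` is imported from router.ts but its body is not part of this diff. A plausible sketch, assuming each `Channel` exposes an `ownsJid()` predicate and the router returns the first owner; the `ownsJid` name and the JID patterns are illustrative, based on the ownership patterns exercised in routing.test.ts.

```typescript
// Hypothetical shape of the router's channel lookup.
interface Channel {
  name: string;
  ownsJid(jid: string): boolean;
  sendMessage(jid: string, text: string): Promise<void>;
}

function findChannel(channels: Channel[], jid: string): Channel | undefined {
  // First channel that claims the JID wins; undefined means "unroutable",
  // which callers log as a warning instead of silently dropping.
  return channels.find((ch) => ch.ownsJid(jid));
}

// WhatsApp owns @g.us groups, @s.whatsapp.net DMs, and @lid JIDs;
// Discord uses a dc: prefix (Telegram would use tg: analogously).
const whatsapp: Channel = {
  name: 'whatsapp',
  ownsJid: (jid) =>
    jid.endsWith('@g.us') || jid.endsWith('@s.whatsapp.net') || jid.endsWith('@lid'),
  sendMessage: async () => {},
};
const discord: Channel = {
  name: 'discord',
  ownsJid: (jid) => jid.startsWith('dc:'),
  sendMessage: async () => {},
};

console.log(findChannel([whatsapp, discord], 'room@g.us')?.name); // 'whatsapp'
console.log(findChannel([whatsapp, discord], 'dc:12345')?.name);  // 'discord'
console.log(findChannel([whatsapp, discord], 'unknown:1'));       // undefined
```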
@@ -22,30 +22,26 @@ describe('JID ownership patterns', () => {
     const jid = '12345678@s.whatsapp.net';
     expect(jid.endsWith('@s.whatsapp.net')).toBe(true);
   });

   it('unknown JID format: does not match WhatsApp patterns', () => {
     const jid = 'unknown:12345';
     expect(jid.endsWith('@g.us')).toBe(false);
     expect(jid.endsWith('@s.whatsapp.net')).toBe(false);
   });
 });

 // --- getAvailableGroups ---

 describe('getAvailableGroups', () => {
-  it('returns only @g.us JIDs', () => {
-    storeChatMetadata('group1@g.us', '2024-01-01T00:00:01.000Z', 'Group 1');
-    storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM');
-    storeChatMetadata('group2@g.us', '2024-01-01T00:00:03.000Z', 'Group 2');
+  it('returns only groups, excludes DMs', () => {
+    storeChatMetadata('group1@g.us', '2024-01-01T00:00:01.000Z', 'Group 1', 'whatsapp', true);
+    storeChatMetadata('user@s.whatsapp.net', '2024-01-01T00:00:02.000Z', 'User DM', 'whatsapp', false);
+    storeChatMetadata('group2@g.us', '2024-01-01T00:00:03.000Z', 'Group 2', 'whatsapp', true);

     const groups = getAvailableGroups();
     expect(groups).toHaveLength(2);
-    expect(groups.every((g) => g.jid.endsWith('@g.us'))).toBe(true);
+    expect(groups.map((g) => g.jid)).toContain('group1@g.us');
+    expect(groups.map((g) => g.jid)).toContain('group2@g.us');
+    expect(groups.map((g) => g.jid)).not.toContain('user@s.whatsapp.net');
   });

   it('excludes __group_sync__ sentinel', () => {
     storeChatMetadata('__group_sync__', '2024-01-01T00:00:00.000Z');
-    storeChatMetadata('group@g.us', '2024-01-01T00:00:01.000Z', 'Group');
+    storeChatMetadata('group@g.us', '2024-01-01T00:00:01.000Z', 'Group', 'whatsapp', true);

     const groups = getAvailableGroups();
     expect(groups).toHaveLength(1);
@@ -53,8 +49,8 @@ describe('getAvailableGroups', () => {
   });

   it('marks registered groups correctly', () => {
-    storeChatMetadata('reg@g.us', '2024-01-01T00:00:01.000Z', 'Registered');
-    storeChatMetadata('unreg@g.us', '2024-01-01T00:00:02.000Z', 'Unregistered');
+    storeChatMetadata('reg@g.us', '2024-01-01T00:00:01.000Z', 'Registered', 'whatsapp', true);
+    storeChatMetadata('unreg@g.us', '2024-01-01T00:00:02.000Z', 'Unregistered', 'whatsapp', true);

     _setRegisteredGroups({
       'reg@g.us': {
@@ -74,9 +70,9 @@ describe('getAvailableGroups', () => {
   });

   it('returns groups ordered by most recent activity', () => {
-    storeChatMetadata('old@g.us', '2024-01-01T00:00:01.000Z', 'Old');
-    storeChatMetadata('new@g.us', '2024-01-01T00:00:05.000Z', 'New');
-    storeChatMetadata('mid@g.us', '2024-01-01T00:00:03.000Z', 'Mid');
+    storeChatMetadata('old@g.us', '2024-01-01T00:00:01.000Z', 'Old', 'whatsapp', true);
+    storeChatMetadata('new@g.us', '2024-01-01T00:00:05.000Z', 'New', 'whatsapp', true);
+    storeChatMetadata('mid@g.us', '2024-01-01T00:00:03.000Z', 'Mid', 'whatsapp', true);

     const groups = getAvailableGroups();
     expect(groups[0].jid).toBe('new@g.us');
@@ -84,6 +80,19 @@ describe('getAvailableGroups', () => {
     expect(groups[2].jid).toBe('old@g.us');
   });

+  it('excludes non-group chats regardless of JID format', () => {
+    // Unknown JID format stored without is_group should not appear
+    storeChatMetadata('unknown-format-123', '2024-01-01T00:00:01.000Z', 'Unknown');
+    // Explicitly non-group with unusual JID
+    storeChatMetadata('custom:abc', '2024-01-01T00:00:02.000Z', 'Custom DM', 'custom', false);
+    // A real group for contrast
+    storeChatMetadata('group@g.us', '2024-01-01T00:00:03.000Z', 'Group', 'whatsapp', true);
+
+    const groups = getAvailableGroups();
+    expect(groups).toHaveLength(1);
+    expect(groups[0].jid).toBe('group@g.us');
+  });
+
   it('returns empty array when no chats exist', () => {
     const groups = getAvailableGroups();
     expect(groups).toHaveLength(0);

@@ -95,4 +95,10 @@ export type OnInboundMessage = (chatJid: string, message: NewMessage) => void;
 // Callback for chat metadata discovery.
 // name is optional — channels that deliver names inline (Telegram) pass it here;
 // channels that sync names separately (WhatsApp syncGroupMetadata) omit it.
-export type OnChatMetadata = (chatJid: string, timestamp: string, name?: string) => void;
+export type OnChatMetadata = (
+  chatJid: string,
+  timestamp: string,
+  name?: string,
+  channel?: string,
+  isGroup?: boolean,
+) => void;

@@ -2,6 +2,6 @@ import { defineConfig } from 'vitest/config';

 export default defineConfig({
   test: {
-    include: ['src/**/*.test.ts'],
+    include: ['src/**/*.test.ts', 'skills-engine/**/*.test.ts'],
   },
 });
vitest.skills.config.ts (new file)
@@ -0,0 +1,7 @@
+import { defineConfig } from 'vitest/config';
+
+export default defineConfig({
+  test: {
+    include: ['.claude/skills/**/tests/*.test.ts'],
+  },
+});