Compare commits: c3f8b598af ... ccf76fea95 (10 commits)

| Author | SHA1 | Date |
|--------|------|------|
| | ccf76fea95 | |
| | 85346ce805 | |
| | a34f486d7c | |
| | ea1aff7200 | |
| | afd8b444c7 | |
| | 6aaba810d8 | |
| | 5f76abec8b | |
| | 0905b0302b | |
| | dfe2a5acae | |
| | f5907892bf | |
### .gitignore (vendored, 33 lines changed)
@@ -1,5 +1,38 @@

```gitignore
# Dependencies
node_modules/

# Build output
dist/

# Environment
.env
.env.local
.env.*.local

# Logs
*.log
npm-debug.log*
pnpm-debug.log*

# OS
.DS_Store
Thumbs.db

# IDE
.vscode/
.idea/
*.swp
*.swo

# Docker
docker-compose.override.yml

# Claude Code
.claude/

# Prisma
prisma/*.db
prisma/*.db-journal

# Test coverage
coverage/
```
### CLAUDE.md (new file, 67 lines)
@@ -0,0 +1,67 @@

# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

Agent Fox is a SaaS product that provides MCP (Model Context Protocol) services for API documentation. It lets LLMs efficiently query OpenAPI docs through multi-level retrieval instead of dumping entire documents into context.

## Commands

```bash
# Development (requires Docker for PostgreSQL)
docker compose -f docker-compose.yml -f docker-compose.dev.yml up --build

# Or run services individually (requires local PostgreSQL on port 5432)
pnpm dev:server   # Express API on :3000
pnpm dev:mcp      # MCP service on :3001
pnpm dev:web      # Vite dev server on :5173

# Database
pnpm db:generate  # Generate Prisma client after schema changes
pnpm db:migrate   # Create and apply migrations
pnpm db:push      # Push schema directly (dev only)

# Build all packages
pnpm build
```

## Architecture

pnpm monorepo with 4 packages sharing TypeScript config (`tsconfig.base.json`):

- **`packages/shared`** — Prisma client + shared types. All other packages depend on this. Must be built first (`tsc`) before server/mcp can run.
- **`packages/server`** (port 3000) — Express 5 backend API. JWT auth, project CRUD, OpenAPI import/parsing, module/endpoint management.
- **`packages/mcp`** (port 3001) — Independent Express process exposing MCP tools via Streamable HTTP transport. Authenticates with project-level API keys (not user JWTs).
- **`packages/web`** (port 5173) — React 19 + Vite + TailwindCSS SPA. Proxies `/api` to server in dev mode.

### Data Flow

1. User imports OpenAPI doc (JSON/YAML/URL) via web UI
2. Server validates with `@apidevtools/swagger-parser`, dereferences all `$ref`s
3. Parses into Module (from tags or path prefixes) and Endpoint records in PostgreSQL
4. User gets a project ID + API key
5. LLM connects to MCP service at `/mcp/:projectId` with API key
6. MCP provides 5 tools for progressive drill-down (overview → modules → endpoints → detail → search)

### MCP Tools (in `packages/mcp/src/tools/`)

The 5 tools are designed for minimal token usage per call (~200-2000 tokens each vs 10,000+ for a full doc dump):

- `get_project_overview` — project name, version, module summary
- `list_modules` — modules with descriptions
- `list_endpoints(moduleId)` — endpoint summaries in a module
- `get_endpoint_detail(endpointId)` — full params, request body, responses
- `search_endpoints(keyword, moduleId?)` — cross-endpoint keyword search

### Key Patterns

- **API responses**: All endpoints return `{ success: boolean, data?: T, error?: { code, message } }`
- **Auth**: User auth uses JWT dual-token (15min access + 7d refresh). MCP auth uses project API keys (`afk_` prefix, bcrypt hashed).
- **MCP SDK imports**: Use `@modelcontextprotocol/sdk/server/mcp.js` (not `@modelcontextprotocol/server`). Tool registration uses `server.tool(name, description, zodShape, handler)`.
- **Swagger 2.0 + OpenAPI 3.x**: Parser handles both. For Swagger 2, body params are converted to requestBody format.
- **Docker dev mode**: Server/MCP use `deps` build stage + volume mounts for hot reload. Web uses Vite `build` stage. Shared must be built inside container before server/mcp start.

### Database (Prisma schema at `prisma/schema.prisma`)

Core models: User → Project → Module → Endpoint. Project stores full dereferenced OpenAPI spec as JSONB. Module tracks its source (tag/path_prefix/manual). Endpoint stores parameters, requestBody, responses as JSONB.
### README.md (new file, 108 lines)
@@ -0,0 +1,108 @@

# Agent Fox

API Documentation MCP Service: lets LLMs retrieve API documentation efficiently instead of ingesting the entire document at once.

## What It Is

Agent Fox is a developer-facing SaaS product. After you import an OpenAPI / Swagger document, it generates an MCP service endpoint through which LLMs such as Claude and GPT fetch endpoint information on demand via multi-level retrieval, minimizing token consumption.

**A typical retrieval costs ~1,300 tokens, versus 10,000+ tokens for the full document.**

## Core Features

- **Import OpenAPI documents**: supports OpenAPI 3.x and Swagger 2.0, via URL or file upload
- **Automatic grouping**: endpoints are grouped into modules by tags or URL path prefixes, with manual adjustment supported
- **MCP multi-level retrieval**: 5 tools drill down step by step: overview → modules → endpoint list → endpoint detail → search
- **Project management**: multiple projects, per-project API keys, one-click configuration copy

## Tech Stack

| Layer | Technology |
|-------|------------|
| Frontend | React 19, Vite, TailwindCSS |
| Backend | Express 5, TypeScript, Zod |
| MCP | @modelcontextprotocol/sdk, Streamable HTTP |
| Database | PostgreSQL 16, Prisma ORM |
| Deployment | Docker Compose |

## Quick Start

### Prerequisites

- Node.js >= 20
- pnpm
- Docker & Docker Compose

### Launch

```bash
# Clone the project
git clone <repo-url> agent-fox
cd agent-fox

# Copy environment variables
cp .env.example .env

# One-command start (development mode)
docker compose -f docker-compose.yml -f docker-compose.dev.yml up --build

# First run: apply database migrations
DATABASE_URL=postgresql://agentfox:agentfox@localhost:5432/agentfox \
  npx prisma migrate deploy --schema=prisma/schema.prisma
```

Open `http://localhost:5173` to use the frontend.

### Production Mode

```bash
docker compose up --build
```

In production mode the server container runs database migrations automatically, and the frontend is served by Nginx on port 80.

## MCP Integration

After importing an API document into Agent Fox, add the generated configuration to your MCP client:

```json
{
  "mcpServers": {
    "my-api": {
      "type": "http",
      "url": "http://localhost:3001/mcp/<project-id>",
      "headers": {
        "Authorization": "Bearer <api-key>"
      }
    }
  }
}
```

The LLM can then retrieve documentation on demand through these tools:

| Tool | Description | ~Tokens |
|------|-------------|---------|
| `get_project_overview` | Project overview + module stats | 200 |
| `list_modules` | Module list with descriptions | 100-300 |
| `list_endpoints` | Endpoint summaries within a module | 200-500 |
| `get_endpoint_detail` | Full endpoint details | 500-2000 |
| `search_endpoints` | Keyword search | 200-500 |

## Project Structure

```
agent-fox/
├── packages/
│   ├── web/       # React frontend
│   ├── server/    # Express backend API (port 3000)
│   ├── mcp/       # MCP service (port 3001)
│   └── shared/    # Prisma client + shared types
├── prisma/        # Database schema + migrations
├── docker-compose.yml
└── docker-compose.dev.yml
```

## License

MIT
### docker-compose.dev.yml (new file, 68 lines)
@@ -0,0 +1,68 @@

```yaml
services:
  postgres:
    ports:
      - "5432:5432"

  server:
    build:
      context: .
      dockerfile: packages/server/Dockerfile
      target: deps
    extra_hosts:
      - "host.docker.internal:host-gateway"
    command: >
      sh -c "
      npx prisma generate --schema=prisma/schema.prisma &&
      cd /app/packages/shared && npx tsc &&
      cd /app &&
      pnpm --filter @agent-fox/server dev
      "
    volumes:
      - ./packages/shared/src:/app/packages/shared/src
      - ./packages/shared/tsconfig.json:/app/packages/shared/tsconfig.json
      - ./packages/server/src:/app/packages/server/src
      - ./prisma:/app/prisma
    environment:
      DATABASE_URL: postgresql://agentfox:agentfox@postgres:5432/agentfox
      JWT_SECRET: dev-secret
      JWT_REFRESH_SECRET: dev-refresh-secret
      SERVER_PORT: "3000"
      NODE_ENV: development

  mcp:
    build:
      context: .
      dockerfile: packages/mcp/Dockerfile
      target: deps
    command: >
      sh -c "
      npx prisma generate --schema=prisma/schema.prisma &&
      cd /app/packages/shared && npx tsc &&
      cd /app &&
      pnpm --filter @agent-fox/mcp dev
      "
    volumes:
      - ./packages/shared/src:/app/packages/shared/src
      - ./packages/shared/tsconfig.json:/app/packages/shared/tsconfig.json
      - ./packages/mcp/src:/app/packages/mcp/src
      - ./prisma:/app/prisma
    environment:
      DATABASE_URL: postgresql://agentfox:agentfox@postgres:5432/agentfox
      MCP_PORT: "3001"
      NODE_ENV: development

  web:
    build:
      context: .
      dockerfile: packages/web/Dockerfile
      target: build
    command: >
      sh -c "pnpm --filter @agent-fox/web exec vite --host 0.0.0.0 --port 5173"
    volumes:
      - ./packages/web/src:/app/packages/web/src
      - ./packages/web/index.html:/app/packages/web/index.html
    ports:
      - "5173:5173"
    environment:
      NODE_ENV: development
      API_URL: http://server:3000
```
### docker-compose.yml (new file, 57 lines)
@@ -0,0 +1,57 @@

```yaml
services:
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: agentfox
      POSTGRES_PASSWORD: agentfox
      POSTGRES_DB: agentfox
    volumes:
      - pgdata:/var/lib/postgresql/data
    ports:
      - "5432:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U agentfox"]
      interval: 5s
      timeout: 5s
      retries: 5

  server:
    build:
      context: .
      dockerfile: packages/server/Dockerfile
    environment:
      DATABASE_URL: postgresql://agentfox:agentfox@postgres:5432/agentfox
      JWT_SECRET: ${JWT_SECRET:-change-me-in-production}
      JWT_REFRESH_SECRET: ${JWT_REFRESH_SECRET:-change-me-refresh-in-production}
      SERVER_PORT: "3000"
    ports:
      - "3000:3000"
    depends_on:
      postgres:
        condition: service_healthy

  mcp:
    build:
      context: .
      dockerfile: packages/mcp/Dockerfile
    environment:
      DATABASE_URL: postgresql://agentfox:agentfox@postgres:5432/agentfox
      MCP_PORT: "3001"
    ports:
      - "3001:3001"
    depends_on:
      postgres:
        condition: service_healthy

  web:
    build:
      context: .
      dockerfile: packages/web/Dockerfile
    ports:
      - "80:80"
    depends_on:
      - server
      - mcp

volumes:
  pgdata:
```
### docs/superpowers/plans/2026-04-02-agent-fox-implementation.md (new file, 3551 lines)

File diff suppressed because it is too large.
### docs/superpowers/specs/2026-04-02-agent-fox-design.md (new file, 335 lines)
@@ -0,0 +1,335 @@

# Agent Fox - API Documentation MCP Service

## Context

Developers using LLMs (Claude, GPT, etc.) often need to reference API documentation while coding. Currently, they either paste entire API docs into the context (wasting tokens) or manually copy relevant sections. Agent Fox solves this by providing an MCP service that lets LLMs efficiently query API documentation through multi-level retrieval, minimizing token consumption while maximizing usefulness.

**Target**: Developer-facing SaaS product with user authentication and multi-tenant isolation.

## Architecture Overview

### Monorepo Structure (pnpm workspace)

```
agent-fox/
├── packages/
│   ├── web/      # React frontend (Vite + TailwindCSS + shadcn/ui)
│   ├── server/   # Express backend API
│   ├── mcp/      # MCP service (independent Express process)
│   └── shared/   # Shared types + Prisma client
├── prisma/       # Prisma schema + migrations
├── package.json
├── pnpm-workspace.yaml
└── tsconfig.base.json
```

- `server` and `mcp` are independently deployable processes sharing the same PostgreSQL database via the `shared` Prisma client.
- `web` is a static SPA served separately (or via CDN).

### Tech Stack

| Layer | Technology |
|-------|-----------|
| Frontend | React 19 + Vite + TailwindCSS + shadcn/ui + React Router + TanStack Query |
| Backend API | Express + TypeScript + Zod (validation) |
| MCP Service | `@modelcontextprotocol/sdk` (Streamable HTTP + SSE transports) |
| Database | PostgreSQL + Prisma ORM |
| OpenAPI Parsing | `@apidevtools/swagger-parser` |
| Auth | JWT (access + refresh) + bcrypt + Passport.js (GitHub/Google OAuth) |
| Language | TypeScript throughout |
## Data Model

### User
| Field | Type | Notes |
|-------|------|-------|
| id | UUID | Primary key |
| email | String | Unique, for email/password auth |
| passwordHash | String? | Nullable for OAuth-only users |
| name | String | Display name |
| avatarUrl | String? | Profile picture |
| createdAt | DateTime | |
| updatedAt | DateTime | |

### OAuthAccount
| Field | Type | Notes |
|-------|------|-------|
| id | UUID | Primary key |
| userId | UUID | FK → User |
| provider | String | "github" or "google" |
| providerAccountId | String | External account ID |
| createdAt | DateTime | |

### Project
| Field | Type | Notes |
|-------|------|-------|
| id | UUID | Primary key, exposed as project ID |
| userId | UUID | FK → User (owner) |
| name | String | Project display name |
| description | String? | Optional description |
| baseUrl | String? | API base URL |
| openApiSpec | JSONB | Full dereferenced OpenAPI document |
| openApiVersion | String | e.g., "3.0.3", "3.1.0" |
| apiKeyHash | String | Hashed API key for MCP access |
| createdAt | DateTime | |
| updatedAt | DateTime | |

### Module
| Field | Type | Notes |
|-------|------|-------|
| id | UUID | Primary key |
| projectId | UUID | FK → Project |
| name | String | Module name (from tag or path prefix) |
| description | String? | Module description |
| sortOrder | Int | Display order |
| source | Enum | "tag", "path_prefix", "manual" |
| createdAt | DateTime | |
| updatedAt | DateTime | |

### Endpoint
| Field | Type | Notes |
|-------|------|-------|
| id | UUID | Primary key |
| projectId | UUID | FK → Project |
| moduleId | UUID | FK → Module |
| method | String | HTTP method (GET, POST, PUT, DELETE, etc.) |
| path | String | URL path (e.g., /api/users/{id}) |
| summary | String? | Short description |
| description | String? | Detailed description |
| operationId | String? | OpenAPI operationId |
| parameters | JSONB | Path, query, header parameters |
| requestBody | JSONB? | Request body schema |
| responses | JSONB | Response schemas by status code |
| tags | String[] | Original OpenAPI tags |
| deprecated | Boolean | Whether the endpoint is deprecated |
| createdAt | DateTime | |
| updatedAt | DateTime | |
## MCP Multi-Level Retrieval Design (Core Feature)

The MCP service exposes 5 tools that enable LLMs to progressively drill down into API documentation, minimizing token usage at each step.

### Tool Definitions

#### 1. `get_project_overview`
- **Description** (shown to LLM): "Get an overview of this API project including its name, version, base URL, and a summary of available modules with endpoint counts. Call this first to understand what the API offers. This is usually sufficient to decide which module to drill into."
- **Input**: (none — projectId comes from the MCP connection URL)
- **Output**: `{ name, description, version, baseUrl, totalEndpoints, modules: [{ id, name, endpointCount }] }`
- **Estimated tokens**: ~200
- **Note**: This is the recommended entry point. It provides a compact overview including module names and counts, enough for LLMs to decide next steps.

#### 2. `list_modules`
- **Description**: "List all API modules/groups with their descriptions. Each module contains related endpoints. Use this when you need module descriptions to decide which module to explore."
- **Input**: (none)
- **Output**: `[{ id, name, description, endpointCount }]`
- **Estimated tokens**: ~100-300
- **Note**: Differs from `get_project_overview` by including module descriptions. Use when the module name alone isn't enough to determine relevance.

#### 3. `list_endpoints`
- **Description**: "List all endpoints in a specific module. Returns method, path, and summary for each endpoint. Use get_endpoint_detail to get full information about a specific endpoint."
- **Input**: `{ moduleId: string }`
- **Output**: `[{ id, method, path, summary, deprecated }]`
- **Estimated tokens**: ~200-500

#### 4. `get_endpoint_detail`
- **Description**: "Get complete details for a specific endpoint including parameters, request body schema, response schemas, and examples. Use this when you need to understand exactly how to call an endpoint."
- **Input**: `{ endpointId: string }`
- **Output**: `{ method, path, summary, description, parameters, requestBody, responses, deprecated }`
- **Estimated tokens**: ~500-2000

#### 5. `search_endpoints`
- **Description**: "Search for endpoints by keyword. Searches across path, summary, description, operationId, and parameter names. Optionally filter by module. Returns matching endpoint summaries."
- **Input**: `{ keyword: string, moduleId?: string }`
- **Output**: `[{ id, method, path, summary, moduleName, deprecated }]`
- **Estimated tokens**: ~200-500

### Retrieval Flow Example

```
LLM wants to call "create user" endpoint:
1. get_project_overview() → ~200 tokens (see all modules)
2. list_endpoints({ moduleId: "users" }) → ~300 tokens (see user endpoints)
3. get_endpoint_detail({ endpointId: "..." }) → ~800 tokens (get full details)
Total: ~1,300 tokens (vs 10,000+ for full doc dump)
```

### MCP Authentication

- MCP endpoint URL: `https://host/mcp/:projectId`
- Auth via `Authorization: Bearer <project-api-key>` header
- API key is validated against the project's `apiKeyHash`
- Each project has its own isolated MCP instance
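The header check described above can be sketched as a pure function. `extractApiKey` is a hypothetical name, and this sketch only does the shape check; the real middleware would additionally bcrypt-compare the extracted key against the project's `apiKeyHash`:

```typescript
// Sketch: pull a project API key out of the Authorization header.
// Returns null on a missing header, wrong scheme, or wrong key prefix.
// (The bcrypt comparison against apiKeyHash is a separate, async step.)
function extractApiKey(authHeader: string | undefined): string | null {
  if (!authHeader?.startsWith("Bearer ")) return null;
  const key = authHeader.slice("Bearer ".length).trim();
  return key.startsWith("afk_") ? key : null;
}
```

Keeping the parsing pure makes it trivial to unit-test independently of Express and the database.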

### MCP Transport

Support both transport protocols:
- **Streamable HTTP** (new standard): POST/GET/DELETE on `/mcp/:projectId`
- **SSE** (legacy): GET with SSE on `/mcp/:projectId/sse`, POST on `/mcp/:projectId/messages`

Use the `@modelcontextprotocol/sdk` transports mounted on Express routes, with session management for stateful connections.
## Frontend Pages

### 1. Auth Pages
- Login: email/password form + GitHub/Google OAuth buttons
- Register: email/password form + OAuth

### 2. Projects List Page
- Card grid of user's projects
- Each card: project name, version, endpoint count, created date
- Create project button → import flow

### 3. Project Detail Page (Tabbed)

**Tab: Documentation Preview**
- Interactive API doc browser (similar to Swagger UI)
- Left sidebar: module list (collapsible)
- Main area: endpoint list grouped by module, expandable to show details
- Supports try-it-out (optional, future feature)

**Tab: Module Management**
- View auto-generated modules
- Drag-and-drop reorder
- Move endpoints between modules
- Create/rename/delete modules

**Tab: MCP Integration**
- MCP service URL (copyable)
- API Key display (masked, with copy and rotate buttons)
- Configuration snippet for Claude Code, Cursor, etc. (copyable JSON)
- Connection status indicator

**Tab: Settings**
- Project name/description editing
- Re-import OpenAPI document (with diff preview)
- Danger zone: delete project

### Import Flow
1. User uploads JSON/YAML file OR pastes URL
2. Backend validates with `@apidevtools/swagger-parser`
3. Backend dereferences all `$ref` pointers
4. Parses tags → Module records, paths → Endpoint records
5. Endpoints without tags auto-grouped by path prefix (first segment)
6. Preview shown to user with module/endpoint breakdown
7. User confirms → data saved
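Steps 4-5 of the import flow reduce to a small pure function. This is a sketch; `moduleNameFor` is a hypothetical name, not the repo's parser code:

```typescript
// Sketch of module assignment during import: prefer the operation's
// first tag, otherwise fall back to the first URL path segment,
// and use "default" for paths with no segments (e.g. "/").
function moduleNameFor(tags: string[], path: string): string {
  if (tags.length > 0) return tags[0];
  return path.split("/").filter(Boolean)[0] ?? "default";
}
```

Running every endpoint through one function keeps tag-based and prefix-based grouping consistent, so re-imports land endpoints in the same modules.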

## Backend API

### Auth Routes
```
POST /api/auth/register           # Email registration
POST /api/auth/login              # Email login → returns JWT pair
POST /api/auth/refresh            # Refresh access token
GET  /api/auth/github             # GitHub OAuth redirect
GET  /api/auth/google             # Google OAuth redirect
GET  /api/auth/callback/:provider # OAuth callback
```

### Project Routes
```
GET    /api/projects       # List user's projects
POST   /api/projects       # Create project (upload OpenAPI doc)
GET    /api/projects/:id   # Get project details
PUT    /api/projects/:id   # Update project metadata
DELETE /api/projects/:id   # Delete project
```

### Module/Endpoint Routes
```
GET    /api/projects/:id/modules        # List modules
PUT    /api/projects/:id/modules/:mid   # Update module (rename, reorder)
POST   /api/projects/:id/modules        # Create manual module
DELETE /api/projects/:id/modules/:mid   # Delete module
PATCH  /api/projects/:id/endpoints/:eid # Move endpoint to different module
```

### Import/Key Routes
```
POST /api/projects/:id/reimport       # Re-import OpenAPI document
POST /api/projects/:id/api-key/rotate # Rotate API key
```

### Authentication Design
- **User auth**: JWT dual-token (access: 15min, refresh: 7d)
- **Password**: bcrypt hashed
- **OAuth**: Passport.js strategies for GitHub and Google
- **MCP auth**: Project-level API key (independent from user JWT)
  - API key format: `afk_` prefix + 32-char random string
  - API key stored as bcrypt hash in database
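Key generation under this format might look like the following sketch (`generateApiKey` is a hypothetical helper; per the design, only a bcrypt hash of the result would ever be stored):

```typescript
import { randomBytes } from "node:crypto";

// Sketch: produce an `afk_`-prefixed key with 32 random characters.
// 16 random bytes hex-encoded yield exactly 32 chars.
function generateApiKey(): string {
  return "afk_" + randomBytes(16).toString("hex");
}
```

The plaintext key is shown to the user once at creation or rotation; afterwards the server can only verify it against the stored hash.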

## Deployment (Docker Compose)

All services containerized and orchestrated via Docker Compose for one-command deployment.

### Services

```yaml
services:
  postgres:  # PostgreSQL 16
  server:    # Backend API (Express) - port 3000
  mcp:       # MCP service (Express) - port 3001
  web:       # Frontend (Nginx serving static build) - port 80
  redis:     # Optional: session store / rate limiting cache
```

### Container Details

| Service | Base Image | Notes |
|---------|-----------|-------|
| `postgres` | `postgres:16-alpine` | Persistent volume for data, init scripts for DB creation |
| `server` | `node:20-alpine` | Multi-stage build: build → runtime only |
| `mcp` | `node:20-alpine` | Same multi-stage build pattern |
| `web` | `node:20-alpine` → `nginx:alpine` | Build stage + Nginx serve stage |
| `redis` | `redis:7-alpine` | Optional, for rate limiting and session cache |

### Docker Files

```
agent-fox/
├── docker-compose.yml       # Orchestration
├── docker-compose.dev.yml   # Dev overrides (hot reload, debug ports)
├── packages/
│   ├── web/Dockerfile
│   ├── server/Dockerfile
│   └── mcp/Dockerfile
└── .env.example             # Environment variable template
```

### Environment Variables

Managed via `.env` file (git-ignored), with `.env.example` as template:
- `DATABASE_URL` — PostgreSQL connection string
- `JWT_SECRET` — JWT signing key
- `GITHUB_CLIENT_ID`, `GITHUB_CLIENT_SECRET` — GitHub OAuth
- `GOOGLE_CLIENT_ID`, `GOOGLE_CLIENT_SECRET` — Google OAuth
- `MCP_BASE_URL` — Public URL for MCP service
- `REDIS_URL` — Optional Redis connection

### Dev vs Prod

- **Dev** (`docker-compose -f docker-compose.yml -f docker-compose.dev.yml up`): Source mounted as volumes, hot reload enabled, debug ports exposed
- **Prod** (`docker-compose up`): Optimized multi-stage builds, no source mounting, Nginx serves frontend

## Error Handling

- All API responses follow `{ success: boolean, data?: T, error?: { code, message } }` format
- Zod validation on all inputs
- MCP tools return structured error messages that help LLMs self-correct
- Rate limiting on MCP endpoints to prevent abuse
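Funneling thrown errors into that shared envelope can be sketched as follows (`toErrorEnvelope` and the `INTERNAL_ERROR` code are assumed names for illustration, not from the source):

```typescript
// Sketch: normalize any thrown value into the `{ success: false, error }`
// envelope so API routes and MCP tools fail in a machine-parseable shape.
function toErrorEnvelope(err: unknown): {
  success: false;
  error: { code: string; message: string };
} {
  const message = err instanceof Error ? err.message : String(err);
  return { success: false, error: { code: "INTERNAL_ERROR", message } };
}
```

A central error-handling middleware applying this keeps structured, self-correctable messages flowing to LLM clients even for unexpected failures.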

## Verification Plan

### Unit Tests
- OpenAPI parsing: validate correct module/endpoint extraction from sample docs
- MCP tools: verify each tool returns correct data shape and respects scoping
- Auth: test JWT generation, validation, refresh flow

### Integration Tests
- Full import flow: upload OpenAPI doc → verify modules/endpoints created correctly
- MCP retrieval flow: simulate LLM calling tools in sequence
- Auth flow: register → login → access protected routes

### Manual Testing
- Import Petstore OpenAPI sample → verify preview and module grouping
- Configure Claude Code with generated MCP config → verify tools work
- Test search with various keywords → verify relevance
### packages/mcp/Dockerfile (new file, 37 lines)
@@ -0,0 +1,37 @@

```dockerfile
FROM node:20-alpine AS base
RUN corepack enable && corepack prepare pnpm@latest --activate
WORKDIR /app

FROM base AS deps
COPY pnpm-lock.yaml pnpm-workspace.yaml package.json tsconfig.base.json ./
COPY packages/shared/package.json packages/shared/
COPY packages/shared/tsconfig.json packages/shared/
COPY packages/mcp/package.json packages/mcp/
COPY packages/mcp/tsconfig.json packages/mcp/
COPY prisma/ prisma/
RUN pnpm install --frozen-lockfile

FROM base AS build
COPY --from=deps /app/ ./
COPY packages/shared/ packages/shared/
COPY packages/mcp/ packages/mcp/
COPY prisma/ prisma/
COPY tsconfig.base.json ./
RUN npx prisma generate --schema=prisma/schema.prisma
RUN pnpm --filter @agent-fox/shared build
RUN pnpm --filter @agent-fox/mcp build

FROM node:20-alpine AS runtime
WORKDIR /app
COPY --from=build /app/node_modules ./node_modules
COPY --from=build /app/packages/shared/dist ./packages/shared/dist
COPY --from=build /app/packages/shared/node_modules ./packages/shared/node_modules
COPY --from=build /app/packages/shared/package.json ./packages/shared/
COPY --from=build /app/packages/mcp/dist ./packages/mcp/dist
COPY --from=build /app/packages/mcp/node_modules ./packages/mcp/node_modules
COPY --from=build /app/packages/mcp/package.json ./packages/mcp/
COPY --from=build /app/prisma ./prisma

WORKDIR /app/packages/mcp
EXPOSE 3001
CMD ["node", "dist/index.js"]
```
### packages/server/Dockerfile (new file, 40 lines)
@@ -0,0 +1,40 @@

```dockerfile
FROM node:20-alpine AS base
RUN corepack enable && corepack prepare pnpm@latest --activate
WORKDIR /app

FROM base AS deps
COPY pnpm-lock.yaml pnpm-workspace.yaml package.json tsconfig.base.json ./
COPY packages/shared/package.json packages/shared/
COPY packages/shared/tsconfig.json packages/shared/
COPY packages/server/package.json packages/server/
COPY packages/server/tsconfig.json packages/server/
COPY prisma/ prisma/
RUN pnpm install --frozen-lockfile

FROM base AS build
COPY --from=deps /app/ ./
COPY packages/shared/ packages/shared/
COPY packages/server/ packages/server/
COPY prisma/ prisma/
COPY tsconfig.base.json ./
RUN npx prisma generate --schema=prisma/schema.prisma
RUN pnpm --filter @agent-fox/shared build
RUN pnpm --filter @agent-fox/server build

FROM node:20-alpine AS runtime
WORKDIR /app
COPY --from=build /app/node_modules ./node_modules
COPY --from=build /app/packages/shared/dist ./packages/shared/dist
COPY --from=build /app/packages/shared/node_modules ./packages/shared/node_modules
COPY --from=build /app/packages/shared/package.json ./packages/shared/
COPY --from=build /app/packages/server/dist ./packages/server/dist
COPY --from=build /app/packages/server/node_modules ./packages/server/node_modules
COPY --from=build /app/packages/server/package.json ./packages/server/
COPY --from=build /app/prisma ./prisma
COPY scripts/migrate-and-start.sh ./scripts/

RUN chmod +x scripts/migrate-and-start.sh
RUN npm install -g prisma@6

EXPOSE 3000
CMD ["sh", "scripts/migrate-and-start.sh"]
```
@@ -1,7 +1,8 @@
 import SwaggerParser from '@apidevtools/swagger-parser';
-import type { OpenAPIV3, OpenAPIV3_1 } from 'openapi-types';
+import type { OpenAPI, OpenAPIV2, OpenAPIV3, OpenAPIV3_1 } from 'openapi-types';
 
 type OpenApiDoc = OpenAPIV3.Document | OpenAPIV3_1.Document;
+type SwaggerDoc = OpenAPIV2.Document;
 
 export type ParsedModule = {
   name: string;
@@ -34,27 +35,55 @@ export type ParseResult = {
   endpoints: ParsedEndpoint[];
 };
 
-export async function parseOpenApiDocument(input: string | object): Promise<ParseResult> {
-  const rawApi = await SwaggerParser.validate(input as any);
-  const api = await SwaggerParser.dereference(rawApi as any) as OpenApiDoc;
+function isSwagger2(api: OpenAPI.Document): api is SwaggerDoc {
+  return 'swagger' in api && (api as any).swagger?.startsWith('2.');
+}
 
-  const openApiVersion = 'openapi' in api ? api.openapi : 'unknown';
-  const name = api.info.title;
-  const description = api.info.description || null;
-  const version = api.info.version;
-  const baseUrl = api.servers?.[0]?.url || null;
+function parseSwagger2Endpoints(api: SwaggerDoc): { endpoints: ParsedEndpoint[]; baseUrl: string | null } {
+  const baseUrl = api.basePath || (api.host ? `http://${api.host}${api.basePath || ''}` : null);
+  const endpoints: ParsedEndpoint[] = [];
+  const paths = api.paths || {};
 
-  const tagMap = new Map<string, string | null>();
-  if (api.tags) {
-    for (const tag of api.tags) {
-      tagMap.set(tag.name, tag.description || null);
+  for (const [pathStr, pathItem] of Object.entries(paths)) {
+    if (!pathItem) continue;
+    const methods = ['get', 'post', 'put', 'delete', 'patch', 'head', 'options'] as const;
+    for (const method of methods) {
+      const operation = (pathItem as Record<string, unknown>)[method] as OpenAPIV2.OperationObject | undefined;
+      if (!operation) continue;
+
+      const endpointTags = operation.tags || [];
+      const prefix = pathStr.split('/').filter(Boolean)[0] || 'default';
+      const moduleName = endpointTags[0] || prefix;
+
+      // Convert Swagger 2 body parameter to requestBody-like structure
+      const params = (operation.parameters || []) as OpenAPIV2.Parameter[];
+      const bodyParam = params.find((p: any) => p.in === 'body');
+      const nonBodyParams = params.filter((p: any) => p.in !== 'body');
+
+      endpoints.push({
+        method: method.toUpperCase(),
+        path: pathStr,
+        summary: operation.summary || null,
+        description: operation.description || null,
+        operationId: operation.operationId || null,
+        parameters: nonBodyParams as unknown[],
+        requestBody: bodyParam ? { schema: (bodyParam as any).schema } : null,
+        responses: (operation.responses || {}) as Record<string, unknown>,
+        tags: endpointTags,
+        deprecated: operation.deprecated || false,
+        moduleName,
+      });
     }
   }
 
-  const endpoints: ParsedEndpoint[] = [];
-  const usedTags = new Set<string>();
+  return { endpoints, baseUrl };
+}
+
+function parseOpenApi3Endpoints(api: OpenApiDoc): { endpoints: ParsedEndpoint[]; baseUrl: string | null } {
+  const baseUrl = api.servers?.[0]?.url || null;
+  const endpoints: ParsedEndpoint[] = [];
+  const paths = api.paths || {};
 
   for (const [pathStr, pathItem] of Object.entries(paths)) {
     if (!pathItem) continue;
     const methods = ['get', 'post', 'put', 'delete', 'patch', 'head', 'options'] as const;
@@ -63,11 +92,6 @@ export async function parseOpenApiDocument(input: string | object): Promise<Pars
       if (!operation) continue;
 
      const endpointTags = operation.tags || [];
-      for (const tag of endpointTags) {
-        usedTags.add(tag);
-        if (!tagMap.has(tag)) tagMap.set(tag, null);
-      }
-
       const prefix = pathStr.split('/').filter(Boolean)[0] || 'default';
       const moduleName = endpointTags[0] || prefix;
 
@@ -87,6 +111,55 @@ export async function parseOpenApiDocument(input: string | object): Promise<Pars
     }
   }
 
+  return { endpoints, baseUrl };
+}
+
+export async function parseOpenApiDocument(input: string | object): Promise<ParseResult> {
+  let specInput: string | object = input;
+
+  // If input is a URL, fetch the content first so that swagger-parser
+  // works on a plain object and doesn't need network access for $ref resolution
+  if (typeof input === 'string' && input.startsWith('http')) {
+    const res = await fetch(input);
+    if (!res.ok) throw new Error(`Failed to fetch spec from URL: ${res.status} ${res.statusText}`);
+    specInput = await res.json();
+  }
+
+  // Bundle resolves all $refs into a single document, then dereference inlines them
+  const bundled = await SwaggerParser.bundle(specInput as any) as OpenAPI.Document;
+  const api = await SwaggerParser.dereference(bundled, {
+    dereference: { circular: 'ignore' },
+  }) as OpenAPI.Document;
+
+  const isV2 = isSwagger2(api);
+  const openApiVersion = isV2 ? (api as SwaggerDoc).swagger : ('openapi' in api ? (api as any).openapi : 'unknown');
+  const name = api.info.title;
+  const description = api.info.description || null;
+  const version = api.info.version;
+
+  // Collect tags
+  const tagMap = new Map<string, string | null>();
+  if (api.tags) {
+    for (const tag of api.tags) {
+      tagMap.set(tag.name, tag.description || null);
+    }
+  }
+
+  // Parse endpoints based on spec version
+  const { endpoints, baseUrl } = isV2
+    ? parseSwagger2Endpoints(api as SwaggerDoc)
+    : parseOpenApi3Endpoints(api as OpenApiDoc);
+
+  // Track used tags
+  const usedTags = new Set<string>();
+  for (const ep of endpoints) {
+    for (const tag of ep.tags) {
+      usedTags.add(tag);
+      if (!tagMap.has(tag)) tagMap.set(tag, null);
+    }
+  }
+
+  // Build modules
   const modules: ParsedModule[] = [];
   const moduleNames = new Set<string>();
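The spec-version dispatch in this diff hinges on one fact: Swagger 2 documents carry a `swagger` field while OpenAPI 3.x documents carry `openapi`. A hypothetical, dependency-free distillation of that guard (the real code narrows `OpenAPI.Document` from openapi-types instead of this simplified shape):

```typescript
// Simplified document shape for illustration only.
type AnyDoc = { swagger?: string; openapi?: string };

// Mirrors isSwagger2 from the diff: presence of `swagger` starting with "2.".
function isSwagger2(api: AnyDoc): boolean {
  return 'swagger' in api && api.swagger !== undefined && api.swagger.startsWith('2.');
}

// Mirrors the openApiVersion selection: prefer `swagger`, else `openapi`.
function specVersion(api: AnyDoc): string {
  if (isSwagger2(api)) return api.swagger as string;
  return api.openapi ?? 'unknown';
}

console.log(specVersion({ swagger: '2.0' }));   // "2.0"
console.log(specVersion({ openapi: '3.1.0' })); // "3.1.0"
console.log(specVersion({}));                   // "unknown"
```

Keeping the guard as a type predicate (as the diff does) lets the caller branch once and then use the narrowed `SwaggerDoc` or `OpenApiDoc` type safely.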
17
packages/web/Dockerfile
Normal file
@@ -0,0 +1,17 @@
FROM node:20-alpine AS build
RUN corepack enable && corepack prepare pnpm@latest --activate
WORKDIR /app

COPY pnpm-lock.yaml pnpm-workspace.yaml package.json ./
COPY packages/web/package.json packages/web/
RUN pnpm install --frozen-lockfile --filter @agent-fox/web...

COPY packages/web/ packages/web/
COPY tsconfig.base.json ./
RUN pnpm --filter @agent-fox/web build

FROM nginx:alpine
COPY --from=build /app/packages/web/dist /usr/share/nginx/html
COPY packages/web/nginx.conf /etc/nginx/conf.d/default.conf
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
25
packages/web/nginx.conf
Normal file
@@ -0,0 +1,25 @@
server {
    listen 80;
    root /usr/share/nginx/html;
    index index.html;

    location /api/ {
        proxy_pass http://server:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }

    location /mcp/ {
        proxy_pass http://mcp:3001;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_http_version 1.1;
        proxy_set_header Connection '';
        proxy_buffering off;
        proxy_cache off;
    }

    location / {
        try_files $uri $uri/ /index.html;
    }
}
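The `/mcp/` block disables `proxy_buffering` and `proxy_cache` so that server-sent events pass through as soon as the upstream flushes them; buffering would hold whole responses and stall streaming clients. As an illustration of what arrives on the wire, here is a hypothetical parser for one raw SSE chunk (not part of this repository):

```typescript
// Split a raw SSE text chunk into its `data:` payloads.
// Events are separated by a blank line; each event may hold several data lines.
function parseSseChunk(chunk: string): string[] {
  return chunk
    .split('\n\n') // one entry per event
    .map((event) =>
      event
        .split('\n')
        .filter((line) => line.startsWith('data:'))
        .map((line) => line.slice('data:'.length).trim())
        .join('\n'),
    )
    .filter((data) => data.length > 0); // drop empty trailing fragments
}

console.log(parseSseChunk('data: hello\n\ndata: world\n\n')); // [ 'hello', 'world' ]
```

With buffering on, the client would receive both events in one delayed burst instead of two timely chunks, which is why the config turns it off for the MCP route.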
@@ -66,7 +66,13 @@ export async function apiFetch<T>(path: string, options: RequestInit = {}): Prom
     }
   }
 
-  const json: ApiResponse<T> = await res.json();
+  const text = await res.text();
+  let json: ApiResponse<T>;
+  try {
+    json = JSON.parse(text);
+  } catch {
+    throw new Error(`Server error (${res.status})`);
+  }
   if (!json.success) {
     throw new Error(json.error?.message || 'Request failed');
   }
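The change above reads the body as text before parsing, so a non-JSON error page (for example an HTML 502 from a proxy) surfaces as a clear `Server error (...)` instead of an opaque `SyntaxError` from `res.json()`. A minimal standalone sketch of the same pattern (hypothetical helper, not the project's actual `apiFetch`):

```typescript
type ApiResponse<T> = { success: boolean; data?: T; error?: { message?: string } };

// Parse an already-read response body, turning parse failures into a
// status-labelled error rather than leaking a JSON SyntaxError.
function parseApiBody<T>(status: number, text: string): T {
  let json: ApiResponse<T>;
  try {
    json = JSON.parse(text);
  } catch {
    throw new Error(`Server error (${status})`);
  }
  if (!json.success) {
    throw new Error(json.error?.message || 'Request failed');
  }
  return json.data as T;
}

console.log(parseApiBody<number>(200, '{"success":true,"data":42}')); // 42
```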
@@ -6,17 +6,20 @@ type Project = { id: string; name: string };
 
 export default function McpIntegration({ project }: { project: Project }) {
   const [apiKey, setApiKey] = useState<string | null>(null);
-  const mcpBaseUrl = window.location.origin;
-  const mcpUrl = `${mcpBaseUrl}/mcp/${project.id}`;
+  const mcpHost = window.location.hostname;
+  const mcpUrl = `http://${mcpHost}:3001/mcp/${project.id}`;
 
   const rotateMutation = useMutation({
     mutationFn: () => apiFetch<{ apiKey: string }>(`/projects/${project.id}/api-key/rotate`, { method: 'POST' }),
     onSuccess: (data) => setApiKey(data.apiKey),
   });
 
+  const serverName = project.name.toLowerCase().replace(/[^a-z0-9]+/g, '-').replace(/(^-|-$)/g, '');
+
   const configSnippet = JSON.stringify({
     mcpServers: {
-      [project.name.toLowerCase().replace(/\s+/g, '-')]: {
+      [serverName]: {
         type: 'http',
         url: mcpUrl,
         headers: { Authorization: `Bearer ${apiKey || '<your-api-key>'}` },
       },
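The new `serverName` derivation tightens the old `\s+`-only replacement: it collapses every run of non-alphanumerics (spaces, punctuation, parentheses) to a single dash and strips a leading or trailing dash, so arbitrary project names become safe MCP server keys. A standalone sketch of that logic for illustration (hypothetical helper, extracted from the component above):

```typescript
// Lowercase, collapse non-alphanumeric runs to '-', trim edge dashes.
function toServerName(name: string): string {
  return name
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')
    .replace(/(^-|-$)/g, '');
}

console.log(toServerName('Agent Fox (Prod)!')); // "agent-fox-prod"
```

Under the previous regex, `'Agent Fox (Prod)!'` would have produced `agent-fox-(prod)!`, which is an awkward key in a JSON `mcpServers` map; the new form avoids that.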
@@ -7,7 +7,7 @@ export default defineConfig({
   server: {
     port: 5173,
     proxy: {
-      '/api': 'http://localhost:3000',
+      '/api': process.env.API_URL || 'http://localhost:3000',
    },
  },
});
9
scripts/migrate-and-start.sh
Executable file
@@ -0,0 +1,9 @@
#!/bin/sh
set -e

echo "Running database migrations..."
npx prisma migrate deploy --schema=prisma/schema.prisma

echo "Starting server..."
cd /app/packages/server
exec node dist/index.js