A server based on the Model Context Protocol (MCP) that parses Swagger/OpenAPI documents and generates TypeScript types and API client code for different frameworks (Axios, Fetch, React Query).
```bash
npm install
# or with pnpm
pnpm install
```
```bash
node start-server.js
```
The server communicates over standard input/output (stdio) by default.
You can interact with the MCP server over stdio. Here are some examples:
```bash
# Parse a Swagger document
node examples/optimized-swagger-parser-example.js

# Generate TypeScript types
node examples/typescript-generator-example.js

# Generate an API client
node examples/api-client-generator-example.js
```
{ "method": "parse-swagger", "params": { "url": "https://petstore3.swagger.io/api/v3/openapi.json", "includeSchemas": true, "includeDetails": true } }
Suitable for full parsing, with advanced options:
{ "method": "parse-swagger-optimized", "params": { "url": "https://petstore3.swagger.io/api/v3/openapi.json", "includeSchemas": true, "includeDetails": true, "useCache": true, "skipValidation": true, "cacheTTLMinutes": 60, "lazyLoading": false, "filterTag": "pet" } }
Optimized for large documents: fast, but returns only basic information:
{ "method": "parse-swagger-lite", "params": { "url": "https://petstore3.swagger.io/api/v3/openapi.json", "includeSchemas": false, "includeDetails": false, "useCache": true, "skipValidation": true } }
{ "method": "generate-typescript-types", "params": { "swaggerUrl": "https://petstore3.swagger.io/api/v3/openapi.json", "outputDir": "./generated/types", "namespace": "PetStore", "strictTypes": true, "generateEnums": true, "generateIndex": true } }
{ "method": "generate-typescript-types-optimized", "params": { "swaggerUrl": "https://petstore3.swagger.io/api/v3/openapi.json", "outputDir": "./generated/types", "namespace": "PetStore", "strictTypes": true, "useCache": true, "skipValidation": true, "lazyLoading": true, "includeSchemas": ["Pet", "Order", "User"] } }
{ "method": "generate-api-client", "params": { "swaggerUrl": "https://petstore3.swagger.io/api/v3/openapi.json", "outputDir": "./generated/api", "clientType": "axios", "generateTypeImports": true, "typesImportPath": "../types", "groupBy": "tag" } }
{ "method": "generate-api-client-optimized", "params": { "swaggerUrl": "https://petstore3.swagger.io/api/v3/openapi.json", "outputDir": "./generated/api", "clientType": "react-query", "generateTypeImports": true, "typesImportPath": "../types", "groupBy": "tag", "useCache": true, "skipValidation": true, "lazyLoading": true, "includeTags": ["pet", "store"] } }
{ "method": "file-writer", "params": { "filePath": "./output.txt", "content": "Hello, world!", "createDirs": true } }
For large API documents, a configuration like the following is recommended:
{ "method": "parse-swagger-lite", "params": { "url": "https://your-large-api-doc-url.json", "useCache": true, "skipValidation": true, "lazyLoading": true, "filterTag": "your-specific-tag", "includeSchemas": false } }
The following API client frameworks are currently supported: Axios, Fetch, and React Query.
Example: generating a React Query client:
{ "method": "generate-api-client-optimized", "params": { "swaggerUrl": "https://petstore3.swagger.io/api/v3/openapi.json", "outputDir": "./generated/react-query", "clientType": "react-query", "generateTypeImports": true } }
The API document cache is stored in the `.api-cache` directory. To clear the cache:
- delete the `.api-cache` directory, or
- pass the `useCache: false` parameter

Server settings can be customized in `swagger-mcp-config.json`:
{ "name": "Swagger MCP Server", "version": "1.0.0", "transport": "stdio" }
Start the server for debugging:
```bash
node start-server.js
```
Then connect with the MCP Inspector:
```bash
npx @modelcontextprotocol/inspector pipe -- node start-server.js
```
Or connect directly (this may produce interleaved, confusing output):
```bash
npx @modelcontextprotocol/inspector -- node start-server.js
```
See the road.md file for the development plan and progress.
To install swagger-mcp-server for Claude Desktop automatically via Smithery:
```bash
npx -y @smithery/cli install @tuskermanshu/swagger-mcp-server --client claude
```
```bash
# Build the project
npm run build
# or with pnpm
pnpm build
```
- `parse-swagger` - Parses a Swagger/OpenAPI document and returns API operation information
- `parse-swagger-optimized` - Parses a Swagger/OpenAPI document (optimized version)
- `parse-swagger-lite` - Lightweight parsing of Swagger/OpenAPI documents, optimized for large documents
- `generate-typescript-types` - Generates TypeScript type definitions from a Swagger/OpenAPI document
- `generate-typescript-types-optimized` - Generates TypeScript type definitions from a Swagger/OpenAPI document (optimized version)
- `generate-api-client` - Generates API client code from a Swagger/OpenAPI document
- `generate-api-client-optimized` - Generates API client code from a Swagger/OpenAPI document (optimized version)
- `file-writer` - Writes content to the file system