Modern Model Context Protocol (MCP) server proxy enabling fast, type-safe integration with multiple AI providers like Anthropic Claude and OpenAI, deployed globally on Cloudflare Workers with CORS support and health monitoring.
A modern AI service proxy built with Cloudflare Workers and the Hono framework, supporting multiple AI providers including Anthropic Claude and OpenAI.
Install dependencies:
pnpm install
Copy the example environment file:
cp .env.example .env
Edit .env with your API keys and preferences.
Start the development server:
pnpm run dev
The server will start in development mode with hot reloading enabled.
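To sanity-check the local server, you can hit the health route. This is a minimal sketch that assumes Wrangler's default local port of 8787; the port may differ if it is overridden in wrangler.jsonc.

```ts
// Smoke test against the local dev server.
// Port 8787 is Wrangler's default and an assumption here.
const res = await fetch("http://localhost:8787/health");
console.log(res.status, await res.text());
```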
Deploy to Cloudflare Workers:
pnpm run deploy
GET /health
GET /api/provider
POST /api/mcp
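The request and response shapes for these routes are not documented here; the sketch below only illustrates how a client might call them. The base URL and the body fields (provider, messages) are assumptions, not the server's actual API.

```ts
// Hypothetical client calls against the deployed Worker.
// BASE_URL and all request/response fields are assumptions.
const BASE_URL = "https://your-worker.example.workers.dev";

// Health check (the project advertises health monitoring).
const health = await fetch(`${BASE_URL}/health`);
console.log("health:", await health.json());

// Provider info; the response shape is a guess.
const providers = await fetch(`${BASE_URL}/api/provider`);
console.log("providers:", await providers.json());

// Proxy an MCP-style request; body fields are illustrative only.
const mcp = await fetch(`${BASE_URL}/api/mcp`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    provider: "anthropic", // assumed field
    messages: [{ role: "user", content: "Hello" }], // assumed field
  }),
});
console.log("mcp:", await mcp.json());
```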
├── src/
│   ├── controllers/    # Request handlers
│   ├── models/         # Type definitions
│   ├── services/       # AI service implementations
│   └── index.ts        # Main application entry
├── public/             # Static assets
└── wrangler.jsonc      # Cloudflare Workers configuration
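To show how these pieces might fit together, here is a minimal Hono sketch in the spirit of src/index.ts. The handlers and response shapes are illustrative assumptions, not the repository's actual code; real logic would live in controllers/ and services/.

```ts
// src/index.ts (illustrative sketch, not the actual source)
import { Hono } from "hono";
import { cors } from "hono/cors";

const app = new Hono();

// CORS for all routes, as advertised in the project description.
app.use("*", cors());

// Simple health endpoint.
app.get("/health", (c) => c.json({ status: "ok" }));

// Provider listing; the response shape here is an assumption.
app.get("/api/provider", (c) =>
  c.json({ providers: ["anthropic", "openai"] })
);

// MCP proxy endpoint; forwarding to the AI provider is omitted in this sketch.
app.post("/api/mcp", async (c) => {
  const body = await c.req.json();
  return c.json({ received: body });
});

export default app;
```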
License: MIT