An open-standard server implementation that lets AI assistants access APIs and services directly through the Model Context Protocol, built on Cloudflare Workers for scalability.
Model Context Protocol (MCP) is an open standard that enables AI agents and assistants to interact with services. By setting up an MCP server, you can allow AI assistants to access your APIs directly.
Cloudflare Workers, combined with the `workers-mcp` package, provide a powerful and scalable foundation for building MCP servers.
Before starting, ensure you have Wrangler installed:

```sh
npm install -g wrangler
```
First, initialize a new Cloudflare Worker project:

```sh
npx create-cloudflare@latest my-mcp-worker
cd my-mcp-worker
```
Then, authenticate your Cloudflare account:
```sh
wrangler login
```
Update your `wrangler.toml` file with the correct account details:

```toml
name = "my-mcp-worker"
main = "src/index.ts"
compatibility_date = "2025-03-03"
account_id = "your-account-id"
```
To enable MCP support, install the `workers-mcp` package:

```sh
npm install workers-mcp
```
Run the setup command to configure MCP:
```sh
npx workers-mcp setup
```
This configures your Worker to serve MCP requests.
Update your `src/index.ts` to define your MCP server:

```typescript
import { WorkerEntrypoint } from 'cloudflare:workers';
import { ProxyToSelf } from 'workers-mcp';

export default class MyWorker extends WorkerEntrypoint {
  /**
   * A friendly greeting from your MCP server.
   * @param name {string} The name of the user.
   * @return {string} A personalized greeting.
   */
  sayHello(name: string) {
    return `Hello from an MCP Worker, ${name}!`;
  }

  /**
   * @ignore
   */
  async fetch(request: Request): Promise<Response> {
    return new ProxyToSelf(this).fetch(request);
  }
}
```
You can extend your MCP server by integrating with external APIs. Here's an example of fetching weather data:
```typescript
export default class WeatherWorker extends WorkerEntrypoint {
  /**
   * Fetch weather data for a given location.
   * @param location {string} The city or ZIP code.
   * @return {object} Weather details.
   */
  async getWeather(location: string) {
    const response = await fetch(`https://api.weather.example/v1/${location}`);
    const data = await response.json();
    return {
      temperature: data.temp,
      conditions: data.conditions,
      forecast: data.forecast
    };
  }

  async fetch(request: Request): Promise<Response> {
    return new ProxyToSelf(this).fetch(request);
  }
}
```
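The example above assumes the upstream API always responds with a complete payload. In practice you may want to normalize the parsed JSON so the tool returns a predictable shape even when fields are missing. Here is a minimal sketch of that step; the `RawWeather` field names mirror the hypothetical API above and are assumptions, not part of any real weather service:

```typescript
// Hypothetical shape of the upstream API's JSON payload (assumed fields).
interface RawWeather {
  temp?: number;
  conditions?: string;
  forecast?: string[];
}

interface WeatherResult {
  temperature: number | null;
  conditions: string;
  forecast: string[];
}

// Normalize a raw payload, tolerating missing or malformed fields so the
// MCP tool always returns the same shape to the AI assistant.
function normalizeWeather(raw: RawWeather): WeatherResult {
  return {
    temperature: typeof raw.temp === "number" ? raw.temp : null,
    conditions: raw.conditions ?? "unknown",
    forecast: Array.isArray(raw.forecast) ? raw.forecast : [],
  };
}
```

Inside `getWeather` you would also check `response.ok` before parsing, then pass the parsed JSON through a helper like this instead of returning the raw fields directly.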
Once your Worker is set up, deploy it to Cloudflare:
```sh
npx wrangler deploy
```
After deployment, your Worker is live and AI assistants can discover and use your MCP tools.
To update your MCP server, redeploy with:
```sh
npm run deploy
```
To test your MCP setup locally:
```sh
npx workers-mcp proxy
```
This command starts a local proxy allowing MCP clients (like Claude Desktop) to connect.
To secure your MCP server, use Wrangler Secrets:
```sh
npx wrangler secret put MCP_SECRET
```
This adds a shared-secret authentication mechanism to prevent unauthorized access.
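One way to use that secret is to reject requests that don't present it before handing them to `ProxyToSelf`. The sketch below is an assumption about how you might wire this up yourself (the `Authorization: Bearer` header convention and both helper names are hypothetical, not part of `workers-mcp`):

```typescript
// Compare two strings without exiting early, so the comparison time does
// not leak how much of the secret's prefix matched. A simple sketch; use
// a platform-provided constant-time comparison where one is available.
function timingSafeEqual(a: string, b: string): boolean {
  if (a.length !== b.length) return false;
  let diff = 0;
  for (let i = 0; i < a.length; i++) {
    diff |= a.charCodeAt(i) ^ b.charCodeAt(i);
  }
  return diff === 0;
}

// Check a Bearer token from the Authorization header against the secret
// stored via `wrangler secret put MCP_SECRET`.
function isAuthorized(authHeader: string | null, secret: string): boolean {
  if (!authHeader || !authHeader.startsWith("Bearer ")) return false;
  return timingSafeEqual(authHeader.slice("Bearer ".length), secret);
}
```

In your Worker's `fetch` handler you would call something like `isAuthorized(request.headers.get("Authorization"), env.MCP_SECRET)` and return a `401` response on failure before proxying to your tools.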
Congratulations! You have successfully built and deployed an MCP server using Cloudflare Workers. You can now extend it with more features and expose new tools for AI assistants.
For more details, check the Cloudflare MCP documentation.