A TypeScript implementation of a Model Context Protocol server that provides a frictionless framework for developers to build and deploy AI tools and prompts, focusing on developer experience with zero boilerplate and automatic tool registration.
SYSTEMS ONLINE • NEURAL INTERFACE ACTIVE • COMBAT DATA ANALYSIS • TACTICAL OVERLAY ENABLED • PERFORMANCE METRICS NOMINAL
A production-ready MCP server template for hosting your own AI tools and prompts. Deploy remotely or run locally - built for developers who want to expose their tools to AI models without the infrastructure headaches.
Whether you're prototyping tools locally or deploying a remote server for your team, this template provides everything you need to get started.
The MCP server provides two ways to expose your tools to AI models:
Remote Server Mode (SSE): Deploy as a remote server that multiple clients can connect to (`pnpm start:sse`)

Local Mode (stdio): Run locally for development and testing (`pnpm start`)
Key Features:
While the Model Context Protocol (MCP) is in its early stages, one thing is clear: rapid adoption requires frictionless development. This implementation is built with a singular focus: letting developers focus on building great tools, not infrastructure.
```bash
# Option 1: One-Click Deploy
# Click the "Deploy on Railway" button above ☝️

# Option 2: Local Setup
pnpm install && pnpm dev
```
```typescript
// That's it. This is all you need to create a tool.
export const myTool = new Tool(
  {
    name: "myTool",
    description: "What my tool does",
    inputSchema: z.object({ query: z.string() }),
    outputSchema: z.object({ result: z.string() }),
  },
  async (args) => {
    // Your logic here
    return { result: "Done!" };
  },
);
```
We handle the infrastructure so you can focus on building great tools.
```bash
pnpm install
```
Two modes are available:
```bash
pnpm dev    # Development with hot reload
pnpm start  # Production
```
```bash
pnpm dev:sse    # Development with hot reload
pnpm start:sse  # Production
```
When running in SSE mode, connect to: `http://localhost:3001/sse`
Tools are executable functions that models can invoke. Each tool:
Example tool:
```typescript
import { z } from "zod";
import { Tool } from "../core";

const MyToolInputSchema = z.object({
  param1: z.string().describe("Parameter description"),
});

const MyToolOutputSchema = z.object({
  result: z.string().describe("Result description"),
});

export const myTool = new Tool(
  {
    name: "myTool",
    description: "What my tool does",
    inputSchema: MyToolInputSchema,
    outputSchema: MyToolOutputSchema,
  },
  async (args) => {
    const input = MyToolInputSchema.parse(args);
    // Tool logic here
    return { result: "processed result" };
  },
);
```
Prompts are message generators that help structure model interactions. Each prompt:
Example prompt:
```typescript
import { Prompt } from "../core";

export const myPrompt = new Prompt(
  {
    name: "myPrompt",
    description: "What my prompt does",
    arguments: [
      {
        name: "arg1",
        description: "Argument description",
        required: true,
      },
    ],
  },
  async (args) => {
    return [
      {
        role: "system",
        content: {
          type: "text",
          text: `Generated message using ${args.arg1}`,
        },
      },
    ];
  },
);
```
1. Create your tool in `src/modules/tools/`
2. Export it from `src/modules/tools/index.ts`

The registry will automatically discover and register it with the server.
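To make the barrel-file export step concrete, here is a hedged sketch of how auto-registration from a module's exports could work. The names (`isToolLike`, `registerAll`) are illustrative assumptions, not the template's actual API:

```typescript
// Illustrative sketch: scan everything exported from a barrel file and
// register anything that looks like a tool. The real template's mechanism
// may differ in detail.
type ToolLike = { name: string; description: string };

function isToolLike(value: unknown): value is ToolLike {
  return (
    typeof value === "object" &&
    value !== null &&
    typeof (value as ToolLike).name === "string" &&
    typeof (value as ToolLike).description === "string"
  );
}

function registerAll(moduleExports: Record<string, unknown>): ToolLike[] {
  const registered: ToolLike[] = [];
  for (const exported of Object.values(moduleExports)) {
    if (isToolLike(exported)) {
      // In the real server this would hand the tool to the Registry.
      registered.push(exported);
    }
  }
  return registered;
}
```

Because registration is driven purely by what the barrel file exports, adding a tool never requires touching server wiring code.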
1. Create your prompt in `src/modules/prompts/`
2. Export it from `src/modules/prompts/index.ts`

The registry will automatically discover and register it with the server.
The system uses a singleton Registry pattern, so tools and prompts are registered against a single shared instance.
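A minimal sketch of the singleton Registry idea, assuming an illustrative `Registry` class (the template's real class will differ in shape):

```typescript
// Hypothetical singleton registry: a private constructor plus a static
// accessor guarantees one shared instance for the whole process.
type Registrable = { name: string };

class Registry {
  private static instance: Registry;
  private entries = new Map<string, Registrable>();

  private constructor() {} // prevent `new Registry()` elsewhere

  static getInstance(): Registry {
    if (!Registry.instance) {
      Registry.instance = new Registry();
    }
    return Registry.instance;
  }

  register(item: Registrable): void {
    if (this.entries.has(item.name)) {
      throw new Error(`Duplicate registration: ${item.name}`);
    }
    this.entries.set(item.name, item);
  }

  get(name: string): Registrable | undefined {
    return this.entries.get(name);
  }

  list(): string[] {
    return [...this.entries.keys()];
  }
}
```

The singleton matters here because tools are registered at import time from many modules: every module must land in the same map the server later reads from.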
The system includes robust error handling.
All components use TypeScript for full type safety.
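To illustrate what end-to-end type safety buys you here, this sketch (with assumed names; the template's `Tool` class is the real mechanism) shows how generics can tie a tool's validated input type to its handler signature:

```typescript
// Hypothetical typed-tool sketch: `parse` stands in for inputSchema.parse,
// and the generics guarantee the handler only ever sees validated input.
interface ToolDefinition<I, O> {
  name: string;
  parse: (input: unknown) => I; // throws on invalid input
  handler: (args: I) => Promise<O>;
}

async function runTool<I, O>(
  tool: ToolDefinition<I, O>,
  raw: unknown,
): Promise<O> {
  const args = tool.parse(raw); // invalid input throws before the handler runs
  return tool.handler(args);
}
```

With schema-inferred types (e.g. zod's `z.infer`), the compiler then flags any handler that reads a field the schema doesn't define.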
Run tests using:

```bash
pnpm test
```
NEURAL INTERFACE DETECTED • INITIATING COLLABORATION PROTOCOLS • READY FOR UPLINK
We welcome contributions! Please see our Contributing Guide for details.
Join our Discord community to connect with other contributors!
SUPPORT PROTOCOLS ACTIVE • COMMUNICATION CHANNELS OPEN • READY TO ASSIST
This project is licensed under the MIT License - see the LICENSE file for details.
© 2025 Doug, at WithSeismic dot com.