LLM Responses MCP Server

Public
kstrikis/ephor-mcp-collaboration

Enables collaborative multi-turn debates between AI agents using the Model Context Protocol, supporting participant registration, real-time response sharing, deliberative consensus, and session status tracking for enhanced LLM interaction.

TypeScript
0 tools
May 30, 2025
Updated Jun 4, 2025

LLM Responses MCP Server

A Model Context Protocol (MCP) server that enables collaborative debates between multiple AI agents, allowing them to discuss and reach consensus on user prompts.

Overview

This project implements an MCP server that facilitates multi-turn conversations between LLMs with these key features:

  1. Session-based collaboration - LLMs can register as participants in a debate session
  2. Deliberative consensus - LLMs can engage in extended discussions to reach agreement
  3. Real-time response sharing - All participants can view and respond to each other's contributions

The server provides four tools:

  1. register-participant: Allows an LLM to join a collaboration session with its initial response
  2. submit-response: Allows an LLM to submit follow-up responses during the debate
  3. get-responses: Allows an LLM to retrieve all responses from other LLMs in the session
  4. get-session-status: Allows an LLM to check if the registration waiting period has completed

This enables a scenario where multiple AI agents (like the "Council of Ephors") can engage in extended deliberation about a user's question, debating with each other until they reach a solid consensus.
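The session behavior behind these four tools can be sketched as a small in-memory model. This is a hypothetical illustration, not the server's actual implementation; all names (`DebateSession`, `registerParticipant`, etc.) are invented for the example.

```typescript
interface DebateResponse {
  participant: string;
  text: string;
  timestamp: number; // ms since epoch
}

interface DebateSession {
  prompt: string;
  participants: string[];
  responses: DebateResponse[];
  lastRegistrationAt: number;
}

const REGISTRATION_WINDOW_MS = 3000; // the server waits 3 s after the last join

function createSession(prompt: string, now: number): DebateSession {
  return { prompt, participants: [], responses: [], lastRegistrationAt: now };
}

// register-participant: join the session with an initial response;
// each join restarts the registration waiting period
function registerParticipant(
  s: DebateSession,
  name: string,
  initialResponse: string,
  now: number,
): void {
  s.participants.push(name);
  s.responses.push({ participant: name, text: initialResponse, timestamp: now });
  s.lastRegistrationAt = now;
}

// submit-response: add a follow-up response during the debate
function submitResponse(s: DebateSession, name: string, text: string, now: number): void {
  s.responses.push({ participant: name, text, timestamp: now });
}

// get-responses: all contributions in chronological order
function getResponses(s: DebateSession): DebateResponse[] {
  return [...s.responses].sort((a, b) => a.timestamp - b.timestamp);
}

// get-session-status: has the registration window elapsed?
function registrationClosed(s: DebateSession, now: number): boolean {
  return now - s.lastRegistrationAt >= REGISTRATION_WINDOW_MS;
}
```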

Installation

    # Install dependencies
    bun install

Development

    # Build the TypeScript code
    bun run build

    # Start the server in development mode
    bun run dev

Testing with MCP Inspector

The project includes support for the MCP Inspector, a tool for testing and debugging MCP servers.

    # Run the server with MCP Inspector
    bun run inspect

The inspect script uses npx to run the MCP Inspector, which will launch a web interface in your browser for interacting with your MCP server.

This will allow you to:

  • Explore available tools and resources
  • Test tool calls with different parameters
  • View the server's responses
  • Debug your MCP server implementation

Usage

The server exposes two endpoints:

  • /sse - Server-Sent Events endpoint for MCP clients to connect
  • /messages - HTTP endpoint for MCP clients to send messages

MCP Tools

register-participant

Register as a participant in a collaboration session:

    // Example tool call
    const result = await client.callTool({
      name: 'register-participant',
      arguments: {
        name: 'Socrates',
        prompt: 'What is the meaning of life?',
        initial_response: 'The meaning of life is to seek wisdom through questioning...',
        persona_metadata: { style: 'socratic', era: 'ancient greece' } // Optional
      }
    });

The server waits for a 3-second registration period after the last participant joins before responding. The response includes all participants' initial responses, enabling each LLM to immediately respond to other participants' views when the registration period ends.

submit-response

Submit a follow-up response during the debate:

    // Example tool call
    const result = await client.callTool({
      name: 'submit-response',
      arguments: {
        sessionId: 'EPH4721R-Socrates', // Session ID received after registration
        prompt: 'What is the meaning of life?',
        response: 'In response to Plato, I would argue that...'
      }
    });

get-responses

Retrieve all responses from the debate session:

    // Example tool call
    const result = await client.callTool({
      name: 'get-responses',
      arguments: {
        sessionId: 'EPH4721R-Socrates', // Session ID received after registration
        prompt: 'What is the meaning of life?' // Optional
      }
    });

The response includes all participants' contributions in chronological order.

get-session-status

Check if the registration waiting period has elapsed:

    // Example tool call
    const result = await client.callTool({
      name: 'get-session-status',
      arguments: {
        prompt: 'What is the meaning of life?'
      }
    });
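Because get-session-status only reports whether the waiting period has elapsed, a client typically polls it. A minimal polling helper might look like this (`pollUntil` is a hypothetical name, not part of the server's API; the check callback stands in for a real get-session-status call):

```typescript
// Poll a status check until it reports true or the attempt budget runs out.
async function pollUntil(
  check: () => Promise<boolean>, // e.g. calls get-session-status and inspects the result
  intervalMs: number,
  maxAttempts: number,
): Promise<boolean> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    if (await check()) return true; // registration period has completed
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return false; // gave up before the session became ready
}
```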

Collaborative Debate Flow

  1. LLMs register as participants with their initial responses to the prompt
  2. The server waits 3 seconds after the last registration before sending responses
  3. When the registration period ends, all participants receive the compendium of initial responses from all participants
  4. Participants can then submit follow-up responses, responding to each other's points
  5. The debate continues until the participants reach a consensus or hit a maximum number of rounds
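The flow above can be simulated with stub agents. `Agent` and `runDebate` are hypothetical names invented for this sketch: real participants are LLMs calling the MCP tools, and the consensus check in step 5 is left to them, so this simulation simply runs a fixed number of rounds.

```typescript
type Agent = {
  name: string;
  initial: (prompt: string) => string; // step 1: initial response at registration
  followUp: (prompt: string, others: string[]) => string; // step 4: reply to others
};

function runDebate(prompt: string, agents: Agent[], maxRounds: number): string[][] {
  const rounds: string[][] = [];
  // Steps 1-3: everyone registers; the initial responses form round 0
  rounds.push(agents.map((a) => a.initial(prompt)));
  // Steps 4-5: follow-up rounds, each agent seeing the others' previous responses
  for (let r = 1; r < maxRounds; r++) {
    const prev = rounds[r - 1];
    rounds.push(
      agents.map((a, i) => a.followUp(prompt, prev.filter((_, j) => j !== i))),
    );
  }
  return rounds;
}
```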

License

MIT

Deployment to EC2

This project includes Docker configuration for easy deployment to EC2 or any other server environment.

Prerequisites

  • An EC2 instance running Amazon Linux 2 or Ubuntu
  • Security group configured to allow inbound traffic on port 62887
  • SSH access to the instance

Deployment Steps

  1. Clone the repository to your EC2 instance:

    git clone <repository-url>
    cd <repository-directory>
  2. Make the deployment script executable:

    chmod +x deploy.sh
  3. Run the deployment script:

    ./deploy.sh

The script will:

  • Install Docker and Docker Compose if they're not already installed
  • Build the Docker image
  • Start the container in detached mode
  • Display the public URL where your MCP server is accessible

Manual Deployment

If you prefer to deploy manually:

  1. Build the Docker image:

    docker-compose build
  2. Start the container:

    docker-compose up -d
  3. Verify the container is running:

    docker-compose ps

Accessing the Server

Once deployed, your MCP server will be accessible at:

  • http://<ec2-public-ip>:62887/sse - SSE endpoint
  • http://<ec2-public-ip>:62887/messages - Messages endpoint

Make sure port 62887 is open in your EC2 security group!
