Microservice Control Panel MCP

Public
Chunkys0up7/MCP

Modular Model Context Protocol framework for managing, executing, and monitoring AI model contexts, including LLM prompts, Jupyter notebooks, and Python scripts, via a FastAPI backend and a Streamlit dashboard, with API authentication, health monitoring, and extensible MCP types.

python
0 tools
May 29, 2025
Updated Jun 4, 2025

Supercharge Your AI with Microservice Control Panel MCP

Unlock the full potential of Microservice Control Panel MCP through LangDB's AI Gateway. Get enterprise-grade security, analytics, and seamless integration with zero configuration.

  • Unified API Access
  • Complete Tracing
  • Instant Setup

Model Context Protocol (MCP)

Overview

MCP is a modular framework for managing, executing, and monitoring AI model contexts, including LLM prompts, Jupyter notebooks, and Python scripts. It provides a FastAPI backend and a Streamlit dashboard frontend.

Features

  • Register and manage different types of MCPs (LLM prompts, notebooks, scripts)
  • Execute MCPs and view results in a web UI
  • Monitor server health and statistics
  • Extensible for new MCP types

Setup

Prerequisites

  • Python 3.9+
  • (Recommended) Create and activate a virtual environment

Install dependencies

pip install -r requirements.txt

Environment Variables

  • Set MCP_API_KEY for API authentication (optional, defaults provided)
  • For LLMs, set ANTHROPIC_API_KEY if using Claude (see the example .env sketch after this list)
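
Both variables can be exported in the shell or kept in a .env file (the notebook section below also reads ANTHROPIC_API_KEY from a .env file). A minimal sketch with placeholder values only:

# .env (placeholder values, substitute your own keys)
MCP_API_KEY=your-mcp-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key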

Start the backend

uvicorn mcp.api.main:app --reload

Start the frontend

streamlit run mcp/ui/app.py

Usage

  • Access the dashboard at http://localhost:8501
  • Create, manage, and test MCPs from the UI
  • Monitor health and stats from the sidebar

Adding New MCPs

  • Implement a new MCP class in mcp/core/ (a hedged sketch follows this list)
  • Register it in the backend
  • Add UI support in mcp/ui/app.py
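
The exact base-class interface is defined in mcp/core/ and is not reproduced here, so the following is only a rough sketch of what a new MCP type might look like. The import path, base-class name, type_name attribute, and execute signature are assumptions; check mcp/core/ for the real definitions before copying this.

# Hypothetical sketch only: base class, import path, and method names are assumptions.
import subprocess

from mcp.core.base import BaseMCP  # assumed location of the base class


class ShellCommandMCP(BaseMCP):
    """Example MCP type that runs a shell command and returns its output."""

    type_name = "shell_command"  # assumed registration key used by the backend

    def execute(self, config: dict) -> dict:
        result = subprocess.run(
            config["command"], shell=True, capture_output=True, text=True
        )
        return {
            "stdout": result.stdout,
            "stderr": result.stderr,
            "returncode": result.returncode,
        }

Once the class exists, it still needs to be registered in the backend and given a form in mcp/ui/app.py, as listed above.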

Running Tests

pytest
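
As an illustration of what a minimal test against the backend could look like, the sketch below uses FastAPI's TestClient with the app imported from mcp.api.main (the same path used by the uvicorn command above). The X-API-Key header name and the use of MCP_API_KEY as the accepted key are assumptions about the project's auth scheme.

# Illustrative sketch, not part of the existing suite; header name is an assumption.
import os

os.environ.setdefault("MCP_API_KEY", "test-key")  # assumed to be read at startup

from fastapi.testclient import TestClient

from mcp.api.main import app


def test_health_endpoint():
    client = TestClient(app)
    response = client.get("/health", headers={"X-API-Key": os.environ["MCP_API_KEY"]})
    assert response.status_code == 200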

Project Structure

  • mcp/api/ - FastAPI backend
  • mcp/ui/ - Streamlit frontend
  • mcp/core/ - Core MCP types and logic
  • tests/ - Test suite

License

MIT

API Documentation

Once the server is running, you can access the following endpoints (a quick check script follows the list):

  • API documentation: http://localhost:8000/docs
  • Prometheus metrics: http://localhost:8000/metrics
  • Health check: http://localhost:8000/health
  • Statistics: http://localhost:8000/stats
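
The health and statistics endpoints can also be polled from a short script. The sketch below uses only the standard library; the X-API-Key header name is an assumption and may differ from the project's actual auth scheme.

# check_endpoints.py: illustrative only; the header name is an assumption.
import json
import os
import urllib.request

BASE_URL = "http://localhost:8000"
HEADERS = {"X-API-Key": os.getenv("MCP_API_KEY", "")}

for path in ("/health", "/stats"):
    request = urllib.request.Request(BASE_URL + path, headers=HEADERS)
    with urllib.request.urlopen(request) as response:
        print(path, json.loads(response.read()))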

Security

  • API key authentication is required for all endpoints
  • Rate limiting is enabled by default
  • CORS is configured to allow only specific origins (see the generic sketch after this list)
  • All sensitive configuration is managed through environment variables
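
The CORS restriction is the kind of rule FastAPI expresses with its CORSMiddleware. The snippet below is a generic sketch of that pattern rather than the project's actual configuration; the MCP_ALLOWED_ORIGINS variable name and the Streamlit default origin are assumptions.

# Generic FastAPI CORS pattern; illustrative, not the project's actual code.
import os

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()
allowed_origins = os.getenv("MCP_ALLOWED_ORIGINS", "http://localhost:8501").split(",")

app.add_middleware(
    CORSMiddleware,
    allow_origins=allowed_origins,  # only the listed origins are accepted
    allow_methods=["*"],
    allow_headers=["*"],
)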

Monitoring

The server includes:

  • Prometheus metrics for request counts, latencies, and server executions (see the sketch after this list)
  • Structured JSON logging
  • Health check endpoint
  • Server statistics endpoint
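
The /metrics endpoint serves the standard Prometheus text format, so it can be inspected directly while the server runs. A short sketch (the X-API-Key header name is again an assumption):

# Print request-related metrics from the Prometheus endpoint; illustrative only.
import os
import urllib.request

request = urllib.request.Request(
    "http://localhost:8000/metrics",
    headers={"X-API-Key": os.getenv("MCP_API_KEY", "")},  # header name is an assumption
)
with urllib.request.urlopen(request) as response:
    for line in response.read().decode().splitlines():
        if "request" in line and not line.startswith("#"):
            print(line)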

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Push to the branch
  5. Create a Pull Request

Additional Dependencies for Notebook and LLM Integration

This project now requires the following additional Python packages:

  • pandas
  • numpy
  • matplotlib
  • papermill
  • nbformat
  • jupyter
  • anthropic

Install all dependencies with:

pip install -r requirements.txt

Using the Notebook MCP to Call an LLM (Claude)

The example notebook (mcp/notebooks/example.ipynb) demonstrates:

  • Data analysis and plotting
  • Calling the Claude LLM via the anthropic Python package

To use the LLM cell, ensure you have set your ANTHROPIC_API_KEY in your environment or .env file.

The notebook cell for LLM looks like this:

import os

import anthropic

api_key = os.getenv('ANTHROPIC_API_KEY')
if not api_key:
    raise ValueError('ANTHROPIC_API_KEY not set in environment!')

client = anthropic.Anthropic(api_key=api_key)
response = client.messages.create(
    model='claude-3-sonnet-20240229',
    max_tokens=256,
    temperature=0.7,
    messages=[
        {'role': 'user', 'content': 'Tell me a joke about data science.'}
    ]
)
print('Claude says:', response.content[0].text)
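
To run the example notebook end to end outside the dashboard, papermill (listed in the additional dependencies above) can execute it from a script. A minimal sketch; the output path is arbitrary, and ANTHROPIC_API_KEY must be set for the LLM cell to succeed.

# Execute the example notebook headlessly with papermill; illustrative sketch.
import papermill as pm

pm.execute_notebook(
    "mcp/notebooks/example.ipynb",          # input notebook shipped with the project
    "mcp/notebooks/example_output.ipynb",   # arbitrary output path
)
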
Related MCPs (5)

  • mcp-server-data-exploration
    Interactive Model Context Protocol server enabling seamless data exploration by loading CSV datasets...
    2 tools
    Added May 30, 2025
  • MCP Server for Apache Jena
    Connect AI agents via the Model Context Protocol to Apache Jena for executing and updating SPARQL qu...
    Added May 30, 2025
  • mcphub
    Manage and scale multiple Model Context Protocol (MCP) servers with a centralized dashboard, real-ti...
    Added May 30, 2025
  • MCP Everything
    Demonstrates comprehensive Model Context Protocol capabilities including prompts, tools, resources, ...
    Added May 30, 2025
  • FastMCP
    Lightweight Model Context Protocol server enabling creation, retrieval, updating, deletion, and quer...
    Added May 30, 2025