Microservice Control Panel MCP

Chunkys0up7/MCP

A modular system for building and orchestrating AI applications through microservices, featuring LLM interactions, Jupyter notebook execution, and visual workflow capabilities.

Added May 29, 2025
Updated May 30, 2025

Model Context Protocol (MCP)

Overview

MCP is a modular framework for managing, executing, and monitoring AI model contexts, including LLM prompts, Jupyter notebooks, and Python scripts. It provides a FastAPI backend and a Streamlit dashboard frontend.

Features

  • Register and manage different types of MCPs (LLM prompts, notebooks, scripts)
  • Execute MCPs and view results in a web UI
  • Monitor server health and statistics
  • Extensible for new MCP types

Setup

Prerequisites

  • Python 3.9+
  • (Recommended) Create and activate a virtual environment

Install dependencies

pip install -r requirements.txt

Environment Variables

  • Set MCP_API_KEY for API authentication (optional, defaults provided)
  • For LLMs, set ANTHROPIC_API_KEY if using Claude
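Both variables can be read with standard-library calls. The helper below is a sketch; the function name and the development default shown are illustrative, not taken from the codebase:

```python
import os

# Illustrative helper: read an optional setting, falling back to a default.
# The default value here is hypothetical -- the server ships its own.
def get_setting(name, default=None):
    return os.getenv(name, default)

api_key = get_setting("MCP_API_KEY", "dev-default-key")   # optional
anthropic_key = get_setting("ANTHROPIC_API_KEY")          # needed only for Claude
```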

Start the backend

uvicorn mcp.api.main:app --reload

Start the frontend

streamlit run mcp/ui/app.py

Usage

  • Access the dashboard at http://localhost:8501
  • Create, manage, and test MCPs from the UI
  • Monitor health and stats from the sidebar

Adding New MCPs

  • Implement a new MCP class in mcp/core/
  • Register it in the backend
  • Add UI support in mcp/ui/app.py
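As a sketch of the pattern behind these steps (the base-class name and method signature are assumptions; check mcp/core/ for the real interface), a new MCP type might look like:

```python
from abc import ABC, abstractmethod
from typing import Any

class BaseMCP(ABC):
    """Stand-in for the real base class in mcp/core/ (name assumed)."""

    @abstractmethod
    def execute(self, inputs: dict[str, Any]) -> dict[str, Any]:
        ...

class EchoMCP(BaseMCP):
    """Trivial example type: returns its inputs unchanged."""

    def execute(self, inputs: dict[str, Any]) -> dict[str, Any]:
        return {"result": inputs}
```

The new class would then be registered with the backend and given a form in mcp/ui/app.py, per the steps above.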

Running Tests

pytest

Project Structure

  • mcp/api/ - FastAPI backend
  • mcp/ui/ - Streamlit frontend
  • mcp/core/ - Core MCP types and logic
  • tests/ - Test suite

License

MIT

API Documentation

Once the server is running, you can access:

  • API documentation: http://localhost:8000/docs
  • Prometheus metrics: http://localhost:8000/metrics
  • Health check: http://localhost:8000/health
  • Statistics: http://localhost:8000/stats

Security

  • API key authentication is required for all endpoints
  • Rate limiting is enabled by default
  • CORS is configured to allow only specific origins
  • All sensitive configuration is managed through environment variables
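A client therefore needs to attach the API key on every call. The snippet below builds such a request with the standard library; the X-API-Key header name is an assumption, not confirmed from the source (check the FastAPI dependency in mcp/api/ for the exact header):

```python
from urllib.request import Request, urlopen

BASE_URL = "http://localhost:8000"

def build_request(path: str, api_key: str) -> Request:
    # Header name "X-API-Key" is assumed, not taken from the codebase.
    return Request(f"{BASE_URL}{path}", headers={"X-API-Key": api_key})

req = build_request("/health", "my-secret-key")
# urlopen(req) would perform the call against a running server.
```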

Monitoring

The server includes:

  • Prometheus metrics for request counts, latencies, and server executions
  • Structured JSON logging
  • Health check endpoint
  • Server statistics endpoint
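The /metrics output uses the Prometheus text exposition format, which is plain text and easy to inspect by hand. A minimal sketch of parsing one sample line (the metric name shown is illustrative, not taken from the server):

```python
def parse_metric(line: str) -> tuple[str, float]:
    """Split a Prometheus text-format sample into (name+labels, value)."""
    name, value = line.rsplit(" ", 1)
    return name, float(value)

sample = 'http_requests_total{path="/health",method="GET"} 42'
name, value = parse_metric(sample)
```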

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Push to the branch
  5. Create a Pull Request

Additional Dependencies for Notebook and LLM Integration

This project now requires the following additional Python packages:

  • pandas
  • numpy
  • matplotlib
  • papermill
  • nbformat
  • jupyter
  • anthropic

Install all dependencies with:

pip install -r requirements.txt

Using the Notebook MCP to Call an LLM (Claude)

The example notebook (mcp/notebooks/example.ipynb) demonstrates:

  • Data analysis and plotting
  • Calling the Claude LLM via the anthropic Python package

To use the LLM cell, ensure you have set your ANTHROPIC_API_KEY in your environment or .env file.

The notebook cell for LLM looks like this:

import os
import anthropic

api_key = os.getenv('ANTHROPIC_API_KEY')
if not api_key:
    raise ValueError('ANTHROPIC_API_KEY not set in environment!')

client = anthropic.Anthropic(api_key=api_key)
response = client.messages.create(
    model='claude-3-sonnet-20240229',
    max_tokens=256,
    temperature=0.7,
    messages=[
        {'role': 'user', 'content': 'Tell me a joke about data science.'}
    ]
)
print('Claude says:', response.content[0].text)