A collection of server monitoring and inspection tools that provides remote utilities for checking network interfaces, service statuses, and firewall configurations.
ops-mcp-server: an AI-driven IT operations platform that fuses LLMs and the MCP architecture to enable intelligent monitoring, anomaly detection, and natural human-infrastructure interaction with enterprise-grade security and scalability.
ops-mcp-server is an IT operations management solution for the AI era, built on the seamless integration of the Model Context Protocol (MCP) and Large Language Models (LLMs). By combining the reasoning power of LLMs with MCP's distributed architecture, it turns traditional IT operations into an AI-driven experience: automated server monitoring, intelligent anomaly detection, and context-aware troubleshooting. The system acts as a bridge between human operators and complex IT infrastructure, providing natural language interaction for tasks ranging from routine maintenance to complex problem diagnosis, while maintaining enterprise-grade security and scalability.
Demo screenshots: on Cherry Studio and on the terminal.
Ensure you have Python 3.10+ installed. This project uses `uv` for dependency and environment management.
```shell
curl -LsSf https://astral.sh/uv/install.sh | sh
```
```shell
uv venv .venv

# Activate the environment
source .venv/bin/activate    # Linux/macOS
.\.venv\Scripts\activate     # Windows
```
```shell
uv pip install -r requirements.txt
```
Dependencies are managed via `pyproject.toml`.
```shell
cd server_monitor_sse

# Install dependencies
pip install -r requirements.txt

# Start service
cd ..
uv run server_monitor_sse --transport sse --port 8000
```
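With the SSE transport, the server streams events to clients in the standard `text/event-stream` wire format. Purely as an illustration of that format (the event names and payloads below are invented, not the server's actual output), a stream can be split into records like this:

```python
def parse_sse(stream: str) -> list[dict]:
    """Split a text/event-stream payload into {event, data} records."""
    events = []
    for block in stream.strip().split("\n\n"):
        record = {"event": "message", "data": ""}  # "message" is the SSE default event type
        for line in block.splitlines():
            if line.startswith("event:"):
                record["event"] = line[len("event:"):].strip()
            elif line.startswith("data:"):
                record["data"] += line[len("data:"):].strip()
        events.append(record)
    return events

# Hypothetical payload for illustration only
raw = 'event: status\ndata: {"cpu": 12}\n\nevent: status\ndata: {"cpu": 15}\n'
for ev in parse_sse(raw):
    print(ev["event"], ev["data"])
```

MCP clients handle this parsing for you; the sketch only shows what travels over the wire on port 8000.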
Ensure Docker and Docker Compose are installed.
```shell
cd server_monitor_sse
docker compose up -d

# Check status
docker compose ps

# Monitor logs
docker compose logs -f
```
Add this configuration to your MCP settings:
```json
{
  "ops-mcp-server": {
    "command": "uv",
    "args": [
      "--directory",
      "YOUR_PROJECT_PATH_HERE",
      "run",
      "server_monitor.py"
    ],
    "env": {},
    "disabled": true,
    "autoApprove": ["list_available_tools"]
  },
  "network_tools": {
    "command": "uv",
    "args": [
      "--directory",
      "YOUR_PROJECT_PATH_HERE",
      "run",
      "network_tools.py"
    ],
    "env": {},
    "disabled": false,
    "autoApprove": []
  }
}
```
Note: Replace `YOUR_PROJECT_PATH_HERE` with your project's actual path.
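If you script the setup, the substitution can be done programmatically. The helper below is a hypothetical sketch (not part of this project) that fills the placeholder in a settings template and parses the result, which also catches JSON mistakes such as trailing commas:

```python
import json

def fill_project_path(template: str, project_path: str) -> dict:
    """Substitute the project path into an MCP settings template and parse it.

    Hypothetical helper for illustration; write the result to wherever your
    MCP client keeps its settings.
    """
    return json.loads(template.replace("YOUR_PROJECT_PATH_HERE", project_path))

template = '''{
  "ops-mcp-server": {
    "command": "uv",
    "args": ["--directory", "YOUR_PROJECT_PATH_HERE", "run", "server_monitor.py"],
    "env": {},
    "disabled": true,
    "autoApprove": ["list_available_tools"]
  }
}'''

# Example path; use your real checkout location instead
config = fill_project_path(template, "/home/ops/ops-mcp-server")
print(config["ops-mcp-server"]["args"][1])
```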
An interactive client (`client.py`) allows you to interact with MCP services using natural language.
```shell
uv pip install openai rich
```
Edit these configurations within `client.py`:

```python
# Initialize OpenAI client
self.client = AsyncOpenAI(
    base_url="https://your-api-endpoint",
    api_key="YOUR_API_KEY"
)

# Set model
self.model = "your-preferred-model"
```
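On each turn, a client like this typically sends the system prompt, the prior conversation, and the new user message to the configured model. The sketch below is a simplified, hypothetical version of that request-building step (the system prompt and the internals of the real `client.py` are assumptions):

```python
def build_messages(history: list[dict], user_input: str,
                   system_prompt: str = "You are an IT operations assistant.") -> list[dict]:
    """Assemble the OpenAI-style message list sent on one chat turn."""
    return ([{"role": "system", "content": system_prompt}]
            + history
            + [{"role": "user", "content": user_input}])

# Prior turns accumulate in history; each call appends the new user message
history = [{"role": "user", "content": "check disk usage"},
           {"role": "assistant", "content": "Disk usage is at 41%."}]
msgs = build_messages(history, "and memory?")
print(len(msgs), msgs[-1]["content"])
```

The resulting list is what would be passed as `messages=` to `client.chat.completions.create(...)`.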
```shell
uv run client.py [path/to/server.py]
```
Example:

```shell
uv run client.py ./server_monitor.py
```
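Inside the client, each line of input is routed either to a built-in command or to the model. A hypothetical sketch of such a dispatch loop (the real `client.py` may structure this differently):

```python
def dispatch(user_input: str, history: list[str]) -> str:
    """Route one line of input to a built-in command or forward it to the LLM."""
    cmd = user_input.strip().lower()
    if cmd == "help":
        return "commands: help, quit, clear, model"
    if cmd == "quit":
        return "bye"
    if cmd == "clear":
        history.clear()            # drop prior conversation turns
        return "history cleared"
    if cmd.startswith("model"):
        return "switching model"   # real client would prompt for a model name
    history.append(user_input)     # anything else goes to the LLM with history
    return "forwarded to LLM"

history: list[str] = []
print(dispatch("help", history))
print(dispatch("check firewall status", history))
```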
- `help` - Display help.
- `quit` - Exit the client.
- `clear` - Clear conversation history.
- `model` - Switch models.

This project is licensed under the MIT License.