MCP Kubernetes Server

Public
abhijeetka/mcp-k8s-server

Provides seamless Kubernetes cluster management through Model Context Protocol, enabling natural language interactions with LLMs to perform deployments, scaling, monitoring, and resource control via type-safe, context-aware commands.

python
0 tools
May 29, 2025
Updated Jun 4, 2025


MCP Kubernetes Server

This is an MCP (Model Context Protocol) server that provides control over Kubernetes clusters through natural-language interactions with LLMs.

Overview

This server lets you perform common Kubernetes operations through MCP tools. It wraps kubectl commands behind a simple interface for managing Kubernetes resources, and the Model Context Protocol (MCP) lets language models invoke those operations in a structured way.
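
As the overview notes, the tools wrap kubectl. A minimal sketch of that wrapping, assuming a subprocess-based design (function names here are illustrative, not the project's actual API):

```python
import shlex
import subprocess

def build_kubectl_args(verb, resource, name=None, namespace=None, extra=()):
    """Assemble a kubectl argument list; a pure function, so it is easy to test."""
    args = ["kubectl", verb, resource]
    if name:
        args.append(name)
    if namespace:
        args += ["-n", namespace]
    args += list(extra)
    return args

def run_kubectl(args):
    """Execute the assembled command and return its stdout (raises on failure)."""
    result = subprocess.run(args, capture_output=True, text=True, check=True)
    return result.stdout

# The argv behind "get me the pods in the production namespace":
print(shlex.join(build_kubectl_args("get", "pods", namespace="production")))
# kubectl get pods -n production
```

Separating command construction from execution keeps the kubectl plumbing testable without a live cluster.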

What is MCP?

Model Context Protocol (MCP) is a framework that enables Language Models to interact with external tools and services in a structured way. It provides:

  • A standardized way to expose functionality to language models
  • Context management for operations
  • Tool discovery and documentation
  • Type-safe interactions between models and tools
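
The registration, discovery, and type-safety ideas above can be sketched with a stand-in decorator. The real server uses @mcp.tool() from the MCP SDK; this simplified registry only illustrates the same mechanics:

```python
import typing

TOOLS = {}  # name -> (function, type-hint map): a stand-in tool registry

def tool(fn):
    """Stand-in for @mcp.tool(): records the function and its type hints
    so a model can discover the tool and validate argument types."""
    TOOLS[fn.__name__] = (fn, typing.get_type_hints(fn))
    return fn

@tool
def scale_deployment(name: str, replicas: int, namespace: str = "default") -> str:
    """Scale a deployment to the given replica count."""
    return f"kubectl scale deployment {name} --replicas={replicas} -n {namespace}"

# Tool discovery: names, docstrings, and parameter types are all introspectable.
fn, hints = TOOLS["scale_deployment"]
print(fn.__doc__.strip())        # Scale a deployment to the given replica count.
print(hints["replicas"] is int)  # True
```

The type hints are what make interactions "type-safe": the framework can reject a call that passes, say, a string where `replicas: int` is expected before anything touches the cluster.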

Usage Examples

  • Create a new deployment for me with name nginx-app and image nginx:latest in the production namespace with 3 replicas.
  • Update the deployment nginx-app to version 1.19 in the production namespace.
  • Scale the deployment nginx-app to 5 replicas in the production namespace.
  • Get me the pods in the production namespace.
  • Get me all namespaces in the cluster.
  • Get me all nodes in the cluster.
  • Get me all services in the cluster.
  • Get me all deployments in the cluster.
  • Get me all jobs in the cluster.
  • Get me all cronjobs in the cluster.
  • Get me all statefulsets in the cluster.
  • Get me all daemonsets in the cluster.
  • What is the current context?
  • List all contexts.
  • Switch to context .
  • Get me the logs of pod in the production namespace.
  • Get me the events in the production namespace.
  • Annotate pod with key1=value1 in the production namespace.
  • Remove annotation key1 from pod in the production namespace.
  • Add label key1=value1 to pod in the production namespace.
  • Remove label key1 from pod in the production namespace.
  • Expose deployment nginx-app in the production namespace on port 80.
  • Port-forward pod, deployment, or service with name in the production namespace to local port 8080.
  • Delete pod, deployment, service, job, cronjob, statefulset, or daemonset with name in the production namespace.
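
Each request above ultimately reduces to an ordinary kubectl invocation. A sketch of two such mappings (argument handling is illustrative; the real tools may use different flags):

```python
def expose_cmd(name, port, namespace):
    """Argv for: expose deployment <name> in <namespace> on port <port>."""
    return ["kubectl", "expose", "deployment", name, f"--port={port}", "-n", namespace]

def port_forward_cmd(kind, name, local_port, remote_port, namespace):
    """Argv for: port-forward <kind>/<name> in <namespace> to a local port."""
    return ["kubectl", "port-forward", f"{kind}/{name}",
            f"{local_port}:{remote_port}", "-n", namespace]

print(" ".join(expose_cmd("nginx-app", 80, "production")))
# kubectl expose deployment nginx-app --port=80 -n production
```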

Upcoming Features

  • Create cluster role.
  • Delete cluster role.
  • Create cluster role binding.
  • Delete cluster role binding.
  • Create namespace.
  • Delete namespace.
  • Create service account.
  • Delete service account.
  • Create role.
  • Delete role.
  • Create role binding.
  • Delete role binding.

LLM Integration

This MCP server is designed to work seamlessly with Large Language Models (LLMs). Its functions are decorated with @mcp.tool(), making them discoverable and callable by LLMs through the Model Context Protocol framework.

Example LLM Prompts

LLMs can interact with your Kubernetes cluster using natural language. Here are some example prompts:

  • "Create a new nginx deployment with 3 replicas in the production namespace"
  • "Scale the nginx-app deployment to 5 replicas"
  • "Update the image of nginx-app to version 1.19"

The LLM will interpret these natural language requests and call the appropriate MCP functions with the correct parameters.
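
That round trip can be sketched end to end: the model emits a structured tool call, and the server dispatches it to the matching registered function. The payload shape and function names below are illustrative:

```python
import json

def create_deployment(name: str, image: str, namespace: str, replicas: int) -> str:
    """Hypothetical tool body: builds the kubectl command for a new deployment."""
    return (f"kubectl create deployment {name} --image={image} "
            f"--replicas={replicas} -n {namespace}")

REGISTRY = {"create_deployment": create_deployment}

# Roughly what the model emits for the first prompt above:
call = json.loads('{"tool": "create_deployment", "arguments": '
                  '{"name": "nginx-app", "image": "nginx:latest", '
                  '"namespace": "production", "replicas": 3}}')

result = REGISTRY[call["tool"]](**call["arguments"])
print(result)
# kubectl create deployment nginx-app --image=nginx:latest --replicas=3 -n production
```

The model never runs kubectl itself; it only names a tool and supplies typed arguments, and the server performs the actual operation.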

Benefits of LLM Integration

  1. Natural Language Interface: Manage Kubernetes resources using conversational language
  2. Reduced Command Complexity: No need to remember exact kubectl syntax
  3. Error Prevention: LLMs can validate inputs and provide helpful error messages
  4. Context Awareness: LLMs can maintain context across multiple operations
  5. Structured Interactions: MCP ensures type-safe and documented interactions between LLMs and tools

Requirements

  • Kubernetes cluster access configured via kubectl
  • Python 3.x
  • MCP framework installed and configured

Security Note

When using this client with LLMs, ensure that:

  • Proper access controls are in place for your Kubernetes cluster
  • The MCP server is running in a secure environment
  • API access is properly authenticated and authorized

Usage with Claude Desktop

{
    "mcpServers": {
        "Kubernetes": {
            "command": "uv",
            "args": [
                "--directory",
                "~/mcp/mcp-k8s-server",
                "run",
                "kubernetes.py"
            ]
        }
    }
}

Contributing

We welcome contributions to the MCP Kubernetes Server! If you'd like to contribute:

  1. Fork the repository
  2. Create a new branch for your feature (git checkout -b feature/amazing-feature)
  3. Make your changes
  4. Write or update tests as needed
  5. Commit your changes (git commit -m 'Add some amazing feature')
  6. Push to your branch (git push origin feature/amazing-feature)
  7. Open a Pull Request

For major changes, please open an issue first to discuss what you would like to change.

Installing via Smithery

To install Kubernetes Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @abhijeetka/mcp-k8s-server --client claude