A custom Model Context Protocol (MCP) server implementation that gives Claude Desktop and other LLM clients access to file system operations and command execution through standardized tool interfaces.
The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to Large Language Models (LLMs). Much like a USB-C port provides a standardized way to connect devices to various peripherals, MCP provides a standardized way to connect AI models to different data sources and tools.
This project implements a FastMCP server with several useful tools that enable Claude and other LLMs to interact with your local file system and execute commands. It extends LLMs' capabilities with local system access in a controlled way through well-defined tool interfaces.
The MCP server provides the following file system and command execution tools:

- `execute_shell_command` – run shell commands as argument lists
- `show_file` – view file contents, optionally by line range
- `search_in_file` – search files with regular expressions
- `edit_file` – make precise text and line-based edits
- `write_file` – write or append content to files
- `fetch_page` – fetch a web page and convert it to Markdown
MCP follows a client-server architecture:

- **Hosts** (such as Claude Desktop) run MCP clients
- **Clients** maintain one-to-one connections with servers
- **Servers** (like this project) expose tools, resources, and prompts to clients
Run `uv run mcp install` to install the MCP server. Use `which uv` to get an absolute path to the `uv` executable. My MCP server configuration looks like this:
```json
{
  "globalShortcut": "",
  "mcpServers": {
    "zbigniew-mcp": {
      "command": "/Users/zbigniewtomanek/.local/bin/uv",
      "args": [
        "run",
        "--with",
        "mcp[cli]",
        "--with",
        "marker-pdf",
        "mcp",
        "run",
        "/Users/zbigniewtomanek/PycharmProjects/my-mcp-tools/server.py"
      ]
    }
  }
}
```
Note: While this implementation focuses on Claude Desktop, MCP is designed to work with any compatible tool or LLM client, providing flexibility in implementation and integration.
Execute shell commands safely using a list of arguments:
```python
execute_shell_command(["ls", "-la"])
execute_shell_command(["grep", "-r", "TODO", "./src"])
execute_shell_command(["python", "analysis.py", "--input", "data.csv"])
execute_shell_command(["uname", "-a"])
```
View file contents with optional line range specification:
```python
show_file("/path/to/file.txt")
show_file("/path/to/file.txt", num_lines=10)
show_file("/path/to/file.txt", start_line=5, num_lines=10)
```
Search for patterns in files using regular expressions:
```python
search_in_file("/path/to/script.py", r"def\s+\w+\s*\(")
search_in_file("/path/to/code.py", r"#\s*TODO", case_sensitive=False)
```
Make precise changes to files:
```python
# Replace text
edit_file("config.json", replacements={'"debug": false': '"debug": true'})

# Insert at line 5
edit_file("script.py", line_operations=[{"operation": "insert", "line": 5, "content": "# New comment"}])

# Delete lines 10-15
edit_file("file.txt", line_operations=[{"operation": "delete", "start_line": 10, "end_line": 15}])

# Replace line 20
edit_file("file.txt", line_operations=[{"operation": "replace", "line": 20, "content": "Updated content"}])
```
Write or append content to files:
```python
# Overwrite file
write_file("/path/to/file.txt", "New content")

# Append to file
write_file("/path/to/log.txt", "Log entry", mode="a")
```
Fetch a web page, render it to a PDF (requires Chromium to be installed), and then parse it to Markdown using local LLMs:
```python
fetch_page("https://example.com")
```
MCP supports multiple transport methods for communication between clients and servers. This implementation uses a local MCP server that communicates via standard input/output (stdio).
You can easily extend this MCP server by adding new tools with the `@mcp.tool` decorator. Follow the pattern in `server.py` to create new tools that expose additional functionality to your LLM clients.
The MCP server provides Claude with access to your local system. Be mindful of which commands you approve and which files the tools are allowed to read and modify: everything runs with your user account's permissions.