LMStudio-MCP

Public · infinitimeless/LMStudio-MCP · Verified · Python · 0 tools · Updated May 30, 2025

A Model Context Protocol (MCP) server that allows Claude to communicate with locally running LLM models via LM Studio.

Overview

LMStudio-MCP creates a bridge between Claude (with MCP capabilities) and your locally running LM Studio instance. This allows Claude to:

  • Check the health of your LM Studio API
  • List available models
  • Get the currently loaded model
  • Generate completions using your local models

This enables you to leverage your own locally running models through Claude's interface, combining Claude's capabilities with your private models.

Prerequisites

  • Python 3.7+
  • LM Studio installed and running locally with a model loaded
  • Claude with MCP access
  • Required Python packages (see Installation)

Installation

  1. Clone this repository:

    git clone https://github.com/infinitimeless/LMStudio-MCP.git
    cd LMStudio-MCP
  2. Install the required packages:

    pip install requests "mcp[cli]" openai

MCP Configuration

For Claude to connect to this bridge, you need to configure the MCP settings properly. You can either:

  1. Use directly from GitHub:

    { "lmstudio-mcp": { "command": "uvx", "args": [ "https://github.com/infinitimeless/LMStudio-MCP" ] } }
  2. Use local installation:

    { "lmstudio-mcp": { "command": "/bin/bash", "args": [ "-c", "cd /path/to/LMStudio-MCP && source venv/bin/activate && python lmstudio_bridge.py" ] } }

For detailed MCP configuration instructions, see MCP_CONFIGURATION.md.
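
In Claude Desktop, MCP server entries like the ones above typically sit under a top-level "mcpServers" key in claude_desktop_config.json. As an illustrative assumption (see MCP_CONFIGURATION.md for the authoritative layout), the GitHub-based option might be registered like this:

    {
      "mcpServers": {
        "lmstudio-mcp": {
          "command": "uvx",
          "args": [
            "https://github.com/infinitimeless/LMStudio-MCP"
          ]
        }
      }
    }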

Usage

  1. Start LM Studio and make sure its local server is running on port 1234 (the default); a quick way to verify this is shown after these steps

  2. Load a model in LM Studio

  3. If running locally (not using uvx), run the LMStudio-MCP server:

    python lmstudio_bridge.py
  4. In Claude, connect to the MCP server when prompted by selecting "lmstudio-mcp"
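
Before connecting Claude, you can confirm that LM Studio's OpenAI-compatible server is reachable by querying its models endpoint (assuming the default port):

    curl http://localhost:1234/v1/models

A JSON list of models means the server is up; a connection error usually means the local server has not been started in LM Studio.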

Available Functions

The bridge provides the following functions:

  • health_check(): Verify if LM Studio API is accessible
  • list_models(): Get a list of all available models in LM Studio
  • get_current_model(): Identify which model is currently loaded
  • chat_completion(prompt, system_prompt, temperature, max_tokens): Generate text from your local model
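
For reference, chat_completion maps onto LM Studio's OpenAI-compatible HTTP API. The snippet below is a minimal sketch of an equivalent request made directly with requests, not the bridge's actual implementation; the model name and prompts are placeholders:

    import requests

    # Roughly what chat_completion(prompt, system_prompt, temperature, max_tokens)
    # amounts to against LM Studio's OpenAI-compatible endpoint (default port 1234).
    payload = {
        "model": "local-model",  # placeholder; LM Studio answers with the loaded model
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Summarize MCP in one sentence."},
        ],
        "temperature": 0.7,
        "max_tokens": 256,
    }
    response = requests.post("http://localhost:1234/v1/chat/completions", json=payload)
    response.raise_for_status()
    print(response.json()["choices"][0]["message"]["content"])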

Known Limitations

  • Some models (e.g., phi-3.5-mini-instruct_uncensored) may have compatibility issues
  • The bridge currently uses only the OpenAI-compatible API endpoints of LM Studio
  • Model responses will be limited by the capabilities of your locally loaded model

Troubleshooting

API Connection Issues

If Claude reports 404 errors when trying to connect to LM Studio:

  • Ensure LM Studio is running and has a model loaded
  • Check that LM Studio's server is running on port 1234
  • Verify your firewall isn't blocking the connection
  • Try using "127.0.0.1" instead of "localhost" in the API URL if issues persist
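
A small diagnostic like the following (a sketch, assuming the default port) can help narrow the failure down to the hostname, the port, or a missing model:

    import requests

    # Try both hostnames; on some systems "localhost" resolves to IPv6 first,
    # which the LM Studio server may not be listening on.
    for host in ("localhost", "127.0.0.1"):
        url = f"http://{host}:1234/v1/models"
        try:
            r = requests.get(url, timeout=5)
            models = r.json().get("data", [])
            print(f"{url} -> HTTP {r.status_code}, {len(models)} model(s) available")
        except requests.RequestException as exc:
            print(f"{url} -> connection failed: {exc}")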

Model Compatibility

If certain models don't work correctly:

  • Some models might not fully support the OpenAI chat completions API format
  • Try different parameter values (temperature, max_tokens) for problematic models
  • Consider switching to a more compatible model if problems persist

For more detailed troubleshooting help, see TROUBLESHOOTING.md.

License

MIT

Acknowledgements

This project was originally developed as "Claude-LMStudio-Bridge_V2" and has been renamed and open-sourced as "LMStudio-MCP".

Related MCPs (5)

  • IR Toolshed MCP Server: A Model Context Protocol server that provides network analysis tools for security professionals, ena... (Added May 30, 2025)
  • GitHub MCP Server: Connects Claude Desktop to GitHub repositories, enabling users to perform git operations and GitHub ... (Added May 30, 2025)
  • S3 MCP Server: An Amazon S3 Model Context Protocol server that allows Large Language Models like Claude to interact... (3 tools, Added May 30, 2025)
  • Phrases MCP Server: An elegant MCP server that lets users manage inspirational phrases directly through Claude for Deskt... (6 tools, Added May 30, 2025)
  • Gmail Plugin MCP Server: An MCP server that enables Gmail integration, allowing users to manage emails (send, receive, read, ... (Added May 30, 2025)