Binary Ninja MCP Server

Public · rsprudencio/binja_mcp · Python

Enables Large Language Models to interact with Binary Ninja via the Model Context Protocol, providing functionality such as retrieving assembly and decompiled code, renaming functions and variables, and adding comments for enhanced binary analysis and automation.

Added May 30, 2025 · Updated Jun 4, 2025

Binary Ninja MCP Server

A Model Context Protocol server for Binary Ninja interaction and automation. This server provides tools to interact with Binary Ninja via Large Language Models.

Overview

The Binary Ninja MCP Server is a plugin and server implementation that allows Large Language Models to interact with Binary Ninja through the Model Context Protocol (MCP). It provides functionality such as:

  • Get assembly code for functions
  • Get decompiled code (HLIL) for functions
  • Rename functions and variables
  • Add comments

Installation

Using uv (recommended)

When using uv, no specific installation is needed; we will use uvx to run binja_mcp directly.

Using PIP

Alternatively, you can install binja-mcp via pip:

pip install binja-mcp

After installation, you can run it as a script using:

python -m binja_mcp

Binary Ninja Plugin Installation

Clone this repository OR link the cloned repository into Binary Ninja's plugin directory:

  • Linux: ~/.binaryninja/plugins/
  • macOS: ~/Library/Application Support/Binary Ninja/plugins/
  • Windows: %APPDATA%\Binary Ninja\plugins\

Configuration

Usage with Claude Desktop/Cursor

Add this to your claude_desktop_config.json or Cursor MCP servers:

Using uvx

"mcpServers": {
  "binja": {
    "command": "uvx",
    "args": ["-n", "mcp-server-binja"]
  }
}

Using pip installation

"mcpServers": {
  "binja": {
    "command": "python",
    "args": ["-m", "mcp_server_binja"]
  }
}
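For reference, a complete claude_desktop_config.json using the uvx variant might look like the following sketch; the command and server names are taken from the snippets above, and any other top-level keys in your existing config file should be left in place:

```json
{
  "mcpServers": {
    "binja": {
      "command": "uvx",
      "args": ["-n", "mcp-server-binja"]
    }
  }
}
```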

Usage

  1. Open Binary Ninja and load a binary
  2. Start the MCP Server from the Tools menu or using the keyboard shortcut
  3. Use Claude Desktop, Cursor, or any MCP client of your preference to interact with the binary

Available Commands

The following commands are available through the MCP interface:

  • binja_get_function_assembly: Get assembly code for a named function
  • binja_get_function_decompiled: Get decompiled code for a named function
  • binja_get_global_variable: Get information about a global variable
  • binja_get_current_function_assembly: Get assembly for the current function
  • binja_get_current_function_decompiled: Get decompiled code for the current function
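Under the hood, MCP clients such as Claude Desktop invoke these commands as tools over JSON-RPC 2.0 using the `tools/call` method. As a sketch of what crosses the wire, the snippet below builds such a request for `binja_get_function_assembly`; the argument name `function_name` is an assumption for illustration, not confirmed by this server's schema:

```python
import json


def build_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request as an MCP client would send it."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })


# Hypothetical invocation; the "function_name" argument is an assumed parameter.
request = build_tool_call(1, "binja_get_function_assembly", {"function_name": "main"})
print(request)
```

The server replies with a JSON-RPC response whose result contains the tool output (here, the function's assembly listing) for the client to pass back to the model.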

Development

If you are doing local development, there are two ways to test your changes:

  1. Run the MCP inspector:
     npx @modelcontextprotocol/inspector uvx binja_mcp
  2. Test using the Claude desktop app by adding the following to your claude_desktop_config.json:
     {
       "mcpServers": {
         "binja": {
           "command": "uv",
           "args": ["--directory", "//src", "run", "mcp-server-binja"]
         }
       }
     }

License

This project is licensed under the MIT License - see the LICENSE file for details.
