Sakura Cloud MCP Server

hidenorigoto/sacloud-mcp

Model Context Protocol (MCP) server enabling AI assistants to securely access and manage Sakura Cloud resources, including servers, networks, billing, and containerized applications via standardized API interactions.
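An MCP server like this one is typically wired into an MCP client through a small configuration entry that tells the client how to launch the server and which credentials to pass. The sketch below follows the `mcpServers` convention used by clients such as Claude Desktop; the launch command, file path, and environment variable names are illustrative assumptions, not taken from this listing:

```json
{
  "mcpServers": {
    "sacloud": {
      "command": "node",
      "args": ["/path/to/sacloud-mcp/build/index.js"],
      "env": {
        "SAKURACLOUD_ACCESS_TOKEN": "<api-token>",
        "SAKURACLOUD_ACCESS_TOKEN_SECRET": "<api-secret>"
      }
    }
  }
}
```

With an entry like this, the client spawns the server as a subprocess and communicates with it over stdio, so the AI assistant can call the server's Sakura Cloud tools without the credentials ever appearing in the conversation.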

JavaScript · 46 tools · Added May 30, 2025 · Updated Jun 4, 2025

Supercharge Your AI with Sakura Cloud MCP Server

Unlock the full potential of Sakura Cloud MCP Server through LangDB's AI Gateway. Get enterprise-grade security, analytics, and seamless integration with zero configuration.

Available Tools
This server provides 46 tools that can be used in your MCP Gateway.
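Under the Model Context Protocol, a client discovers those tools by sending a JSON-RPC 2.0 `tools/list` request to the server; this is part of the MCP specification, not specific to Sakura Cloud. A minimal sketch of the message a client would send:

```javascript
// The JSON-RPC 2.0 message an MCP client sends to enumerate a
// server's tools (the MCP spec's "tools/list" method). For this
// server, the response's result.tools array would hold 46 entries,
// each with a name, description, and JSON Schema for its inputs.
const listToolsRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list",
  params: {}
};

// Serialized onto the transport (e.g. one line of stdio):
console.log(JSON.stringify(listToolsRequest));
// → {"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}
```

The assistant then invokes an individual tool with a follow-up `tools/call` request naming the tool and supplying arguments that match its schema.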
Related MCPs (5)
  • Hetzner Cloud MCP Server

    Model Context Protocol server enabling language models to manage cloud infrastructure by listing, creating, and controlling servers, volumes, firewalls, and SSH keys via a structured API with support for multiple transport modes and integration with Claude Code.

    Added May 30, 2025
  • GCP MCP Server

    Comprehensive Model Context Protocol server enabling AI assistants to securely query, manage, and receive guidance on Google Cloud Platform resources across key services like Compute Engine, BigQuery, and Cloud Storage.

    Added May 30, 2025
  • Linode MCP Server

    Model Context Protocol server enabling Large Language Models to securely manage cloud resources by listing, creating, rebooting, and deleting Linode instances through a standardized API interface.

    Added May 30, 2025
  • Redis Cloud API MCP Server

    Manage Redis Cloud resources using natural language with a Model Context Protocol server offering account, subscription, database, cloud provider, and task management capabilities for seamless integration with LLM clients.

    16 tools
    Added May 30, 2025
  • mcp-server-zep-cloud

    Provides a Model Context Protocol server bridging LLM clients with cloud APIs to manage semantic memory, user data, and contextual information for AI assistants across conversations.

    Added May 30, 2025