Bridge AI models with Steampipe using a Model Context Protocol server that enables executing Steampipe SQL queries via LLM tools for seamless data retrieval and integration.
This is a simple Steampipe MCP server. It acts as a bridge between your AI model and the Steampipe tool.
The MCP Inspector is a handy tool for checking that your MCP server is working as expected:
npx -y @modelcontextprotocol/inspector uv --directory . run steampipe_mcp_server.py
{
"query": "select name, fork_count from github_my_repository "
}
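Under the hood, a tool like this can hand the query string to the Steampipe CLI and return its JSON output. A minimal sketch, assuming the `steampipe` binary is on your PATH (the helper names here are illustrative, not code from this repo, and the JSON shape varies across Steampipe versions):

```python
import json
import subprocess

def build_steampipe_command(sql: str) -> list[str]:
    # Steampipe can emit query results as JSON via `--output json`.
    return ["steampipe", "query", sql, "--output", "json"]

def run_steampipe_query(sql: str):
    # Execute the query through the Steampipe CLI and parse the JSON output.
    # Raises CalledProcessError if the query fails.
    result = subprocess.run(
        build_steampipe_command(sql),
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)
```

The tool handler in the server would then just pass the `query` argument from the MCP request into a function like this and return the parsed rows.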
Pretty straightforward. Run the inspector and make sure the tool works from the project directory, then add the server configuration to the respective LLM client and select the tool from there.
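For Claude Desktop, the server entry goes in `claude_desktop_config.json` under `mcpServers`. A sketch, assuming the repo was cloned to `/path/to/mcp-steampipe` (adjust the path and server name to your setup):

```json
{
  "mcpServers": {
    "steampipe": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-steampipe",
        "run",
        "steampipe_mcp_server.py"
      ]
    }
  }
}
```

After editing the file, restart Claude Desktop so the server is picked up; the log files below are useful if the tool does not appear.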
tail -f ~/Library/Logs/Claude/mcp.log
tail -f ~/Library/Logs/Claude/mcp-server-steampipe.log
Security risk: Claude blindly executes your SQL query in this POC, so it is possible to generate and execute arbitrary SQL queries via Steampipe using your configured credentials.
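One way to narrow that risk is to reject anything that is not a single read-only statement before it reaches Steampipe. A naive sketch (`is_read_only` is a hypothetical helper, not part of this server, and a prefix check like this is not a complete defense):

```python
def is_read_only(sql: str) -> bool:
    # Naive guard: allow only a single statement starting with SELECT or WITH.
    # Not a complete defense -- a robust guard would parse the SQL properly.
    stripped = sql.strip().rstrip(";").strip()
    if ";" in stripped:
        return False  # reject multi-statement input
    return stripped.lower().startswith(("select", "with"))
```

The query tool would call this before handing the SQL to Steampipe and return an error to the model otherwise.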