Lightweight Model Context Protocol server enabling vulnerability scanning of various AI models through customizable attacks, probes, and detailed reporting across multiple model types including Ollama, OpenAI, HuggingFace, and GGML.
A lightweight MCP (Model Context Protocol) server for Garak.
Example:
https://github.com/user-attachments/assets/f6095d26-2b79-4ef7-a889-fd6be27bbbda
| Name | Description |
| --- | --- |
| `list_model_types` | List all available model types (ollama, openai, huggingface, ggml) |
| `list_models` | List all available models for a given model type |
| `list_garak_probes` | List all available Garak attacks/probes |
| `get_report` | Get the report of the last run |
| `run_attack` | Run an attack with a given model and probe |
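An MCP client invokes these tools as standard `tools/call` requests over JSON-RPC. A minimal sketch of a `run_attack` call is shown below; the model and probe names are illustrative placeholders, not values taken from this project:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_attack",
    "arguments": {
      "model_type": "ollama",
      "model_name": "llama3",
      "probe_name": "dan.Dan_11_0"
    }
  }
}
```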
### list_model_types

No parameters.

### list_models

- `model_type` (string, required): The type of model to list (ollama, openai, huggingface, ggml)

### list_garak_probes

No parameters.

### get_report

No parameters.

### run_attack

- `model_type` (string, required): The type of model to use
- `model_name` (string, required): The name of the model to use
- `probe_name` (string, required): The name of the attack/probe to use

### Requirements

Python 3.11 or higher: this project requires Python 3.11 or newer.

```shell
# Check your Python version
python --version
```
Install uv, a fast Python package installer and resolver:

```shell
pip install uv
```

Or use Homebrew:

```shell
brew install uv
```
Optional: Ollama. If you want to run attacks on Ollama models, make sure the Ollama server is running:

```shell
ollama serve
```
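Before targeting ollama-type models, it can help to verify the local server is actually reachable. A small sketch using only the standard library, assuming Ollama's default address of `http://localhost:11434` (the endpoint and port are Ollama defaults, not defined by this project):

```python
import urllib.request
import urllib.error


def ollama_running(url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at `url` (Ollama's root returns 200)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: treat as not running.
        return False


if __name__ == "__main__":
    print("Ollama reachable:", ollama_running())
```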
Clone the repository:

```shell
git clone https://github.com/BIGdeadLock/Garak-MCP.git
```
```json
{
  "mcpServers": {
    "garak-mcp": {
      "command": "uv",
      "args": ["--directory", "path-to/Garak-MCP", "run", "garak-server"],
      "env": {}
    }
  }
}
```
Tested on: