MCP-Allure is an MCP (Model Context Protocol) server that reads Allure test reports and returns them in LLM-friendly formats, enabling AI models to better analyze test results and provide insights about test failures and potential fixes.
As AI and Large Language Models (LLMs) become increasingly integral to software development, there is a growing need to bridge the gap between traditional test reporting and AI-assisted analysis. Traditional Allure test report formats, while human-readable, aren't optimized for LLM consumption and processing.
MCP-Allure addresses this challenge by transforming Allure test reports into LLM-friendly formats, enabling AI models to better understand, analyze, and provide insights about test results.
By optimizing test reports for LLM consumption, MCP-Allure helps development teams leverage the full potential of AI tools in their testing workflow, leading to more efficient and intelligent test analysis and maintenance.
To install MCP-Allure for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @crisschan/mcp-allure --client claude
To install MCP-Allure using uv, add the following to your MCP client configuration:
{
  "mcpServers": {
    "mcp-allure-server": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp[cli]",
        "mcp",
        "run",
        "/Users/crisschan/workspace/pyspace/mcp-allure/mcp-allure-server.py"
      ]
    }
  }
}
MCP-Allure returns reports in the following JSON structure:

{
  "test-suites": [
    {
      "name": "test suite name",
      "title": "suite title",
      "description": "suite description",
      "status": "passed",
      "start": "timestamp",
      "stop": "timestamp",
      "test-cases": [
        {
          "name": "test case name",
          "title": "case title",
          "description": "case description",
          "severity": "normal",
          "status": "passed",
          "start": "timestamp",
          "stop": "timestamp",
          "labels": [],
          "parameters": [],
          "steps": [
            {
              "name": "step name",
              "title": "step title",
              "status": "passed",
              "start": "timestamp",
              "stop": "timestamp",
              "attachments": [],
              "steps": []
            }
          ]
        }
      ]
    }
  ]
}
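To illustrate the kind of transformation involved, here is a minimal Python sketch that maps a raw Allure `*-result.json` dictionary onto the test-case shape shown above. This is not MCP-Allure's actual implementation; the helper names are hypothetical, and the input field names assume the standard allure-results schema (flat `labels` as name/value pairs, nested `steps`).

```python
import json


def convert_step(step):
    """Map a raw Allure step dict onto the LLM-friendly step shape."""
    return {
        "name": step.get("name", ""),
        "status": step.get("status", "unknown"),
        "start": step.get("start"),
        "stop": step.get("stop"),
        "attachments": step.get("attachments", []),
        # Allure steps nest recursively, so the converter recurses too.
        "steps": [convert_step(s) for s in step.get("steps", [])],
    }


def convert_result(result):
    """Map one Allure *-result.json dict onto an LLM-friendly test case."""
    # Severity lives in the labels list as {"name": "severity", "value": ...}.
    labels = {l["name"]: l["value"] for l in result.get("labels", [])}
    return {
        "name": result.get("name", ""),
        "description": result.get("description", ""),
        "severity": labels.get("severity", "normal"),
        "status": result.get("status", "unknown"),
        "start": result.get("start"),
        "stop": result.get("stop"),
        "labels": result.get("labels", []),
        "parameters": result.get("parameters", []),
        "steps": [convert_step(s) for s in result.get("steps", [])],
    }


if __name__ == "__main__":
    raw = {
        "name": "test_login",
        "status": "passed",
        "labels": [{"name": "severity", "value": "critical"}],
        "steps": [{"name": "open page", "status": "passed"}],
    }
    print(json.dumps(convert_result(raw), indent=2))
```

The key idea is that nested, label-encoded metadata is flattened into predictable top-level keys, so an LLM can consume the report without knowing Allure's internal conventions.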