A specialized Model Context Protocol (MCP) server dedicated to creating Product Requirements Documents (PRDs). It enables AI systems connected to MCP clients to generate detailed, well-structured PRDs through a standardized protocol interface.
Via NPX (recommended):

```bash
npx -y prd-creator-mcp
```
Via Docker:

```bash
docker pull saml1211/prd-creator-mcp
docker run -i --rm saml1211/prd-creator-mcp
```
Configure providers:

- Copy `.env.example` to `.env` and set your API keys and preferred models.
- Alternatively, update keys and models at runtime with the `update_provider_config` MCP tool.

Get help:

```bash
npx prd-creator-mcp --help
```
To build and run from source:

```bash
git clone https://github.com/Saml1211/prd-mcp-server.git
cd prd-mcp-server
npm install
npm run build
npm start
```

For development:

```bash
npm run dev
```
The PRD Creator MCP Server provides the following tools:
### generate_prd
Generate a complete PRD document using AI or template-based generation.
Parameters:

- `productName`: The name of the product
- `productDescription`: Description of the product
- `targetAudience`: Description of the target audience
- `coreFeatures`: Array of core feature descriptions
- `constraints` (optional): Array of constraints or limitations
- `templateName` (optional): Template name to use (defaults to "standard")
- `providerId` (optional): Specific AI provider to use (openai, anthropic, gemini, local, template)
- `additionalContext` (optional): Additional context or instructions for the AI provider
- `providerOptions` (optional): Provider-specific options such as temperature and maxTokens

Example:
{ "productName": "TaskMaster Pro", "productDescription": "A task management application that helps users organize and prioritize their work efficiently.", "targetAudience": "Busy professionals and teams who need to manage multiple projects and deadlines.", "coreFeatures": [ "Task creation and management", "Priority setting", "Due date tracking", "Team collaboration" ], "constraints": [ "Must work offline", "Must support mobile and desktop platforms" ], "templateName": "comprehensive", "providerId": "openai", "additionalContext": "Focus on enterprise features and security", "providerOptions": { "temperature": 0.5, "maxTokens": 4000 } }
### validate_prd
Validate a PRD document against best practices.
Parameters:

- `prdContent`: The PRD content to validate
- `validationRules` (optional): Array of validation rule IDs to check

Example:
{ "prdContent": "# My Product ## Introduction ...", "validationRules": ["has-introduction", "minimum-length"] }
### list_validation_rules
List all available validation rules.
### list_ai_providers
List all available AI providers and their availability status.
Example response:
[ { "id": "openai", "name": "OpenAI", "available": true }, { "id": "anthropic", "name": "Anthropic Claude", "available": false }, { "id": "gemini", "name": "Google Gemini", "available": false }, { "id": "local", "name": "Local Model", "available": false }, { "id": "template", "name": "Template-based (No AI)", "available": true } ]
The server also provides tools for template management and server administration:

- `create_template`: Create a new PRD template
- `list_templates`: List all available templates
- `get_template`: Get a specific template
- `update_template`: Update an existing template
- `delete_template`: Delete a template
- `export_templates`: Export templates to JSON
- `import_templates`: Import templates from JSON
- `render_template`: Render a template with placeholders
- `get_provider_config`: Get current provider configuration
- `update_provider_config`: Update provider configuration
- `health_check`: Check system health and provider availability
- `get_logs`: Get recent system logs
- `stats`: Get usage statistics

You can configure provider credentials and models in two ways:
- Environment variables: Create a `.env` file in your project or working directory, using `.env.example` as a template. All standard AI provider variables (e.g., `OPENAI_API_KEY`, `OPENAI_MODEL`) are supported.
- At runtime: Use the `update_provider_config` tool via your MCP client. These changes are persisted and take effect immediately; no server restart is required.

The server always merges the persistent configuration (from protocol tools) with environment variables, giving precedence to protocol/tool updates.
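As an illustration, a minimal `.env` might look like the following; the key is a placeholder and the model name is only an example, so check `.env.example` for the full list of supported variables and defaults:

```bash
# Placeholder key; the model name is only an example.
# See .env.example for the full list of supported variables.
OPENAI_API_KEY=your_openai_key_here
OPENAI_MODEL=gpt-4o
```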
When you update provider settings using either method, the changes take effect immediately for all new requests.
For Claude Desktop, add to `claude_desktop_config.json`:
{ "mcpServers": { "prd-creator": { "command": "npx", "args": ["-y", "prd-creator-mcp"] } } }
The server is also available via Glama.ai: https://glama.ai/mcp/servers/@Saml1211/PRD-MCP-Server
Add to your Cursor MCP client configuration:
{ "mcpServers": { "prd-creator": { "command": "npx", "args": ["-y", "prd-creator-mcp"] } } }
For Roo Code, add to `.roo/mcp.json`:

```json
{
  "mcpServers": {
    "prd-creator-mcp": {
      "command": "npx",
      "args": ["-y", "prd-creator-mcp"]
    }
  }
}
```
Reference `prd-creator-mcp` in your MCP workflow definitions.
You may also install the MCP server globally to expose the CLI:

```bash
npm install -g prd-creator-mcp
```

Then run:

```bash
prd-creator-mcp
```

The `prd-creator-mcp` command runs the MCP server over the STDIO transport. Use it directly via npx or as a globally installed CLI for integration with MCP clients and tools. To remove the global CLI:

```bash
npm uninstall -g prd-creator-mcp
```
View available command line options:
```bash
npx prd-creator-mcp --help
```
To build and run the image locally:

```bash
docker build -t prd-creator-mcp .
docker run -i --rm prd-creator-mcp
```

To provide provider API keys, pass them as environment variables:

```bash
docker run -i --rm -e OPENAI_API_KEY=your_key_here prd-creator-mcp
```
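If your keys already live in a `.env` file, Docker's standard `--env-file` flag is a convenient alternative to listing each variable with `-e` (this assumes the file uses the same variable names as `.env.example`):

```bash
docker run -i --rm --env-file .env prd-creator-mcp
```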
Please read CONTRIBUTING.md and CODE_OF_CONDUCT.md before submitting issues or pull requests.
All notable changes to this project are documented in CHANGELOG.md.