Implements a RAG workflow that integrates with any custom knowledge base and can be triggered directly from the Cursor IDE.
A general-purpose Retrieval-Augmented Generation (RAG) server using the Model Context Protocol (MCP), designed to be tested with RISC Zero's Bonsai documentation.
This project implements a RAG server that ingests documents from the `data/` directory, indexes them in a local vector store, and answers retrieval queries over MCP directly from Cursor.
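The retrieval half of the workflow can be sketched as follows. This is a minimal, dependency-free illustration that scores chunks by term overlap with the query; the server itself uses embeddings and a persistent vector store, and the function name `retrieve` is illustrative, not part of the project's API.

```python
import re
from collections import Counter

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    """Return the k chunks sharing the most terms with the query.

    Toy term-overlap scoring; a real RAG server would compare
    embedding vectors stored in a vector store instead.
    """
    def terms(text: str) -> Counter:
        return Counter(re.findall(r"[a-z0-9]+", text.lower()))

    q = terms(query)
    # Rank chunks by how many query terms they contain (with multiplicity).
    scored = sorted(chunks, key=lambda c: -sum((terms(c) & q).values()))
    return scored[:k]

docs = [
    "Bonsai is RISC Zero's proving service.",
    "Ollama runs large language models locally.",
    "The vector store lives in the storage/ directory.",
]
print(retrieve("Where does the vector store live?", docs, k=1))
```

The retrieved chunks would then be pasted into the prompt sent to the Ollama-hosted model, which is the "generation" half of RAG.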
```bash
poetry install
```
```bash
# Install Ollama
brew install ollama                             # macOS
# or
curl -fsSL https://ollama.com/install.sh | sh   # Linux

# Start Ollama service
ollama serve
```
```bash
ollama pull llama2
```
```bash
poetry run python mcp_server.py
```
The server will ingest documents from the `data/` directory, build or load its vector index in `storage/`, and serve retrieval queries over MCP.
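The ingestion step could look roughly like this. It is a sketch under the assumption that the documentation lives in Markdown files and is split on blank lines; the helper name `load_chunks` and the chunking rule are illustrative, not the server's actual code.

```python
from pathlib import Path

def load_chunks(data_dir: str, max_len: int = 500) -> list[str]:
    """Split every Markdown file under data_dir into paragraph-sized chunks."""
    chunks = []
    for path in sorted(Path(data_dir).glob("**/*.md")):
        text = path.read_text(encoding="utf-8")
        for para in text.split("\n\n"):
            para = para.strip()
            if para:
                # Truncate overly long paragraphs so each chunk stays small
                # enough to embed and to fit into the model's prompt.
                chunks.append(para[:max_len])
    return chunks
```

Each chunk would then be embedded and written to the vector store under `storage/` so queries can be answered without re-reading the source files.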
Test with RISC Zero Bonsai docs: place the Bonsai documentation files in the `data/` directory.

Project structure:

- `mcp_server.py`: Main server implementation
- `rag.py`: RAG workflow implementation
- `data/`: Directory for document ingestion
- `storage/`: Vector store and document storage
- `start_ollama.sh`: Script to start Ollama service

The server is configured to work with RISC Zero's Bonsai documentation. You can:
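Under the hood, an MCP client such as Cursor talks to the server over JSON-RPC 2.0. A request that invokes a server tool has roughly the shape below; the tool name `query` and its `question` argument are hypothetical, and the actual names are whatever `mcp_server.py` registers.

```python
import json

# Shape of a JSON-RPC 2.0 "tools/call" request as used by MCP clients.
# Tool name and arguments below are illustrative placeholders.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query",
        "arguments": {"question": "What is Bonsai?"},
    },
}
print(json.dumps(request))
```

Cursor constructs these requests for you; the sketch only shows what crosses the wire when the IDE triggers the RAG workflow.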
- Add your own documents to the `data/` directory