A Model Context Protocol (MCP) server enabling Retrieval-Augmented Generation with document ingestion, semantic search, local LLM integration via Ollama, and compatibility with RISC Zero's Bonsai documentation for advanced query processing.
A general-purpose Retrieval-Augmented Generation (RAG) server using the Model Context Protocol (MCP), designed to be tested with RISC Zero's Bonsai documentation.
This project implements a RAG server that:
- ingests documents placed in a local directory
- performs semantic search over the ingested documents
- generates answers with a local LLM served by Ollama

Install the dependencies:
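As an illustration of the retrieval step, here is a minimal Python sketch. A toy bag-of-words similarity stands in for real vector embeddings, and all helper names are hypothetical, not this project's actual API:

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; the real server would use a vector model.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str]) -> str:
    # Return the ingested document most similar to the query.
    return max(docs, key=lambda d: cosine(embed(query), embed(d)))

docs = [
    "Bonsai is RISC Zero's proving service.",
    "Ollama runs large language models locally.",
]
context = retrieve("What is Bonsai?", docs)  # picks the Bonsai document
```

The retrieved `context` would then be prepended to the user's question before it is sent to the LLM.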
poetry install
# Install Ollama
brew install ollama                              # for macOS
# or
curl -fsSL https://ollama.com/install.sh | sh    # for Linux

# Start Ollama service
ollama serve
ollama pull llama2
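Once a model is pulled and `ollama serve` is running, Ollama is reachable over its local HTTP API (by default `POST http://localhost:11434/api/generate`). A hedged sketch of building such a request, not this project's actual code:

```python
import json
import urllib.request

def build_request(prompt: str, model: str = "llama2") -> urllib.request.Request:
    # Ollama's generate endpoint takes a JSON body with model, prompt,
    # and a stream flag; stream=False returns one complete response.
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_request("Summarize the Bonsai docs.")
# urllib.request.urlopen(req) would return JSON whose "response" field
# holds the generated text once the Ollama service is running; the call
# is not made here.
```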
poetry run python mcp_server.py
The server will ingest the documents in the `data/` directory, build the vector store, and begin serving MCP requests.
Test with RISC Zero Bonsai docs by placing the documentation in the `data/` directory.

Project layout:
- `mcp_server.py`: Main server implementation
- `rag.py`: RAG workflow implementation
- `data/`: Directory for document ingestion
- `storage/`: Vector store and document storage
- `start_ollama.sh`: Script to start Ollama service

The server is configured to work with RISC Zero's Bonsai documentation. You can also add your own documents to the `data/` directory.
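To query the server from an MCP client, you would typically register it in the client's configuration. The snippet below follows the common `mcpServers` convention used by clients such as Claude Desktop; the server name and command paths are assumptions, not part of this project:

```json
{
  "mcpServers": {
    "rag": {
      "command": "poetry",
      "args": ["run", "python", "mcp_server.py"]
    }
  }
}
```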