RagDocs MCP Server

heltonteixeira/ragdocs

A Model Context Protocol server providing retrieval-augmented generation with semantic document search, document management, and vector similarity search, backed by Qdrant and either Ollama or OpenAI embeddings.

TypeScript · 0 tools · Added May 30, 2025 · Updated Jun 4, 2025

Configuration Requirements

Configure authentication and the environment variables needed to access this MCP server. All variables are optional, though some become required depending on your provider choices.

Environment Variables
  • EMBEDDING_PROVIDER (string, optional): Embedding provider to use ('ollama' or 'openai'). Default: ollama
  • OPENAI_API_KEY (string, optional): OpenAI API key (required when EMBEDDING_PROVIDER is 'openai')
  • EMBEDDING_MODEL (string, optional): Model to use for embeddings (Ollama defaults to 'nomic-embed-text'; OpenAI defaults to 'text-embedding-3-small')
  • QDRANT_API_KEY (string, optional): API key for Qdrant Cloud (required when using a cloud instance)
  • QDRANT_URL (string, optional): URL of your Qdrant instance (local: 'http://127.0.0.1:6333'; cloud: 'https://your-cluster-url.qdrant.tech'). Default: http://127.0.0.1:6333
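As an illustration, a typical MCP client configuration supplying these variables might look like the sketch below. The `command`, `args`, and package name are assumptions for illustration only, not taken from this listing; check the repository's README for the actual launch command.

```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "npx",
      "args": ["-y", "ragdocs"],
      "env": {
        "EMBEDDING_PROVIDER": "ollama",
        "EMBEDDING_MODEL": "nomic-embed-text",
        "QDRANT_URL": "http://127.0.0.1:6333"
      }
    }
  }
}
```

With these defaults the server would embed documents locally via Ollama and store vectors in a local Qdrant instance; switching to OpenAI would additionally require setting OPENAI_API_KEY.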

Security Notice

Your environment variables and credentials are securely stored and encrypted. LangDB never shares these configuration values with third parties.

Related MCPs (5)
  • MCP Memory Server with Qdrant Persistence

    Provides a Model Context Protocol server offering graph-based knowledge representation with semantic search powered by OpenAI embeddings and Qdrant vector database, featuring file persistence, entity management, HTTPS support, and Docker deployment.

    Added May 30, 2025
  • RAG MCP server

    A Model Context Protocol (MCP) server enabling Retrieval-Augmented Generation with document ingestion, semantic search, local LLM integration via Ollama, and compatibility with RISC Zero's Bonsai documentation for advanced query processing.

    Added May 30, 2025
  • MCP Qdrant Server with OpenAI Embeddings

    Provides semantic vector search and collection management using Qdrant database integrated with OpenAI embeddings for enhanced natural language query capabilities within the Model Context Protocol framework.

    Added May 29, 2025
  • MCP-RAG Server

    Advanced Model Context Protocol server enabling efficient retrieval-augmented generation with high-accuracy document ingestion, semantic search, and seamless integration of GroundX and OpenAI for robust context handling and flexible configuration.

    Added May 29, 2025
  • MCP-RAG Server

    Advanced Model Context Protocol server enabling efficient retrieval-augmented generation with semantic search, PDF document ingestion, and seamless integration of GroundX and OpenAI for high-accuracy context processing.

    Added May 29, 2025