A Model Context Protocol (MCP) server that enables semantic search and retrieval of documentation using a vector database (Qdrant). This server allows you to add documentation from URLs or local files and then search through them using natural language queries.
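The underlying flow is conventional retrieval-augmented search: fetch a documentation page, chunk it, embed each chunk, store the vectors in Qdrant, then embed the natural-language query and run a nearest-neighbour search. The sketch below is a minimal illustration of that flow, assuming a local Qdrant instance and Ollama with the nomic-embed-text model; the collection name, chunk size, and helper functions are illustrative and not taken from this server's code.

```python
"""Minimal sketch of the add-then-search flow (illustrative, not the server's code)."""
import uuid

import requests
from bs4 import BeautifulSoup
from qdrant_client import QdrantClient
from qdrant_client.models import Distance, PointStruct, VectorParams

OLLAMA_URL = "http://localhost:11434"   # default Ollama endpoint
QDRANT_URL = "http://localhost:6333"    # local Qdrant instance
COLLECTION = "documentation"            # hypothetical collection name
EMBED_DIM = 768                         # nomic-embed-text output size


def embed(text: str) -> list[float]:
    """Embed text with Ollama's nomic-embed-text model."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/embeddings",
        json={"model": "nomic-embed-text", "prompt": text},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["embedding"]


def add_documentation(client: QdrantClient, url: str) -> None:
    """Fetch a documentation page, chunk it, and store embeddings in Qdrant."""
    html = requests.get(url, timeout=60).text
    text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
    chunks = [text[i : i + 1000] for i in range(0, len(text), 1000)]
    points = [
        PointStruct(
            id=str(uuid.uuid4()),
            vector=embed(chunk),
            payload={"url": url, "text": chunk},
        )
        for chunk in chunks
    ]
    client.upsert(collection_name=COLLECTION, points=points)


def search(client: QdrantClient, query: str, limit: int = 5):
    """Run a natural-language query against the stored documentation."""
    hits = client.search(
        collection_name=COLLECTION,
        query_vector=embed(query),
        limit=limit,
    )
    return [(h.score, h.payload["url"], h.payload["text"][:200]) for h in hits]


if __name__ == "__main__":
    client = QdrantClient(url=QDRANT_URL)
    if not client.collection_exists(COLLECTION):
        client.create_collection(
            collection_name=COLLECTION,
            vectors_config=VectorParams(size=EMBED_DIM, distance=Distance.COSINE),
        )
    add_documentation(client, "https://qdrant.tech/documentation/quick-start/")
    for score, url, snippet in search(client, "How do I create a collection?"):
        print(f"{score:.3f}  {url}\n    {snippet}\n")
```

The actual server exposes this behaviour as MCP tools rather than a script, but the storage and retrieval steps follow the same pattern.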
Configuration

- Embedding model (optional): defaults to 'nomic-embed-text' for Ollama and 'text-embedding-3-small' for OpenAI.
- Ollama URL: URL of your Ollama instance. Default: http://localhost:11434.
- OpenAI API key: required if using OpenAI as the embedding provider.
- Qdrant API key: required if using Qdrant Cloud.
- Qdrant URL: URL of your Qdrant instance. For local use: http://localhost:6333. For Qdrant Cloud: https://your-cluster-url.qdrant.tech.
- Embedding provider: choose between 'ollama' (default) or 'openai'.
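A rough sketch of how these settings might fit together: the provider switch decides whether embeddings come from Ollama or OpenAI, and the Qdrant API key is only needed when connecting to Qdrant Cloud. The environment-variable names used here (EMBEDDING_PROVIDER, EMBEDDING_MODEL, OLLAMA_URL, OPENAI_API_KEY, QDRANT_URL, QDRANT_API_KEY) are assumptions for illustration; check the server's documentation for the exact names.

```python
"""Sketch of provider and Qdrant selection from environment variables (assumed names)."""
import os

import requests
from openai import OpenAI
from qdrant_client import QdrantClient

provider = os.environ.get("EMBEDDING_PROVIDER", "ollama")
model = os.environ.get(
    "EMBEDDING_MODEL",
    "nomic-embed-text" if provider == "ollama" else "text-embedding-3-small",
)


def embed(text: str) -> list[float]:
    """Return an embedding from the configured provider."""
    if provider == "openai":
        client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
        return client.embeddings.create(model=model, input=text).data[0].embedding
    ollama_url = os.environ.get("OLLAMA_URL", "http://localhost:11434")
    resp = requests.post(
        f"{ollama_url}/api/embeddings",
        json={"model": model, "prompt": text},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["embedding"]


# api_key is only required for Qdrant Cloud; leave it unset for a local instance.
qdrant = QdrantClient(
    url=os.environ.get("QDRANT_URL", "http://localhost:6333"),
    api_key=os.environ.get("QDRANT_API_KEY"),
)
```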
Security Notice
Your environment variables and credentials are securely stored and encrypted. LangDB never shares these configuration values with third parties.