An MCP server implementation that provides tools for retrieving and processing documentation through vector search, enabling AI assistants to augment their responses with relevant documentation context. Embeddings are generated with Ollama or OpenAI, and Docker files are included.
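The retrieval step described above can be sketched with a minimal in-memory example: embed the query, score it against stored document embeddings, and return the best matches. This is an illustrative sketch only; the actual server calls Ollama or OpenAI for embeddings and Qdrant for the search, and the documents and vectors below are invented for demonstration.

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def search(query_vec, index, top_k=2):
    # index: list of (doc_text, embedding) pairs; returns the top_k
    # documents ranked by similarity to the query embedding.
    scored = [(cosine_similarity(query_vec, vec), text) for text, vec in index]
    scored.sort(reverse=True)
    return [text for _, text in scored[:top_k]]

# Toy corpus with hand-made 3-dimensional "embeddings" (assumption:
# real embeddings come from an Ollama/OpenAI model and are much larger).
index = [
    ("Install the server with Docker", [0.9, 0.1, 0.0]),
    ("Configure Qdrant credentials",   [0.1, 0.9, 0.2]),
    ("Choose an embeddings provider",  [0.0, 0.2, 0.9]),
]
print(search([0.8, 0.2, 0.1], index, top_k=1))
```

A production deployment would replace the toy index with a Qdrant collection query, but the ranking logic is the same idea.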
Configuration environment variables:

- API key for authenticating with Qdrant (if applicable)
- URL of your Qdrant vector database instance
- API key for OpenAI (required when `EMBEDDINGS_PROVIDER` is `openai`)
- Base URL for Ollama service (used when `EMBEDDINGS_PROVIDER` is `ollama`); default: `http://127.0.0.1:11434`
- Provider to use for generating embeddings; default: `ollama`
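Putting the settings above together, the server's configuration loading might look like the sketch below. Only `EMBEDDINGS_PROVIDER` and the Ollama default URL are stated in the text; the other variable names (`QDRANT_URL`, `QDRANT_API_KEY`, `OPENAI_API_KEY`, `OLLAMA_BASE_URL`) are assumptions inferred from the descriptions, not confirmed project identifiers.

```python
import os

def load_config(env=None):
    # Resolve settings from environment variables; defaults follow the
    # documented values ('ollama' provider, local Ollama base URL).
    env = os.environ if env is None else env
    provider = env.get("EMBEDDINGS_PROVIDER", "ollama")  # 'ollama' or 'openai'
    config = {
        "provider": provider,
        "qdrant_url": env.get("QDRANT_URL"),          # assumed variable name
        "qdrant_api_key": env.get("QDRANT_API_KEY"),  # assumed variable name
    }
    if provider == "openai":
        key = env.get("OPENAI_API_KEY")
        if not key:
            # The OpenAI key is required only for the 'openai' provider.
            raise ValueError(
                "OPENAI_API_KEY is required when EMBEDDINGS_PROVIDER is 'openai'"
            )
        config["openai_api_key"] = key
    else:
        # Falls back to the documented default Ollama endpoint.
        config["ollama_base_url"] = env.get(
            "OLLAMA_BASE_URL", "http://127.0.0.1:11434"
        )
    return config
```

With no variables set, this yields the Ollama provider pointed at `http://127.0.0.1:11434`, matching the defaults listed above.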