A server that connects local Ollama LLM instances to MCP-compatible applications, providing task decomposition, evaluation, and workflow management capabilities.
The server is configured through environment variables:

| Description | Default |
| --- | --- |
| The URL of the Ollama server | `http://localhost:11434` |
| The logging level (e.g., `debug`, `info`, `warning`, `error`, `critical`) | `info` |
| The default Ollama model to use | `llama3` |
| Alternative way to specify the default Ollama model to use | (none) |
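As a minimal sketch of how these settings might be consumed, the snippet below reads hypothetical environment variables (`OLLAMA_URL`, `LOG_LEVEL`, `OLLAMA_MODEL`, `OLLAMA_DEFAULT_MODEL`; the listing above does not give the actual names, so these are assumptions), falls back to the documented defaults, and verifies connectivity by listing the models the Ollama instance exposes through its standard `GET /api/tags` endpoint.

```python
import json
import logging
import os
import urllib.request

# Hypothetical variable names; the names the server actually reads may differ.
OLLAMA_URL = os.environ.get("OLLAMA_URL", "http://localhost:11434")
LOG_LEVEL = os.environ.get("LOG_LEVEL", "info")
# Two variables for the same setting, mirroring the "alternative way" row above.
DEFAULT_MODEL = (
    os.environ.get("OLLAMA_MODEL")
    or os.environ.get("OLLAMA_DEFAULT_MODEL", "llama3")
)

logging.basicConfig(level=getattr(logging, LOG_LEVEL.upper(), logging.INFO))
log = logging.getLogger("ollama-mcp")


def list_models(base_url: str) -> list[str]:
    """Return the names of models available on the Ollama server.

    Uses Ollama's GET /api/tags endpoint, which lists locally pulled models.
    """
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]


if __name__ == "__main__":
    models = list_models(OLLAMA_URL)
    log.info("Connected to %s; %d models available", OLLAMA_URL, len(models))
    # Ollama reports names with tags (e.g. "llama3:latest"), so check both forms.
    base_names = [m.split(":")[0] for m in models]
    if DEFAULT_MODEL not in models and DEFAULT_MODEL not in base_names:
        log.warning("Default model %r not found on the server", DEFAULT_MODEL)
```

Run against a local Ollama install, this should log the model count; a connection error or a warning about the default model points to a misconfigured URL or a model that has not been pulled yet.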