A conversational application server that integrates LLM capabilities via Ollama with vector memory context, supporting multiple users, sessions, automatic history summarization, and a plugin system for executing real actions.
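As a rough illustration of how one chat turn in such a server might be wired together, here is a minimal Python sketch that pulls vector-memory context from Chroma and passes it to an Ollama model. It is only a sketch under stated assumptions: the `ollama` and `chromadb` Python packages are used, and the collection name, function name, and prompt wording are hypothetical, not taken from this project.

```python
# Minimal sketch of one chat turn: retrieve vector-memory context from Chroma,
# then ask the Ollama model to answer with that context.
# All names here are illustrative; only the ./chroma path and the "mistral"
# default come from this document's configuration.
import ollama
import chromadb

client = chromadb.PersistentClient(path="./chroma")                 # vector database path
memory = client.get_or_create_collection("conversation_memory")     # hypothetical collection name

def answer(user_message: str, model: str = "mistral", context_limit: int = 5) -> str:
    # Pull the most relevant stored snippets as context for this turn.
    results = memory.query(query_texts=[user_message], n_results=context_limit)
    docs = results["documents"][0] if results.get("documents") else []
    context = "\n".join(docs)

    response = ollama.chat(
        model=model,
        messages=[
            {"role": "system", "content": f"Relevant memory:\n{context}"},
            {"role": "user", "content": user_message},
        ],
    )
    return response["message"]["content"]
```

A real server would additionally persist each turn to the message history database (`./mcp.db`) and write new messages back into the vector store so later sessions can retrieve them.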
Configuration options and their defaults:

| Description | Default |
| --- | --- |
| Path to the database file | `./mcp.db` |
| Path to the vector database | `./chroma` |
| Number of messages that triggers a summary | `20` |
| The name of the model to use | `mistral` |
| Context limit for the conversation | `5` |
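A hedged sketch of how these defaults might be loaded and how the summary threshold could be applied follows. The environment-variable names, function name, and summarization prompt are hypothetical, since the actual setting names are not shown here; only the default values come from the table above.

```python
# Hypothetical configuration loading and summarization trigger.
# Only the default values are taken from the table above.
import os
import ollama

DB_PATH = os.getenv("DB_PATH", "./mcp.db")                     # path to the database file
VECTOR_DB_PATH = os.getenv("VECTOR_DB_PATH", "./chroma")       # path to the vector database
SUMMARY_THRESHOLD = int(os.getenv("SUMMARY_THRESHOLD", "20"))  # messages that trigger a summary
MODEL_NAME = os.getenv("MODEL_NAME", "mistral")                # Ollama model to use
CONTEXT_LIMIT = int(os.getenv("CONTEXT_LIMIT", "5"))           # context limit for the conversation

def maybe_summarize(history: list[dict]) -> list[dict]:
    """Collapse the conversation into a single summary message once it exceeds the threshold."""
    if len(history) < SUMMARY_THRESHOLD:
        return history
    transcript = "\n".join(f"{m['role']}: {m['content']}" for m in history)
    response = ollama.chat(
        model=MODEL_NAME,
        messages=[{"role": "user", "content": f"Summarize this conversation briefly:\n{transcript}"}],
    )
    summary = response["message"]["content"]
    return [{"role": "system", "content": f"Summary of earlier conversation: {summary}"}]
```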