A generic Model Context Protocol (MCP) framework for building AI-powered applications. It provides standardized ways to create MCP servers and clients that integrate LLMs, with support for Ollama and Supabase.
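Because the framework's own API is not shown on this page, the sketch below uses the official `@modelcontextprotocol/sdk` for TypeScript to illustrate the general shape of an MCP server running over stdio. The class and module names come from that SDK; assuming this framework exposes something similar is an assumption, not a statement about its actual interface.

```typescript
// Minimal sketch of an MCP server over stdio, using the official
// @modelcontextprotocol/sdk rather than this framework's own API
// (which is not documented here).
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

// Declare the server identity and the capabilities it intends to expose.
const server = new Server(
  { name: "example-server", version: "0.1.0" },
  { capabilities: { tools: {}, resources: {} } }
);

// In stdio mode the server exchanges MCP messages over stdin/stdout,
// which is how local MCP clients typically launch and talk to servers.
const transport = new StdioServerTransport();
await server.connect(transport);
```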
Configuration:

- Ollama model to use for embeddings (default: nomic-embed-text)
- URL of your Supabase project (no default)
- Service key for your Supabase project (no default)
- Ollama model to use for text generation (default: llama3)
- URL of your Ollama instance (default: http://localhost:11434)
- Server transport mode, either 'http' or 'stdio' (default: http)
- Port number for the HTTP server (default: 3000)
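As a rough sketch of how these settings might be consumed, the TypeScript below reads them from environment variables and falls back to the defaults listed above. The variable names (`EMBEDDING_MODEL`, `GENERATION_MODEL`, `OLLAMA_URL`, `SUPABASE_URL`, `SUPABASE_SERVICE_KEY`, `TRANSPORT`, `PORT`) are assumed for illustration and are not confirmed by this project; treating the two Supabase values as required is likewise an assumption based on their having no listed default.

```typescript
// Sketch of loading the configuration from environment variables.
// Variable names are illustrative assumptions; defaults mirror the list above.
interface Config {
  embeddingModel: string;      // Ollama model used for embeddings
  generationModel: string;     // Ollama model used for text generation
  ollamaUrl: string;           // URL of the Ollama instance
  supabaseUrl: string;         // Supabase project URL (no documented default)
  supabaseServiceKey: string;  // Supabase service key (no documented default)
  transport: "http" | "stdio"; // server transport mode
  port: number;                // port for the HTTP server
}

function loadConfig(env: NodeJS.ProcessEnv = process.env): Config {
  const supabaseUrl = env.SUPABASE_URL;
  const supabaseServiceKey = env.SUPABASE_SERVICE_KEY;
  // Treated as required here because no default is documented (assumption).
  if (!supabaseUrl || !supabaseServiceKey) {
    throw new Error("SUPABASE_URL and SUPABASE_SERVICE_KEY must be set");
  }
  return {
    embeddingModel: env.EMBEDDING_MODEL ?? "nomic-embed-text",
    generationModel: env.GENERATION_MODEL ?? "llama3",
    ollamaUrl: env.OLLAMA_URL ?? "http://localhost:11434",
    supabaseUrl,
    supabaseServiceKey,
    transport: env.TRANSPORT === "stdio" ? "stdio" : "http",
    port: Number(env.PORT ?? 3000),
  };
}
```

With the transport set to 'http', the server would listen on the configured port; in 'stdio' mode it would exchange MCP messages over standard input and output instead, and the port setting would be unused.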