An MCP server that implements a conversational AI 'waifu' character, backed by a text generation service with Redis-based request queuing and GPU acceleration.
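The request/worker pattern described above can be sketched as follows. This is a minimal, hypothetical illustration that uses an in-process `queue.Queue` as a stand-in for Redis; the `generate()` placeholder stands in for the actual GPU-backed text generation call, and the function names are not from the project itself.

```python
import queue
import threading

# Job queue: each job is (prompt, reply_box). In the real server this role
# is played by a Redis queue shared between the MCP server and the worker.
jobs: queue.Queue = queue.Queue()

def generate(prompt: str) -> str:
    # Placeholder for the actual text generation service call.
    return f"waifu response to: {prompt}"

def worker() -> None:
    # GPU-backed worker loop: pull a prompt, generate, return the reply.
    while True:
        prompt, reply_box = jobs.get()
        reply_box.put(generate(prompt))
        jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

def chat(prompt: str) -> str:
    # What an MCP tool handler might do: enqueue the prompt, block on the result.
    reply_box: queue.Queue = queue.Queue()
    jobs.put((prompt, reply_box))
    return reply_box.get(timeout=5)

print(chat("hello"))  # -> waifu response to: hello
```

Decoupling the MCP-facing handler from the generation worker this way is what lets a single GPU serve requests one at a time while callers simply wait on the queue.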
Environment variables (the listing does not preserve the variable names):

- Google Gemini API key: fallback if the `~/.api-gemini` file is not present.
- Flask-related setting: optional; not core to the MCP server/worker operation.
- Redis server URL (default: `redis://localhost:6379`).
- Maximum number of tokens for the Gemini response (default: `2048`).
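A startup script might wire these settings together as shown below. The variable names (`GEMINI_API_KEY`, `REDIS_URL`, `MAX_NEW_TOKENS`) are hypothetical, since the listing does not preserve them; only the defaults and the `~/.api-gemini` fallback behavior come from the listing.

```shell
# Hypothetical variable names; defaults taken from the listing above.
# API key: environment variable wins, otherwise fall back to ~/.api-gemini.
export GEMINI_API_KEY="${GEMINI_API_KEY:-$(cat ~/.api-gemini 2>/dev/null)}"
export REDIS_URL="${REDIS_URL:-redis://localhost:6379}"
export MAX_NEW_TOKENS="${MAX_NEW_TOKENS:-2048}"
echo "$REDIS_URL $MAX_NEW_TOKENS"
```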