Manages user-specific context for LLM interactions using in-memory storage and RESTful APIs, supporting the Model Context Protocol with efficient prompt history handling and TypeScript integration.
A server that manages context for LLM interactions, storing and providing relevant context for each user.
Install dependencies:
npm install
Start the development server:
npm run dev
Add a new prompt to a user's context and get the updated context.
Request body:
{ "prompt": "Your prompt here" }
Response:
{ "context": "Combined context from last 5 prompts" }
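The sliding-window behaviour described above (keeping only the last 5 prompts per user in memory and joining them into one context string) can be sketched as a small TypeScript class. `ContextStore` and its method names are illustrative, not the server's actual API:

```typescript
// Minimal in-memory per-user context store (a sketch, assuming the server
// combines the last 5 prompts; class and method names are hypothetical).
class ContextStore {
  private prompts = new Map<string, string[]>();
  private readonly maxPrompts = 5;

  // Add a prompt and return the updated combined context.
  addPrompt(userId: string, prompt: string): string {
    const history = this.prompts.get(userId) ?? [];
    history.push(prompt);
    // Keep only the most recent prompts.
    this.prompts.set(userId, history.slice(-this.maxPrompts));
    return this.getContext(userId);
  }

  // Combined context from the stored prompts.
  getContext(userId: string): string {
    return (this.prompts.get(userId) ?? []).join("\n");
  }

  clear(userId: string): void {
    this.prompts.delete(userId);
  }
}
```

Because storage is in-memory, all context is lost when the process restarts.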
Get current context for a user.
Response:
{ "context": "Current context" }
Clear context for a user.
Response:
{ "message": "Context cleared" }
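Taken together, the three endpoints could be served roughly as follows with Node's built-in `http` module. The `/context/:userId` route shape, the port, and the use of `http` rather than a framework are all assumptions for illustration; the README does not show the server's actual paths:

```typescript
// Hypothetical sketch of the three endpoints; route shape and port are
// assumptions, not taken from the actual server.
import * as http from "http";

const store = new Map<string, string[]>();

const server = http.createServer((req, res) => {
  // Assumed route shape: /context/<userId>
  const match = req.url?.match(/^\/context\/([^/]+)$/);
  if (!match) {
    res.writeHead(404).end();
    return;
  }
  const userId = decodeURIComponent(match[1]);

  if (req.method === "POST") {
    // Add a prompt and respond with the updated combined context.
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      const { prompt } = JSON.parse(body);
      const history = store.get(userId) ?? [];
      history.push(prompt);
      store.set(userId, history.slice(-5)); // keep the last 5 prompts
      res.setHeader("Content-Type", "application/json");
      res.end(JSON.stringify({ context: history.slice(-5).join("\n") }));
    });
  } else if (req.method === "GET") {
    // Current combined context for the user.
    res.setHeader("Content-Type", "application/json");
    res.end(JSON.stringify({ context: (store.get(userId) ?? []).join("\n") }));
  } else if (req.method === "DELETE") {
    // Clear the user's context.
    store.delete(userId);
    res.setHeader("Content-Type", "application/json");
    res.end(JSON.stringify({ message: "Context cleared" }));
  } else {
    res.writeHead(405).end();
  }
});

server.listen(3000);
```

A client could then add a prompt with `fetch("http://localhost:3000/context/alice", { method: "POST", body: JSON.stringify({ prompt: "Your prompt here" }) })`.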
npm run dev: Start the development server with hot reload
npm run build: Build TypeScript files
npm start: Run the built files