Gemini Context MCP Server

Public repository: ogoldberg/gemini-context-mcp-server

Model Context Protocol server offering a context window of up to 2 million tokens, session-based conversations, semantic context search, and caching of large prompts to improve performance and reduce cost.

TypeScript • 0 tools • Added May 30, 2025 • Updated Jun 4, 2025

Supercharge Your AI with Gemini Context MCP Server

Unlock the full potential of Gemini Context MCP Server through LangDB's AI Gateway. Get enterprise-grade security, analytics, and seamless integration with zero configuration.

  • Unified API Access
  • Complete Tracing
  • Instant Setup

Free tier available • No credit card required • 99.9% uptime • 10,000+ monthly requests

Configuration Requirements

Configure the API key and environment variables required to access this MCP server.

Environment Variables

Variable                  Type    Required  Default           Description
GEMINI_API_KEY            string  Optional  (none)            Your Gemini API key
GEMINI_MODEL              string  Optional  gemini-2.0-flash  The Gemini model to use
GEMINI_TEMPERATURE        string  Optional  0.7               Temperature setting for the Gemini model
GEMINI_TOP_P              string  Optional  0.9               Top-P setting for the Gemini model
GEMINI_TOP_K              string  Optional  40                Top-K setting for the Gemini model
GEMINI_MAX_OUTPUT_TOKENS  string  Optional  2097152           Maximum number of output tokens
MAX_TOKENS_PER_SESSION    string  Optional  2097152           Maximum tokens per session
MAX_SESSIONS              string  Optional  50                Maximum number of sessions
MAX_MESSAGE_LENGTH        string  Optional  1000000           Maximum message length
SESSION_TIMEOUT_MINUTES   string  Optional  120               Session timeout in minutes
DEBUG                     string  Optional  false             Enable debug mode
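
For a local, self-hosted run of the server, these variables are typically supplied through the process environment. The following .env-style sketch simply mirrors the documented defaults with a placeholder API key; it is an illustration, not an official sample from the repository.

  # Illustrative .env sketch; values are the documented defaults.
  # The API key below is a placeholder and must be replaced with a real Gemini key.
  GEMINI_API_KEY=your-gemini-api-key-here
  GEMINI_MODEL=gemini-2.0-flash
  GEMINI_TEMPERATURE=0.7
  GEMINI_TOP_P=0.9
  GEMINI_TOP_K=40
  GEMINI_MAX_OUTPUT_TOKENS=2097152
  MAX_TOKENS_PER_SESSION=2097152
  MAX_SESSIONS=50
  MAX_MESSAGE_LENGTH=1000000
  SESSION_TIMEOUT_MINUTES=120
  DEBUG=false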

Security Notice

Your environment variables and credentials are securely stored and encrypted. LangDB never shares these configuration values with third parties.
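
Outside of the hosted gateway, most MCP clients pass these values to a self-hosted server through an entry in their mcpServers configuration. The snippet below is a hedged sketch of that pattern; the server name, command, and file path are hypothetical placeholders for a local build of ogoldberg/gemini-context-mcp-server, and only a subset of the variables is shown.

  {
    "mcpServers": {
      "gemini-context": {
        "command": "node",
        "args": ["/path/to/gemini-context-mcp-server/build/index.js"],
        "env": {
          "GEMINI_API_KEY": "your-gemini-api-key-here",
          "GEMINI_MODEL": "gemini-2.0-flash",
          "MAX_TOKENS_PER_SESSION": "2097152",
          "SESSION_TIMEOUT_MINUTES": "120"
        }
      }
    }
  }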

Publicly Shared Threads (0)

Shared threads will appear here, showcasing real-world applications and insights from the community.

Related MCPs (5)
  • Ragie Model Context Protocol Server
    Enables AI models to retrieve relevant information from a Ragie knowledge base using the Model Conte...
    1 tool • Added May 30, 2025
  • TxtAi Memory Vector Server
    Model Context Protocol server offering advanced semantic search, persistent memory management, tag-b...
    Added May 30, 2025
  • Remote MCP Server
    Remote Model Context Protocol server with Cloudflare Workers and Xano integration offering tool mana...
    Added May 30, 2025
  • MCP Server for Intercom
    Enables AI assistants to access, search, and filter Intercom customer support conversations and tick...
    4 tools • Added May 30, 2025
  • Powertools MCP Search Server
    Model Context Protocol server enabling efficient local search of AWS Lambda Powertools documentation...
    2 tools • Added May 30, 2025