# 🧠 LLM Tool-Calling Assistant with MCP Integration
Connect your local LLM to real-world tools, knowledge bases, and APIs via MCP.
This project connects a local LLM (e.g., Qwen) to tools such as a calculator or a knowledge base via the Model Context Protocol (MCP). The assistant automatically detects and calls these tools to help answer user queries.
Key features:

- Q&A knowledge base retrieval backed by `data.json`
- Supports both `stdio` and `sse` transports

| File | Description |
|---|---|
| `server.py` | Registers tools and starts the MCP server |
| `client-http.py` | Uses `aiohttp` to communicate with the local LLM |
| `client-openai.py` | Uses an OpenAI-compatible SDK for the LLM + tool-call logic |
| `client-stdio.py` | MCP client using the `stdio` transport |
| `client-sse.py` | MCP client using the `sse` transport |
| `data.json` | Q&A knowledge base |
Requires Python 3.8+.

Install dependencies:

```bash
pip install -r requirements.txt
```
`requirements.txt`:

```text
aiohttp==3.11.18
nest_asyncio==1.6.0
python-dotenv==1.1.0
openai==1.77.0
mcp==1.6.0
```
Start the server:

```bash
python server.py
```

This launches your tool server with functions like `add`, `multiply`, and `get_knowledge_base`.
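For orientation, a minimal `server.py` along these lines might look like the sketch below, using the `FastMCP` helper from the `mcp` package. The tool bodies and the knowledge-base formatting are assumptions based on the file descriptions above, not the project's exact code.

```python
import json
from mcp.server.fastmcp import FastMCP

# The server name is arbitrary; it is reported to MCP clients on initialize.
mcp = FastMCP("tool-server")

@mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

@mcp.tool()
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

@mcp.tool()
def get_knowledge_base() -> str:
    """Return the Q&A knowledge base as plain text."""
    with open("data.json") as f:
        entries = json.load(f)
    return "\n\n".join(f"Q: {e['question']}\nA: {e['answer']}" for e in entries)

if __name__ == "__main__":
    # Switch to transport="sse" when using client-sse.py (see below).
    mcp.run(transport="stdio")
```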
Run the HTTP client:

```bash
python client-http.py
```
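This client talks to the LLM's OpenAI-compatible endpoint directly over `aiohttp`. A minimal sketch of that request path follows; the URL, token, and model name are placeholders you set in the configuration step below.

```python
import asyncio
import aiohttp

LOCAL_LLM_URL = "http://localhost:8000/v1"  # placeholder; see configuration below
TOKEN = "your-api-token"                    # placeholder
LOCAL_LLM_MODEL = "your-model"              # placeholder

async def ask(prompt: str) -> str:
    # Standard OpenAI-style chat completion request over plain HTTP.
    payload = {
        "model": LOCAL_LLM_MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {"Authorization": f"Bearer {TOKEN}"}
    async with aiohttp.ClientSession() as http:
        async with http.post(f"{LOCAL_LLM_URL}/chat/completions",
                             json=payload, headers=headers) as resp:
            data = await resp.json()
            return data["choices"][0]["message"]["content"]

print(asyncio.run(ask("What is 8 times 3?")))
```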
Or the OpenAI SDK client:

```bash
python client-openai.py
```
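This client uses the OpenAI SDK's tool-calling interface. A sketch of the detect-and-call loop is below; the tool schema and the inline `add` evaluation are illustrative stand-ins (the real client dispatches the call to the MCP server).

```python
import json
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1",  # placeholders; see below
                api_key="your-api-token")

# Illustrative schema for the server's `add` tool.
tools = [{
    "type": "function",
    "function": {
        "name": "add",
        "description": "Add two numbers.",
        "parameters": {
            "type": "object",
            "properties": {"a": {"type": "number"}, "b": {"type": "number"}},
            "required": ["a", "b"],
        },
    },
}]

messages = [{"role": "user", "content": "What is 3 plus 5?"}]
response = client.chat.completions.create(model="your-model",
                                          messages=messages, tools=tools)
message = response.choices[0].message

if message.tool_calls:
    # The model requested a tool: execute it, then return the result
    # so the model can produce a final natural-language answer.
    call = message.tool_calls[0]
    args = json.loads(call.function.arguments)
    result = args["a"] + args["b"]  # stand-in for an MCP call_tool("add", args)
    messages.append(message)
    messages.append({"role": "tool", "tool_call_id": call.id,
                     "content": str(result)})
    final = client.chat.completions.create(model="your-model", messages=messages)
    print(final.choices[0].message.content)
```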
Or the stdio client:

```bash
python client-stdio.py
```
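The stdio client spawns the server as a subprocess and speaks MCP over stdin/stdout. With the `mcp` SDK, the core connection logic looks roughly like this sketch:

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch server.py as a subprocess and connect over stdio.
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])
            result = await session.call_tool("add", {"a": 8, "b": 3})
            print(result.content)

asyncio.run(main())
```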
To use SSE, make sure `server.py` sets:

```python
transport = "sse"
```

Then run:

```bash
python client-sse.py
```
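The SSE client connects to the already-running server over HTTP. A minimal sketch with the `mcp` SDK's SSE transport; the URL assumes FastMCP's default port and `/sse` endpoint, so adjust it if your server differs.

```python
import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Assumes server.py was started with transport="sse" on its default port.
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool("multiply", {"a": 8, "b": 3})
            print(result.content)

asyncio.run(main())
```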
Example query:

```
What is 8 times 3?
```

Response:

```
Eight times three is 24.
```
Knowledge-base query:

```
What are the healthcare benefits available to employees in Singapore?
```

The response will include the relevant answer from `data.json`.
Example `data.json`:

```json
[
  {
    "question": "What is Singapore's public holiday schedule?",
    "answer": "Singapore observes several public holidays..."
  },
  {
    "question": "How do I apply for permanent residency in Singapore?",
    "answer": "Submit an online application via the ICA website..."
  }
]
```
Inside `client-http.py` or `client-openai.py`, update the following:

```python
LOCAL_LLM_URL = "..."
TOKEN = "your-api-token"
LOCAL_LLM_MODEL = "your-model"
```
Make sure your LLM is serving OpenAI-compatible API endpoints.
Clients handle tool calls and responses automatically. You can stop the server or client with `Ctrl+C`.
MIT License. See LICENSE file.