This project demonstrates how to integrate the Model Context Protocol (MCP) with OpenAI's API, enabling OpenAI models to access and use tools exposed by an MCP server running in Docker.
The project contains the following files:

- server.py: The MCP server implementation with a tool
- client.py: A client that connects to the server and calls the agent
- Dockerfile: Instructions for building the Docker image
- requirements.txt: Python dependencies for the project

To build and run the server:

docker build -t mcp-server .

docker run -p 8050:8050 mcp-server
This will start the MCP server inside a Docker container and expose it on port 8050.
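For orientation, here is a minimal sketch of what server.py might look like. It assumes the FastMCP helper from the official mcp Python SDK and uses a placeholder `add` tool; the actual tool in this project may differ.

```python
# server.py (illustrative sketch, not necessarily the project's exact code)
from mcp.server.fastmcp import FastMCP

# Bind to 0.0.0.0 so the server is reachable from outside the container.
mcp = FastMCP("DemoServer", host="0.0.0.0", port=8050)


@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b


if __name__ == "__main__":
    # Expose the server over SSE on port 8050.
    mcp.run(transport="sse")
```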
Once the server is running, you can run the client in a separate terminal:
python client.py
The client will connect to the server, list available tools, and call the agent to answer the query.
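As a rough illustration of that flow, the sketch below connects to the server over SSE with the mcp SDK, converts the listed tools into OpenAI's function-calling format, and executes any tool the model requests. The model name, query, and overall structure are assumptions; the real client.py may be organized differently.

```python
# client.py (illustrative sketch; the real client may differ)
import asyncio
import json

from mcp import ClientSession
from mcp.client.sse import sse_client
from openai import AsyncOpenAI


async def main() -> None:
    # Connect to the dockerized MCP server over SSE.
    async with sse_client("http://localhost:8050/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List the tools the server exposes and convert them to
            # OpenAI's function-calling format.
            tools = await session.list_tools()
            openai_tools = [
                {
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description or "",
                        "parameters": t.inputSchema,
                    },
                }
                for t in tools.tools
            ]

            # Ask the model; it may decide to call one of the MCP tools.
            client = AsyncOpenAI()
            response = await client.chat.completions.create(
                model="gpt-4o",  # placeholder model name
                messages=[{"role": "user", "content": "What is 2 + 3?"}],
                tools=openai_tools,
            )

            message = response.choices[0].message
            if message.tool_calls:
                call = message.tool_calls[0]
                # Execute the requested tool on the MCP server.
                result = await session.call_tool(
                    call.function.name, json.loads(call.function.arguments)
                )
                print(result.content)
            else:
                print(message.content)


if __name__ == "__main__":
    asyncio.run(main())
```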
If you encounter connection issues:
- Check if the server is running: Make sure the Docker container is running with `docker ps`.
- Verify port mapping: Ensure the port is correctly mapped with `docker ps` or by checking the output of the `docker run` command.
- Check server logs: View the server logs with `docker logs` to see if there are any errors.
- Host binding: The server is configured to bind to `0.0.0.0` instead of `127.0.0.1` so that it is accessible from outside the container. If you're still having issues, you might need to check your firewall settings.
- Network issues: If you're running Docker on a remote machine, make sure the port is accessible from your client machine.
The client connects to the server's SSE endpoint at http://localhost:8050/sse.
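If you want to confirm from Python that the port is reachable before digging deeper, a quick socket check (not part of the project files) can help:

```python
# Quick connectivity check for the mapped port (illustrative, assumed helper).
import socket

try:
    # Try to open a TCP connection to the port the container exposes.
    with socket.create_connection(("localhost", 8050), timeout=3):
        print("Port 8050 is reachable; the MCP server appears to be up.")
except OSError as exc:
    print(f"Could not reach port 8050: {exc}")
```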