# Unity MCP with Ollama Integration

A Unity MCP (Model Context Protocol) package that enables seamless communication between the Unity Editor and local Large Language Models (LLMs) via Ollama. This package extends justinpbarnett/unity-mcp to work with local LLMs, allowing developers to automate workflows, manipulate assets, and control the Unity Editor programmatically without relying on cloud-based LLMs.
The Unity MCP with Ollama integration provides a bidirectional communication channel between the Unity Editor (via a C# bridge) and a local Python MCP server that talks to Ollama. This enables you to automate workflows, manipulate assets, and control the Unity Editor programmatically, all powered by your own local LLMs, with no need for an internet connection or API keys.
This implementation is specifically configured to work with the following Ollama models:

- `deepseek-r1:14b`
- `gemma3:12b`

You can easily switch between these models in the Unity MCP window.
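Under the hood, switching models ultimately just changes the `model` field of the request sent to Ollama's REST API (`POST /api/generate`). A minimal sketch of that interaction, assuming Ollama's default address; the function names are illustrative, not this package's actual code:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's POST /api/generate endpoint."""
    return {
        "model": model,      # e.g. "deepseek-r1:14b" or "gemma3:12b"
        "prompt": prompt,
        "stream": False,     # request a single JSON reply instead of a stream
    }


def generate(model: str, prompt: str) -> str:
    """Send a prompt to a local Ollama instance and return the response text."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    # Local 12B–14B models can take a while on CPU, hence the generous timeout.
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.loads(resp.read())["response"]
```

For example, `generate("gemma3:12b", "Name three Unity primitives")` would return the model's plain-text reply once Ollama is running and the model is pulled.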
Due to Unity's package manager compatibility issues, we recommend using the Asset Method for installation.
```
ollama pull deepseek-r1:14b
ollama pull gemma3:12b
```
Download or clone this repository:

```
git clone https://github.com/ZundamonnoVRChatkaisetu/unity-mcp-ollama.git
```
Create a folder in your Unity project's Assets directory:

```
Assets/UnityMCPOllama
```
Copy the `Editor` folder from the cloned repository to your Unity project:

```
# Copy the entire Editor folder
[Repository]/Editor → Assets/UnityMCPOllama/Editor
```
Verify the folder structure is correct:

```
Assets/
  UnityMCPOllama/
    Editor/
      MCPEditorWindow.cs
      UnityMCPBridge.cs
```
Let Unity import and compile the scripts.
Create a folder for the Python environment (outside your Unity project):

```
mkdir PythonMCP
cd PythonMCP
```
Copy the `Python` folder from the cloned repository:

```
cp -r [Repository]/Python .
```
Create and activate a virtual environment:

```
# Create a virtual environment
python -m venv venv

# Activate the virtual environment
# On Windows:
venv\Scripts\activate
# On macOS/Linux:
source venv/bin/activate
```
Install dependencies:

```
cd Python
pip install -e .
```
Pull the required models (if you have not already):

```
ollama pull deepseek-r1:14b
ollama pull gemma3:12b
```
Start the Ollama server:

```
ollama serve
```
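Before moving on to the Unity side, you can confirm Ollama is actually reachable. A small sketch using Ollama's documented `/api/tags` endpoint (which lists locally pulled models); the helper name is mine, not part of this package:

```python
import json
import urllib.error
import urllib.request


def ollama_reachable(base_url: str = "http://localhost:11434",
                     timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers on base_url."""
    try:
        # /api/tags lists pulled models; any valid JSON reply means Ollama is up
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            json.loads(resp.read())
            return True
    except (urllib.error.URLError, ValueError, OSError):
        return False
```

For example, `ollama_reachable()` returning `False` usually means `ollama serve` is not running or is listening on a non-default address.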
In Unity, select `Window > Unity MCP` to open the MCP window. Then start the Python server:

```
cd PythonMCP

# On Windows:
venv\Scripts\activate
# On macOS/Linux:
source venv/bin/activate

cd Python
python server.py
```
Select your preferred model (`deepseek-r1:14b` or `gemma3:12b`) in the Unity MCP window.
Example prompts:
The Unity MCP window provides status information for each component:

- **Python Server Status**: Indicates whether the Python server is running
- **Unity Bridge Status**: Shows if the Unity socket server is running
- **Ollama Status**: Shows the connection status to Ollama
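If an indicator shows a disconnected state, a quick TCP probe can tell you which component is down before you dig into logs. A hedged sketch: port 11434 is Ollama's documented default, but the Python server and Unity bridge ports below are placeholders, so substitute the values from your actual configuration:

```python
import socket


def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if something is listening on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# 11434 is Ollama's default; the other two ports are illustrative
# placeholders and depend on your Unity MCP configuration.
CHECKS = {
    "Ollama": 11434,
    "Python MCP server": 6500,  # placeholder
    "Unity bridge": 6400,       # placeholder
}


def diagnose(host: str = "127.0.0.1") -> dict:
    """Probe each component's port and report which are reachable."""
    return {name: port_open(host, port) for name, port in CHECKS.items()}
```

Running `diagnose()` returns a dict of component names to booleans, which maps directly onto the three status indicators above.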
**"Not Connected" status for the Python server**: make sure the Python server is running (`python server.py`).

**Cannot find the Unity MCP menu**: verify that the Editor scripts were copied to `Assets/UnityMCPOllama/Editor` and compiled without errors.

**Ollama connection issues**: make sure Ollama is running (`ollama serve`) and the models have been pulled.

**MCP command execution fails**: confirm that all three components (Python server, Unity bridge, and Ollama) report a connected status in the Unity MCP window.
If you encounter issues setting up the Python environment:

```
mkdir C:\PythonMCP
cd C:\PythonMCP
git clone https://github.com/ZundamonnoVRChatkaisetu/unity-mcp-ollama.git
copy unity-mcp-ollama\Python .
python -m venv venv
venv\Scripts\activate
cd Python
pip install -e .
python server.py
```
Local LLM performance depends on your hardware: larger models such as `deepseek-r1:14b` need more RAM and VRAM and respond more slowly on modest machines, so choose the model that fits your system.
Contributions are welcome! Please feel free to submit a Pull Request or open an Issue.
This project is licensed under the MIT License.