A JSON-RPC 2.0 compliant server that enables interaction with HDF5 data files and Slurm job scheduling through standardized API endpoints.
Name: Jafar Alzoubi
Student ID: A20501723
pip install uv
uv venv
source .venv/bin/activate
uv sync
uvicorn src.server:app --reload
pytest tests/
Mock HDF5 data used in the examples below is located in mock_data/hdf5/.
curl -X POST "http://127.0.0.1:8000/mcp" \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "method": "mcp/callTool",
    "params": {
      "tool": "hdf5",
      "action": "read",
      "filePath": "mock_data/hdf5/simulation_1.h5",
      "dataset": "temperature"
    },
    "id": 1
  }'
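The same request can be sent from Python. The snippet below is a minimal client sketch; it assumes the server is running locally on port 8000 and that the requests package is installed (it is not listed as a project dependency).

# Minimal Python client sketch for the HDF5 read call above (illustrative only).
import requests

payload = {
    "jsonrpc": "2.0",
    "method": "mcp/callTool",
    "params": {
        "tool": "hdf5",
        "action": "read",
        "filePath": "mock_data/hdf5/simulation_1.h5",
        "dataset": "temperature",
    },
    "id": 1,
}

# POST the JSON-RPC request to the /mcp endpoint and print the response body.
response = requests.post("http://127.0.0.1:8000/mcp", json=payload)
print(response.json())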
curl -X POST "http://127.0.0.1:8000/mcp" \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "method": "mcp/callTool",
    "params": {
      "tool": "slurm",
      "action": "submit",
      "script": "analysis.sh",
      "cores": 8
    },
    "id": 2
  }'
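A successful call returns a standard JSON-RPC 2.0 response envelope containing jsonrpc, result, and the echoed id. The result fields shown below are illustrative only; the actual payload depends on the handler.

{
  "jsonrpc": "2.0",
  "result": {
    "jobId": "<uuid>",
    "status": "queued"
  },
  "id": 2
}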
pytest tests/
pytest tests/test_hdf5.py -v
pytest tests/test_slurm.py -v
pytest --cov=src
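As an illustration of how such a test can look, here is a minimal sketch using FastAPI's TestClient; the test name and asserted fields are assumptions, and it presumes the app object is importable from src.server (as in the uvicorn command above), not the project's actual test code.

# Hypothetical pytest sketch verifying the JSON-RPC 2.0 response envelope.
from fastapi.testclient import TestClient

from src.server import app

client = TestClient(app)

def test_jsonrpc_envelope():
    # Every response should carry the JSON-RPC version and echo the request id.
    payload = {
        "jsonrpc": "2.0",
        "method": "mcp/callTool",
        "params": {"tool": "hdf5", "action": "list"},
        "id": 1,
    }
    response = client.post("/mcp", json=payload)
    assert response.status_code == 200
    body = response.json()
    assert body["jsonrpc"] == "2.0"
    assert body["id"] == 1
    assert "result" in body or "error" in body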
project-root/
├── mock_data/
│   ├── hdf5/
│   │   ├── simulation_1.h5
│   │   └── simulation_2.h5
│   └── slurm/
│       ├── job_scripts/
│       └── job_status.json
HDF5 Handler: uses the h5py library for file operations (a minimal sketch follows the action list below)
Mock data path: ./mock_data/hdf5/
Supported actions:
list: Recursive directory listing
read: Dataset retrieval with shape/dtype info
metadata: File-level metadata
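The sketch below illustrates how these actions can be implemented with h5py; the function names and return shapes are assumptions, not the project's actual code.

# Illustrative HDF5 handler sketch (names and structure are assumptions).
import os

import h5py

MOCK_ROOT = "./mock_data/hdf5/"

def list_files(root: str = MOCK_ROOT) -> list:
    # Recursively collect .h5 files under the mock data directory.
    paths = []
    for dirpath, _dirs, files in os.walk(root):
        paths.extend(os.path.join(dirpath, name) for name in files if name.endswith(".h5"))
    return paths

def read_dataset(file_path: str, dataset: str) -> dict:
    # Return the dataset contents along with shape and dtype information.
    with h5py.File(file_path, "r") as f:
        ds = f[dataset]
        return {
            "data": ds[()].tolist(),
            "shape": list(ds.shape),
            "dtype": str(ds.dtype),
        }

def file_metadata(file_path: str) -> dict:
    # Return file-level attributes stored on the HDF5 root group.
    with h5py.File(file_path, "r") as f:
        return {key: str(value) for key, value in f.attrs.items()}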
Slurm Handler: simulates job submission with subprocess (a minimal sketch follows the feature list below)
Mock features:
Generates UUID-based job IDs
Tracks job status in memory
Simulates queued, running, and completed job states
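A minimal sketch of such a mock handler is shown below; it focuses on the in-memory mock features and omits the subprocess-based submission, and the names and state transitions are illustrative assumptions.

# Illustrative mock Slurm handler sketch (names are assumptions).
import uuid

# In-memory job table mapping job_id -> current status.
_jobs = {}

def submit_job(script: str, cores: int) -> dict:
    # Generate a UUID-based job ID and register the job as queued.
    job_id = str(uuid.uuid4())
    _jobs[job_id] = "queued"
    return {"jobId": job_id, "status": "queued", "script": script, "cores": cores}

def advance_job(job_id: str) -> str:
    # Step a job through queued -> running -> completed.
    transitions = {"queued": "running", "running": "completed", "completed": "completed"}
    _jobs[job_id] = transitions[_jobs[job_id]]
    return _jobs[job_id]

def job_status(job_id: str) -> str:
    # Look up the current status of a tracked job.
    return _jobs[job_id]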
Common Issues:
Port 8000 already in use: find the process with lsof -i :8000, then stop it with kill -9 <PID>.
Dependency problems: force a reinstall with uv pip install --force-reinstall -r requirements.txt.
✅ Two capabilities implemented (HDF5 + Slurm)
✅ Full JSON-RPC 2.0 compliance
✅ 100% test coverage for both capabilities
✅ Proper error handling and responses
✅ Async request processing (a minimal dispatch sketch follows the test summary)
Ran 13 tests in 0.42s
OK
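The sketch below shows one way an async FastAPI endpoint can dispatch mcp/callTool requests and return JSON-RPC 2.0 error objects; the route body and dispatcher are assumptions, not the project's actual code.

# Illustrative async JSON-RPC dispatch sketch (names are assumptions).
from fastapi import FastAPI, Request

app = FastAPI()

async def dispatch(method: str, params: dict):
    # Route mcp/callTool requests to the appropriate tool handler (stubbed here).
    if method != "mcp/callTool":
        raise KeyError(method)
    tool = params.get("tool")
    if tool in ("hdf5", "slurm"):
        return {"tool": tool, "action": params.get("action")}
    raise ValueError("Unknown tool: %s" % tool)

@app.post("/mcp")
async def mcp_endpoint(request: Request):
    body = await request.json()
    req_id = body.get("id")
    if body.get("jsonrpc") != "2.0" or "method" not in body:
        # JSON-RPC 2.0 "Invalid Request" error object.
        return {"jsonrpc": "2.0", "id": req_id,
                "error": {"code": -32600, "message": "Invalid Request"}}
    try:
        result = await dispatch(body["method"], body.get("params", {}))
        return {"jsonrpc": "2.0", "id": req_id, "result": result}
    except KeyError:
        return {"jsonrpc": "2.0", "id": req_id,
                "error": {"code": -32601, "message": "Method not found"}}
    except Exception as exc:
        return {"jsonrpc": "2.0", "id": req_id,
                "error": {"code": -32000, "message": str(exc)}}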