magistral-small-2506
Type: completions
Magistral Small is a 24B-parameter instruction-tuned model based on Mistral-Small-3.1 (2503), enhanced through supervised fine-tuning on traces from Magistral Medium and further refined via reinforcement learning. It is optimized for reasoning and offers broad multilingual support, covering more than 20 languages.
Input: $0.5 / 1M tokens
Output: $1.5 / 1M tokens
Context: 40K tokens
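To make the per-token pricing concrete, the snippet below is a minimal sketch that computes the cost of a single request at the listed rates; the token counts are made-up example values.

```python
# Cost calculation at the listed rates:
# $0.5 per 1M input tokens, $1.5 per 1M output tokens.
INPUT_PRICE_PER_M = 0.5
OUTPUT_PRICE_PER_M = 1.5

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request given its token counts."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# A hypothetical request with 10,000 prompt tokens and 2,000 completion tokens:
print(f"${request_cost(10_000, 2_000):.4f}")  # 0.005 + 0.003 = $0.0080
```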
Capabilities: tools
Input: text
Output: text
Access magistral-small-2506 through LangDB AI Gateway
Integrate with Mistral AI's magistral-small-2506 and 250+ other models through a unified API. Monitor usage, control costs, and enhance security.
Unified API
Cost Optimization
Enterprise Security
Code Example
Configuration
Base URL
API Keys
Headers: Project ID (in header), X-Run-Id, X-Thread-Id
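As a rough sketch of how these settings fit together: LangDB exposes an OpenAI-compatible endpoint, so any OpenAI-style client can be pointed at the gateway's base URL with your API key, project ID, and the optional tracing headers. The base URL, project-ID header name, and placeholder values below are assumptions drawn from this page's configuration list, not verified values; copy the exact ones from your LangDB project settings.

```python
from openai import OpenAI

# Point an OpenAI-compatible client at the LangDB gateway.
client = OpenAI(
    base_url="https://YOUR_LANGDB_BASE_URL/v1",   # Base URL from your dashboard (placeholder)
    api_key="YOUR_LANGDB_API_KEY",                # API key (placeholder)
    default_headers={
        "X-Project-Id": "YOUR_PROJECT_ID",        # "Project ID in header" (header name assumed)
        "X-Run-Id": "run-123",                    # optional run tracing header
        "X-Thread-Id": "thread-456",              # optional thread tracing header
    },
)

response = client.chat.completions.create(
    model="magistral-small-2506",
    messages=[{"role": "user", "content": "Summarise the benefits of a unified model API."}],
)
print(response.choices[0].message.content)
```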
Model Parameters (12 available)
frequency_penalty (-2 to 2, default 0)
include_reasoning
max_tokens
presence_penalty (-2 to 1.999, default 0)
response_format
seed
stop
structured_outputs
temperature (0 to 2, default 1)
tool_choice
tools
top_p (0 to 1, default 1)
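The sketch below shows how several of these parameters might be passed on a chat completion call, reusing the client configured above. Values are illustrative only, and the non-standard include_reasoning flag is forwarded via extra_body as an assumed pass-through; consult the LangDB docs for the exact mechanism.

```python
response = client.chat.completions.create(
    model="magistral-small-2506",
    messages=[{"role": "user", "content": "Explain why the sky is blue, step by step."}],
    temperature=0.7,          # 0 to 2, default 1
    top_p=0.95,               # 0 to 1, default 1
    max_tokens=1024,          # cap on generated tokens
    frequency_penalty=0.0,    # -2 to 2, default 0
    presence_penalty=0.0,     # -2 to 1.999, default 0
    seed=42,                  # best-effort reproducibility
    stop=["</answer>"],       # optional stop sequence(s)
    extra_body={"include_reasoning": True},  # gateway/model-specific flag (assumed pass-through)
)
print(response.choices[0].message.content)
```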
Additional Configuration
Tools
Guards
User metadata: Id, Name, Tags
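Since the model supports tool calling, a minimal OpenAI-style tool definition can be attached to a request, as sketched below. The function name and schema are hypothetical examples; guards and user tags are assumed to be configured in the LangDB dashboard rather than in this payload.

```python
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical example tool
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="magistral-small-2506",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
    tool_choice="auto",   # let the model decide whether to call the tool
)
print(response.choices[0].message.tool_calls)
```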