mixtral-8x22b-instruct by fireworksai - AI Model Details, Pricing, and Performance Metrics

Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish

See benchmarks in the launch announcement [here](https://mistral.ai/news/mixtral-8x22b/). #moe

| Provider | Input | Output |
|---|---|---|
| fireworksai | $1.2 / 1M tokens | $1.2 / 1M tokens |
| mistralai | $2 / 1M tokens | $6 / 1M tokens |
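Because input and output tokens are billed at different rates, per-request cost depends on the mix of the two. A minimal sketch of the arithmetic, using the mistralai prices listed above (the token counts in the example are hypothetical):

```python
# Per-1M-token prices from the pricing table above (mistralai provider).
PRICE_PER_M_INPUT = 2.00   # USD per 1M input tokens
PRICE_PER_M_OUTPUT = 6.00  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost for one request at the listed rates."""
    return (input_tokens * PRICE_PER_M_INPUT
            + output_tokens * PRICE_PER_M_OUTPUT) / 1_000_000

# e.g. a 10k-token prompt with a 1k-token reply costs about $0.026:
print(round(estimate_cost(10_000, 1_000), 4))
```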
Released: Apr 17, 2024
Knowledge cutoff: Oct 20, 2023
Context: 65,536 tokens
Input: $1.2 / 1M tokens
Output: $1.2 / 1M tokens
Capabilities: tools
Accepts: text
Returns: text
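Since the model supports tool use, requests can include function definitions in the OpenAI-compatible `tools` format that providers serving this model commonly accept. A hedged sketch of such a request body; the `get_weather` function, its schema, and the model id are illustrative assumptions, not values from this page:

```python
# Hypothetical tool definition in the OpenAI-compatible "tools" schema.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# Request body passing the tool alongside the conversation; the model id
# shown is the assumed Fireworks-style identifier for this model.
request_body = {
    "model": "accounts/fireworks/models/mixtral-8x22b-instruct",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [weather_tool],
}
```

If the model decides to call the tool, the response will contain a `tool_calls` entry rather than plain text, which your code then executes and feeds back as a `tool` role message.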

Access mixtral-8x22b-instruct through LangDB AI Gateway


Integrate with mistralai's mixtral-8x22b-instruct and 250+ other models through a unified API. Monitor usage, control costs, and enhance security.

Available from 2 providers
Code Examples

Integration samples and API usage
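A minimal standard-library sketch of calling the model through an OpenAI-compatible chat-completions endpoint. The base URL, model id, and header names are assumptions based on common provider conventions; check your provider's or gateway's documentation for the exact values:

```python
import json
import os
import urllib.request

# Assumed OpenAI-compatible endpoint; substitute your provider/gateway URL.
BASE_URL = "https://api.fireworks.ai/inference/v1"

def build_request(prompt: str) -> urllib.request.Request:
    """Build (but do not send) a chat-completions HTTP request."""
    body = json.dumps({
        # Assumed Fireworks-style model identifier for this model.
        "model": "accounts/fireworks/models/mixtral-8x22b-instruct",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('FIREWORKS_API_KEY', '')}",
        },
    )

# Only send the request when an API key is actually configured.
if __name__ == "__main__" and os.environ.get("FIREWORKS_API_KEY"):
    with urllib.request.urlopen(build_request("Say hello in French.")) as resp:
        reply = json.loads(resp.read())
        print(reply["choices"][0]["message"]["content"])
```

Separating request construction from sending makes the payload easy to inspect or log before any network call is made.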