mixtral-8x22b-instruct by fireworksai - AI Model Details, Pricing, and Performance Metrics
mixtral-8x22b-instruct
Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B, offering strong cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish

See benchmarks in the launch announcement [here](https://mistral.ai/news/mixtral-8x22b/). #moe
| Provider | Input | Output |
|---|---|---|
| | $1.20 / 1M tokens | $1.20 / 1M tokens |
| | $2 / 1M tokens | $6 / 1M tokens |
| Provider | Context | Input Price | Output Price | Input Formats | Output Formats | License |
|---|---|---|---|---|---|---|
| | 65,536 tokens | $1.20 / 1M tokens | $1.20 / 1M tokens | text | text | |
| | 65,536 tokens | $2 / 1M tokens | $6 / 1M tokens | text | text | Apache-2.0 |
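The per-million-token rates above translate directly into request cost. A minimal sketch of that arithmetic, using the two rate pairs listed in the table (the helper function and token counts are illustrative, not part of this page):

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_rate: float, output_rate: float) -> float:
    """Cost in USD given token counts and $-per-1M-token rates."""
    return (input_tokens / 1_000_000) * input_rate \
         + (output_tokens / 1_000_000) * output_rate

# 100k input + 50k output tokens at the $2 / $6 rate pair:
cost = request_cost(100_000, 50_000, input_rate=2.0, output_rate=6.0)
print(f"${cost:.2f}")  # 0.2 + 0.3 = $0.50

# The same request at the $1.20 / $1.20 rate pair:
cheap = request_cost(100_000, 50_000, input_rate=1.20, output_rate=1.20)
print(f"${cheap:.2f}")
```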
Access mixtral-8x22b-instruct through LangDB AI Gateway
Integrate with mistralai's mixtral-8x22b-instruct and 250+ other models through a unified API. Monitor usage, control costs, and enhance security.
Code Examples
Integration samples and API usage
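As a sketch of what an integration looks like: gateways of this kind typically expose an OpenAI-compatible chat-completions endpoint. The URL, API key, and exact model identifier below are assumptions for illustration, not values taken from this page; check the LangDB documentation for the real ones.

```python
import json
import urllib.request

# Assumed gateway endpoint and placeholder credentials (not from this page).
GATEWAY_URL = "https://api.us-east-1.langdb.ai/v1/chat/completions"
API_KEY = "YOUR_LANGDB_API_KEY"

def build_request(prompt: str) -> dict:
    """Build an OpenAI-style chat-completions payload for this model.
    The model identifier format is an assumption and may differ."""
    return {
        "model": "fireworksai/mixtral-8x22b-instruct",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }

payload = build_request("Summarize mixture-of-experts routing in one sentence.")

# To send the request, POST the JSON body with a bearer token:
req = urllib.request.Request(
    GATEWAY_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# response = urllib.request.urlopen(req)  # uncomment with a real key
```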
Related Models
Similar models from fireworksai