mixtral-8x7b-instruct by deepinfra - AI Model Details, Pricing, and Performance Metrics
mixtral-8x7b-instruct
Mixtral 8x7B Instruct is a pretrained generative sparse Mixture-of-Experts model by Mistral AI, fine-tuned by Mistral for chat and instruction-following use. It incorporates 8 experts (feed-forward networks) for a total of 47 billion parameters. #moe
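To illustrate the sparse Mixture-of-Experts idea described above, here is a toy routing sketch. It is not Mixtral's actual implementation: the gating function and "experts" are stand-ins, and the top-2 selection mirrors the general MoE pattern in which only a subset of the experts' parameters is active per token.

```python
import random

NUM_EXPERTS = 8  # Mixtral uses 8 expert feed-forward networks per layer
TOP_K = 2        # only a few experts are activated per token

def router_scores(token: str) -> list[float]:
    # Stand-in for a learned gating network: deterministic pseudo-scores.
    rng = random.Random(sum(ord(c) for c in token))
    return [rng.random() for _ in range(NUM_EXPERTS)]

def expert(i: int, x: float) -> float:
    # Stand-in for expert i's feed-forward network.
    return x * (i + 1)

def moe_layer(token: str, x: float) -> float:
    """Route the input to the top-k experts and combine their outputs."""
    scores = router_scores(token)
    top = sorted(range(NUM_EXPERTS), key=lambda i: scores[i], reverse=True)[:TOP_K]
    total = sum(scores[i] for i in top)
    # Weighted combination of only the selected experts' outputs;
    # the other experts' parameters are never touched for this token.
    return sum(scores[i] / total * expert(i, x) for i in top)

print(moe_layer("hello", 1.0))
```

Because only TOP_K of the NUM_EXPERTS expert networks run per token, the active parameter count is much smaller than the 47B total, which is the efficiency argument for the MoE design.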
Access mixtral-8x7b-instruct through LangDB AI Gateway
Integrate with mistralai's mixtral-8x7b-instruct and 250+ other models through a unified API. Monitor usage, control costs, and enhance security.
Benchmark Tests
| Benchmark | Score |
|---|---|
| AIME | 0.0 |
| AAII | 2.6 |
| GPQA | 29.2 |
| HLE | 4.5 |
| LiveCodeBench | 6.6 |
| MATH-500 | 29.9 |
| MMLU-Pro | 38.7 |
| SciCode | 2.8 |
Code Examples
Integration samples and API usage
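No sample is included on this page, so here is a minimal sketch of calling the model through an OpenAI-compatible gateway endpoint. The base URL, environment-variable names, and `/chat/completions` route are assumptions; check the LangDB dashboard for the actual endpoint and credentials.

```python
import json
import os
import urllib.request

# Assumed values; replace with the endpoint and key from your LangDB dashboard.
BASE_URL = os.environ.get("LANGDB_BASE_URL", "https://example.langdb.invalid/v1")
API_KEY = os.environ.get("LANGDB_API_KEY", "sk-...")
MODEL = "mixtral-8x7b-instruct"

def build_chat_request(prompt: str, temperature: float = 0.7) -> dict:
    """Build an OpenAI-style chat-completions payload for the gateway."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def send(payload: dict) -> dict:
    """POST the payload to the gateway's assumed /chat/completions route."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_chat_request("Summarize mixture-of-experts in one sentence.")
print(payload["model"])
# send(payload)  # uncomment once BASE_URL and API_KEY are set
```

The same payload shape works for the 250+ other models behind the gateway; only the `model` field changes.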