dolphin-mixtral-8x22b by openrouter - AI Model Details, Pricing, and Performance Metrics

cognitivecomputations / dolphin-mixtral-8x22b
completions • by openrouter

Dolphin 2.9 is designed for instruction following, conversation, and coding. It is a finetune of [Mixtral 8x22B Instruct](/models/mistralai/mixtral-8x22b-instruct), features a 64k context length, and was fine-tuned with a 16k sequence length using ChatML templates. This model is the successor to [Dolphin Mixtral 8x7B](/models/cognitivecomputations/dolphin-mixtral-8x7b). The model is uncensored, with alignment and bias stripped out, and therefore requires an external alignment layer for ethical use. Users are cautioned to use this highly compliant model responsibly, as detailed in the blog post on uncensored models at [erichartford.com/uncensored-models](https://erichartford.com/uncensored-models). #moe #uncensored
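Since the model was fine-tuned with ChatML templates, prompts sent to a raw completions endpoint (one that does not apply the chat template for you) should follow the ChatML format. A minimal sketch, with an illustrative helper name:

```python
def chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-formatted prompt for a single-turn exchange.

    The trailing "<|im_start|>assistant\n" leaves the prompt open for
    the model to generate the assistant turn.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

print(chatml_prompt("You are a helpful assistant.",
                    "Write a haiku about the sea."))
```

Chat-style endpoints that accept a `messages` array typically apply this template server-side, so manual formatting is only needed for plain completions.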

Context: 16K
Input: $0.90 / 1M tokens
Output: $0.90 / 1M tokens
Accepts: text
Returns: text
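At the listed rates ($0.9 per 1M tokens for both input and output), per-request cost is a simple linear function of token counts. A minimal sketch:

```python
# Rates from the pricing table above: $0.9 per 1M tokens, input and output.
INPUT_RATE = 0.9 / 1_000_000   # USD per input token
OUTPUT_RATE = 0.9 / 1_000_000  # USD per output token

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated request cost in USD."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# e.g. a 4,000-token prompt with a 1,000-token reply:
print(f"${estimate_cost(4_000, 1_000):.6f}")  # → $0.004500
```

Note that actual billing is based on the token counts reported by the provider's tokenizer, which may differ slightly from local estimates.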

Access dolphin-mixtral-8x22b through LangDB AI Gateway

Integrate with cognitivecomputations' dolphin-mixtral-8x22b and 250+ other models through a unified API. Monitor usage, control costs, and enhance security.

- Unified API
- Cost optimization
- Enterprise security

Free tier available • No credit card required • Instant setup • 99.9% uptime • 10,000+ monthly requests

Code Examples

Integration samples and API usage
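A minimal sketch of calling the model through OpenRouter's OpenAI-compatible chat completions endpoint, using only the standard library. It assumes an `OPENROUTER_API_KEY` environment variable and the model ID `cognitivecomputations/dolphin-mixtral-8x22b`; to route through LangDB or another gateway instead, swap the base URL and credentials accordingly.

```python
import json
import os
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"
MODEL = "cognitivecomputations/dolphin-mixtral-8x22b"

def build_payload(user_message: str) -> dict:
    """Build the request body for a single-turn chat completion."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": user_message}],
    }

def chat(user_message: str) -> str:
    """Send one user message and return the assistant's reply text."""
    request = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(user_message)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request, timeout=60) as response:
        body = json.load(response)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Write a short Python function that reverses a string."))
```

Because the endpoint is OpenAI-compatible, the official OpenAI SDK can also be pointed at it by setting the base URL, which is often more convenient than raw HTTP for streaming or multi-turn use.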