ernie-4.5-21b-a3b by openrouter - AI Model Details, Pricing, and Performance Metrics
ernie-4.5-21b-a3b
Type: completions
A text-based Mixture-of-Experts (MoE) model from the ERNIE 4.5 family, with 21B total parameters and 3B activated per token. Its heterogeneous MoE structure uses modality-isolated routing with specialized routing and balancing losses, and it supports a 131K-token context length. Efficient inference is achieved through multi-expert parallel collaboration and quantization, while post-training techniques including SFT, DPO, and UPO tune performance across diverse applications.
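The "3B activated per token" figure comes from sparse top-k expert routing: a router scores all experts, but only the highest-scoring few process each token. The toy sketch below illustrates that mechanism with made-up dimensions; it is not ERNIE's actual configuration or routing code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions -- illustrative only, far smaller than ERNIE's real config.
d_model, n_experts, top_k = 8, 4, 2

token = rng.standard_normal(d_model)                        # one token embedding
router_w = rng.standard_normal((d_model, n_experts))        # router projection
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]

# Router scores -> softmax gate; only the top-k experts run for this token,
# which is why a 21B-parameter model activates only ~3B parameters per token.
logits = token @ router_w
gate = np.exp(logits - logits.max())
gate /= gate.sum()
chosen = np.argsort(gate)[-top_k:]

# Output is the gate-weighted sum over the activated experts only.
output = sum(gate[i] * (token @ experts[i]) for i in chosen)
print(f"activated experts: {sorted(chosen.tolist())}, output shape: {output.shape}")
```

In a real MoE layer an auxiliary balancing loss (mentioned above) pushes the router to spread tokens evenly across experts so none sit idle.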
Access ernie-4.5-21b-a3b through LangDB AI Gateway
Integrate with Baidu's ernie-4.5-21b-a3b and 250+ other models through a unified API. Monitor usage, control costs, and enhance security.
Free tier available • No credit card required
Category Scores
Benchmark Tests
| Benchmark | Score |
|---|---|
| HLE | 3.5 |
| AIME | 49.3 |
| DROP | 28.6 |
| GPQA | 81.1 |
| MMLU | 41.9 |
| SciCode | 31.5 |
| MATH-500 | 93.1 |
| MMLU-Pro | 77.6 |
| LiveCodeBench | 46.7 |
| AA Math Index | 41.3 |
| AA Coding Index | 14.5 |
| AAII | 14.9 |
Compare with Similar Models
Code Examples
Integration samples and API usage
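As a minimal sketch of calling the model through an OpenAI-compatible chat-completions gateway: the base URL and the `openrouter/ernie-4.5-21b-a3b` model identifier below are assumptions, so check your LangDB dashboard for the real values before sending.

```python
import json
import urllib.request

# Assumed values -- replace with the endpoint and model ID from your dashboard.
API_BASE = "https://api.example-gateway.ai/v1"   # hypothetical OpenAI-compatible base URL
MODEL = "openrouter/ernie-4.5-21b-a3b"           # assumed gateway model identifier


def build_chat_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completions request for the gateway."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_chat_request("Summarize MoE routing in one sentence.", "YOUR_API_KEY")
print(req.full_url)
print(json.loads(req.data)["model"])
```

Sending the request with `urllib.request.urlopen(req)` returns the usual chat-completions JSON body; any OpenAI-compatible SDK pointed at the gateway's base URL works the same way.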
Related Models
Similar models from openrouter