ernie-4.5-21b-a3b by openrouter - AI Model Details, Pricing, and Performance Metrics

baidu / ernie-4.5-21b-a3b · completions · by openrouter
A text-based Mixture-of-Experts (MoE) model with 21B total parameters, of which roughly 3B are activated per token. It builds on the ERNIE 4.5 family's heterogeneous MoE architecture with modality-isolated routing, and supports a 131K-token context length. Efficient inference comes from multi-expert parallel collaboration and quantization, while post-training with SFT, DPO, and UPO, together with specialized routing and balancing losses, tunes the model for a broad range of tasks.
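The sparse-activation figure (21B total, ~3B active per token) comes from top-k expert routing: a gating network scores every expert, and only the highest-scoring few process each token. A minimal, generic sketch of top-k gating — illustrative only; the expert count and k below are arbitrary, not this model's actual configuration:

```python
import math

def top_k_route(logits, k=2):
    """Pick the top-k experts for one token and softmax-normalize
    their gate weights (generic top-k MoE gating, not Baidu's
    actual implementation)."""
    ranked = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)
    chosen = ranked[:k]
    # Softmax over the selected logits only, as in standard top-k gating.
    exps = [math.exp(logits[i]) for i in chosen]
    total = sum(exps)
    return {i: e / total for i, e in zip(chosen, exps)}

# Example: 8 experts, but only k=2 of them receive this token.
weights = top_k_route([0.1, 2.0, -1.0, 0.5, 1.5, 0.0, 0.2, -0.5], k=2)
```

Because only the chosen experts run, per-token compute scales with k rather than with the total expert count — which is how a 21B-parameter model can cost roughly as much per token as a 3B dense one.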

Released: Jun 25, 2025
Knowledge cutoff: Dec 27, 2024
License: Proprietary
Context: 120K tokens
Input: $0.07 / 1M tokens
Output: $0.28 / 1M tokens
Accepts: text
Returns: text
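At these rates, per-request cost is simple arithmetic: token count divided by one million, times the listed rate. A quick estimator using the prices above:

```python
def estimate_cost_usd(input_tokens, output_tokens,
                      input_rate=0.07, output_rate=0.28):
    """Estimate a request's cost from the listed per-million-token rates."""
    return (input_tokens / 1_000_000) * input_rate \
         + (output_tokens / 1_000_000) * output_rate

# Example: a 10K-token prompt with a 2K-token completion.
cost = estimate_cost_usd(10_000, 2_000)  # ~0.00126 USD
```

Note that output tokens cost four times as much as input tokens, so long completions dominate the bill for most workloads.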

Access ernie-4.5-21b-a3b through LangDB AI Gateway

Integrate with Baidu's ernie-4.5-21b-a3b and 250+ other models through a unified API. Monitor usage, control costs, and enhance security.


Benchmark Tests
Benchmark         Score   Category
HLE                 3.5   General Knowledge
AIME               49.3   Mathematics
DROP               28.6   General Knowledge
GPQA               81.1   STEM (Physics, Chemistry, Biology)
MMLU               41.9   General Knowledge
SciCode            31.5   Scientific
MATH-500           93.1   Mathematics
MMLU-Pro           77.6   General Knowledge
LiveCodeBench      46.7   Programming
AA Math Index      41.3   Mathematics
AA Coding Index    14.5   Programming
AAII               14.9   General

Code Examples

Integration samples and API usage
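A gateway exposing a unified API typically follows the OpenAI chat-completions shape. A minimal sketch under that assumption — the base URL, environment-variable names, and the exact model identifier are illustrative guesses, not documented values for this gateway:

```python
import json
import os
import urllib.request

def build_request(prompt, model="baidu/ernie-4.5-21b-a3b", max_tokens=256):
    """Assemble an OpenAI-style chat-completions payload for the model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def chat(prompt, base_url, api_key):
    """POST the payload to an OpenAI-compatible /chat/completions route
    and return the first choice's message text."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_request(prompt)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Usage (requires a running gateway and a real key; both placeholders):
#   reply = chat("Hello!", base_url="https://api.example.com/v1",
#                api_key=os.environ["GATEWAY_API_KEY"])
```

Check your gateway's own documentation for the canonical base URL and model id; OpenAI-compatible gateways often differ in both.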