hunyuan-a13b-instruct by OpenRouter - AI Model Details, Pricing, and Performance Metrics
hunyuan-a13b-instruct
Hunyuan-A13B is a 13B active parameter Mixture-of-Experts (MoE) language model developed by Tencent, with a total parameter count of 80B and support for reasoning via Chain-of-Thought. It offers competitive benchmark performance across mathematics, science, coding, and multi-turn reasoning tasks, while maintaining high inference efficiency via Grouped Query Attention (GQA) and quantization support (FP8, GPTQ, etc.).
Access hunyuan-a13b-instruct through LangDB AI Gateway
Integrate with Tencent's hunyuan-a13b-instruct and 250+ other models through a unified API. Monitor usage, control costs, and enhance security.
Free tier available • No credit card required
Code Examples
Integration samples and API usage
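The snippet below is a minimal sketch of calling hunyuan-a13b-instruct through an OpenAI-compatible chat completions endpoint, the interface most AI gateways (including LangDB) expose. The base URL, API key environment variable, and exact model identifier used here are placeholder assumptions; substitute the values from your gateway dashboard.

```python
# Minimal sketch: calling hunyuan-a13b-instruct via an OpenAI-compatible gateway.
# The base_url, env var name, and model identifier below are assumptions --
# replace them with the values shown in your LangDB AI Gateway project.
import os

from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="https://api.example-gateway.ai/v1",  # assumed gateway endpoint
    api_key=os.environ["LANGDB_API_KEY"],          # assumed env var name
)

response = client.chat.completions.create(
    model="tencent/hunyuan-a13b-instruct",         # assumed model identifier
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain Grouped Query Attention in two sentences."},
    ],
    max_tokens=512,
)

print(response.choices[0].message.content)
```

Because the gateway follows the standard chat completions schema, the same request works with any OpenAI-compatible client library; only the base URL, API key, and model name change.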
Related Models
Similar models from OpenRouter