Qwen 3.5 397B-A17B
Alibaba · LLMs
- Performance: 89.2
- Rating: ★ 4.5 (156 reviews)
Reasoning · Large · Code Assistant · Text Generation · Multimodal · Open Weight
About
Alibaba's open-weight flagship MoE model with 397B total parameters (17B active per token), leading open models on many benchmarks.
Strengths
Leading open-weight model on vision (MMMU, MathVision) and instruction following (IFBench), with strong results on coding (SWE-bench Verified) and agentic tasks. Released under the Apache 2.0 license with support for 201 languages. The MoE architecture keeps inference cost low relative to quality, since only 17B of the 397B parameters are active per token.
Specifications
- Context window: 256,000 tokens
- Parameters: 397B total (17B active)
Pricing
- Input cost: Free
- Output cost: Free
Open-weight (Apache 2.0) and free to download; hosted inference typically runs about $0.50 to $2.00 per 1M tokens depending on the provider.
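The per-token pricing above can be turned into a quick cost estimate. A minimal sketch, assuming a flat mid-range rate of $1.00 per 1M tokens (actual provider rates vary and may differ for input vs. output):

```python
def token_cost(input_tokens: int, output_tokens: int, rate_per_million: float) -> float:
    """Estimate hosted-inference cost in USD at a flat per-token rate.

    rate_per_million is an assumed blended price; real providers often
    charge separate input and output rates.
    """
    return (input_tokens + output_tokens) / 1_000_000 * rate_per_million

# 8k input + 2k output tokens at an assumed $1.00/1M blended rate:
cost = token_cost(8_000, 2_000, 1.00)
print(f"${cost:.4f}")  # -> $0.0100
```

At the quoted range, the same 10k-token request would cost between about $0.005 and $0.02 on a hosted provider, and nothing when self-hosting the open weights (compute aside).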
Speed & Latency
- Throughput: 45 tokens/sec
- Time to first token: 500 ms
Available On
Hugging Face · Together AI · Fireworks AI · Alibaba Cloud
Features
function calling · streaming · vision · JSON mode · system messages
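The feature list maps onto the OpenAI-compatible chat API that hosts like Together AI and Fireworks AI expose. A minimal sketch of a request payload exercising several of them; the model id here is a placeholder assumption, not a confirmed identifier, and `lookup_benchmark` is a hypothetical tool:

```python
import json

# Hypothetical OpenAI-compatible chat request; model id is an assumption.
payload = {
    "model": "qwen3.5-397b-a17b",  # placeholder id; check your provider's catalog
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},  # system messages
        {"role": "user", "content": "List three MoE advantages as JSON."},
    ],
    "stream": True,                               # streaming
    "response_format": {"type": "json_object"},   # JSON mode
    "tools": [{                                   # function calling
        "type": "function",
        "function": {
            "name": "lookup_benchmark",  # hypothetical tool for illustration
            "description": "Fetch a benchmark score by name.",
            "parameters": {
                "type": "object",
                "properties": {"name": {"type": "string"}},
                "required": ["name"],
            },
        },
    }],
}

body = json.dumps(payload)  # serialized request body for the provider endpoint
```

The same payload shape works across the listed providers; only the base URL, API key, and model id change.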
Performance Trend
Benchmark score trends over time for the top 5 benchmarks.
Benchmarks
Scores from various benchmark tests; higher is better.
| Test | Score | Percentile | Source |
|---|---|---|---|
| ARC-Challenge | 89.8 | — | huggingface |
| BigBench Hard | 84.0 | p95 | seed |
| Chatbot Arena ELO | 1197.0 | — | chatbot-arena |