
Qwen3.5-122B-A10B

by Alibaba Qwen · Released Feb 2026

Open Source · Multimodal
Context: 262K tokens (~131 books)
Input $/1M: $0.26
Output $/1M: $2.08
Type: multimodal
License: Open Source
Benchmarks: 5 tested
About

Qwen3.5-122B-A10B is a native vision-language model built on a hybrid architecture that combines a linear attention mechanism with a sparse mixture-of-experts design, achieving higher inference efficiency. In terms of...

Tested on 5 benchmarks. Top scores: Chatbot Arena Elo — Overall (1416.8), Chatbot Arena Elo — Coding (1362.3), Artificial Analysis — Agentic Index (53.0).

Capabilities
Speed: 71.8 (#21 globally)
Benchmark Scores
Tested on 5 benchmarks · Ranked across 2 categories
Score Distribution (all 233 models)
Chatbot Arena Elo — Overall

Chatbot Arena overall Elo rating. Crowdsourced human preference ranking from blind head-to-head comparisons across all topics.

1417
Chatbot Arena Elo — Coding

Chatbot Arena coding Elo. Human preference ranking specifically for coding tasks and technical questions.

1362
Artificial Analysis — Agentic Index

Artificial Analysis Agentic Index. Composite score measuring agent capability across tool use and planning tasks.

53.0
Artificial Analysis — Quality Index

Artificial Analysis Quality Index. Composite quality score combining multiple benchmark results into a single metric.

41.6
Artificial Analysis — Coding Index

Artificial Analysis Coding Index. Composite coding quality score from multiple code benchmarks.

34.7
Score bands: Excellent (85+) · Good (70-85) · Average (50-70) · Below (<50)
Links
Documentation · Community · BenchGecko API (qwen3-5-122b-a10b)
Specifications
  • Type: multimodal
  • Context: 262K tokens (~131 books)
  • Released: Feb 2026
  • License: Open Source
  • Status: Active
  • Cost / Message: ~$0.003
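The per-message figure follows directly from the listed per-token prices. A minimal sketch of that arithmetic, assuming roughly 1,000 input and 1,000 output tokens per message (the token counts are an illustrative assumption, not site data):

```python
# Rough cost-per-message estimate from the listed per-token prices.
# Token counts per message are assumed for illustration only.
INPUT_PRICE_PER_M = 0.26    # $ per 1M input tokens (from the spec table)
OUTPUT_PRICE_PER_M = 2.08   # $ per 1M output tokens (from the spec table)

def cost_per_message(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one request at the listed prices."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# With ~1,000 tokens in and ~1,000 tokens out per message:
print(round(cost_per_message(1000, 1000), 4))  # → 0.0023
```

At those assumed token counts the estimate (~$0.0023) lands near the listed ~$0.003 per message; output tokens dominate the cost because they are priced eight times higher than input tokens.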
Available On
Alibaba Qwen · $0.26
Qwen3.5-122B-A10B is an open-source multimodal AI model by Alibaba Qwen, released in February 2026. Context window: 262K tokens.