z-ai
All z-ai Models (13 total)
| # | Model | Avg | aider edit | aider poly | ANLI | APEX-Agents | ARC AI2 | ARC-AGI | ARC-AGI-2 | aa agentic | aa coding | aa quality | seal audio | seal audio | seal audio | Balrog | BBH | hf bbh | C-Eval | CadEval | charxiv re | charxiv re | arena elo | arena elo | chess puzz | CMMLU | CSQA2 | Cybench | deepresear | EnigmaEval | fiction li | Fortress | frontierma | frontierma | GeoBench | GPQA | GPQA diamond | graphwalks | GSM8K | GSO-Bench | HellaSwag | HELM · GPQA | helm ifeva | helm mmlu | helm omni | helm wildb | HLE | hle tools | seal human | seal human | IFEval | jp jcommon | JHumanEval | JMMLU | JNLI | JSQuAD | LAMBADA | lech mazur | livebench | livebench | livebench | livebench | livebench | livebench | livebench | livebench | jp overall | MASK | MATH level 5 | MATH Level 5 | MCP Atlas | MMLU | MMLU-PRO | MMMLU | mmmlu ar | mmmlu bn | mmmlu zh | mmmlu fr | mmmlu de | mmmlu hi | mmmlu id | mmmlu it | mmmlu ja | mmmlu ko | mmmlu pt | mmmlu es | mmmlu sw | mmmlu yo | seal multi | MultiNRC | MUSR | OpenBookQA | oc aime202 | oc gpqa di | oc hle | oc ifeval | oc livecod | oc mmlu pr | OSWorld | otis mock | PIQA | posttrainb | seal pro r | seal pro r | seal prope | seal remot | ScienceQA | SciPredict | SimpleBench | simpleqa v | seal swe a | seal swe a | swe bench | swe bench | swe bench | seal swe b | seal swe b | swe bench | swe bench | terminal b | the agent | TriviaQA | TutorBench | USAMO | VideoMME | VISTA | seal visua | VPCT | WeirdML | Winogrande | $/1M input | Context | Released |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 |  | 87.0 | - | - | - | - | - | - | - | 67.0 | 43.4 | 51.4 | - | - | - | - | - | - | - | - | - | - | - | 1467.4 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 55.0 | 75.4 | 63.2 | 68.5 | 71.8 | 84.9 | 70.2 | 72.5 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | $1.05 | 203K | Apr 26 |
| 2 |  | 72.3 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 1410.9 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 85.8 | 79.5 | 16.9 | 85.4 | 65.0 | 82.7 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | $0.60 | 131K | Jul 25 |
| 3 |  | 69.5 | - | - | - | - | - | 44.7 | 4.9 | - | - | - | - | - | - | - | - | - | - | - | - | - | 1441.0 | 1455.6 | 10.0 | - | - | - | - | - | - | - | 16.4 | 2.1 | - | - | 83.8 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 55.0 | 73.6 | 67.9 | 55.3 | 77.5 | 83.5 | 68.8 | 69.1 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 95.8 | 85.3 | 28.1 | 93.2 | 86.2 | 85.2 | - | 80.0 | - | 13.9 | - | - | - | - | - | - | 43.8 | - | - | - | - | - | - | - | - | 72.1 | - | 52.4 | - | - | - | - | - | - | - | - | 48.2 | - | $0.60 | 203K | Feb 26 |
| 4 |  | 56.6 | - | - | - | 3.1 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 1439.2 | 1442.7 | 6.0 | - | - | - | - | - | - | - | 2.4 | 0.1 | - | - | 77.8 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 41.7 | 73.1 | 55.2 | 35.7 | 65.2 | 76.0 | 58.1 | 59.7 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 95.4 | 86.9 | 25.4 | 90.2 | 83.8 | 84.0 | - | 83.3 | - | 7.5 | - | - | - | - | - | - | 37.2 | 31.5 | - | - | - | - | - | - | - | - | - | 33.4 | - | - | - | - | - | - | - | - | - | - | $0.38 | 203K | Dec 25 |
| 5 |  | 54.4 | - | - | - | 3.0 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 1353.7 | 1425.8 | - | - | - | - | - | - | - | - | 3.8 | 2.1 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 35.0 | 71.0 | 52.0 | 26.2 | 59.0 | 81.1 | 55.2 | 62.1 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 90.3 | 80.4 | 19.3 | 88.7 | 78.2 | 83.0 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 24.5 | - | - | - | - | - | - | - | - | - | - | $0.39 | 205K | Sep 25 |
| 6 |  | 47.8 | - | - | - | - | - | - | - | 61.1 | 36.2 | 42.9 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 3.3 | 73.9 | 54.1 | 27.2 | 62.3 | 70.4 | 49.6 | 56.1 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | $1.20 | 203K | Apr 26 |
| 7 |  | 37.8 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 35.8 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 8.8 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 14.3 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 0.0 | - | - | 34.9 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 14.2 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | $0.10 | 128K | Jul 25 |
| 8 |  | 28.3 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 1377.9 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 3.3 | 64.2 | 46.4 | 17.1 | 49.7 | 62.5 | 40.1 | 37.2 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | $0.30 | 131K | Dec 25 |
About z-ai
Quick answers · sourced from our data
How many models does z-ai have?
BenchGecko tracks 13 models from z-ai, of which 10 (77%) are open source. Every entry is updated daily from live provider feeds.
What is the best model from z-ai?
GLM 5.1 is currently the highest-scoring z-ai model we track, with an average benchmark score of 70.2. Scores are computed across every public benchmark we have data for.
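As a rough illustration of that aggregation, here is a minimal sketch, assuming the Avg column is a plain mean over whichever benchmarks have a published value, with "-" cells treated as missing. The benchmark names, helper, and rounding below are illustrative, not BenchGecko's actual pipeline.

```python
# Illustrative sketch (not BenchGecko's real code): average a model's
# benchmark scores, treating "-" cells as missing data.
def average_score(scores: dict[str, str]) -> float | None:
    values = [float(v) for v in scores.values() if v not in ("-", "")]
    return round(sum(values) / len(values), 1) if values else None

# Two populated benchmarks and one missing cell:
print(average_score({"GPQA diamond": "83.8", "terminal bench": "52.4", "HLE": "-"}))
# (83.8 + 52.4) / 2 = 68.1
```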
What is the cheapest z-ai model?
The cheapest z-ai model on BenchGecko starts at $0.06 per 1M input tokens. Pricing is pulled from OpenRouter and cross-checked against official provider rate cards.
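A minimal sketch of that lookup, assuming "cheapest" simply means the minimum published input price and that models without a price are skipped; the model names and prices here are hypothetical:

```python
# Illustrative only: find the lowest $/1M-input price, ignoring models
# with no published price (None).
prices = {"model-a": 1.05, "model-b": 0.60, "model-c": None, "model-d": 0.10}
cheapest = min(
    ((name, p) for name, p in prices.items() if p is not None),
    key=lambda pair: pair[1],
)
print(cheapest)  # ('model-d', 0.1)
```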
How does z-ai compare on benchmarks?
z-ai models average 33.8 across the benchmarks we track · see the All Providers page for the full ranking by model count, open source ratio, and average score.
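The provider-wide figure could be aggregated in more than one way (a mean of per-model averages, or a mean over every populated model-benchmark cell); the page does not say which. A sketch of the first reading, using the eight Avg values visible in the table above:

```python
# Illustrative: provider average as the mean of per-model average scores.
# The eight visible Avg values give 56.7; the published 33.8 presumably
# also reflects the tracked models without a visible row here, or a
# different aggregation entirely.
model_averages = [87.0, 72.3, 69.5, 56.6, 54.4, 47.8, 37.8, 28.3]
provider_avg = round(sum(model_averages) / len(model_averages), 1)
print(provider_avg)  # 56.7
```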
Where is z-ai based?
z-ai is headquartered in China. BenchGecko groups providers by region to make it easy to compare US, EU, China, and Rest of World markets.
Is z-ai open source?
10 of 13 z-ai models are open source (77%). The rest are proprietary, with closed weights served via API.