**z-ai** (ZA) 🇨🇳 China
- **11** models total tracked on BenchGecko
- **9** open source (82% of models)
- **$0.06** cheapest model, per 1M input tokens
- **31.0** average benchmark score, across 5 evaluated models
- Model categories: 9 LLM, 2 multimodal
Price range, $/1M input tokens: $0.06, $0.10, $0.13, $0.30, $0.39, $0.39, $0.60, $0.60, $0.72, $1.20 (low: $0.06, median: $0.39, high: $1.20).
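The low/median/high summary follows directly from the ten listed prices; a minimal sketch in Python (price list copied from above, variable names my own):

```python
import statistics

# The ten $/1M-input-token prices listed above, in ascending order.
prices = [0.06, 0.10, 0.13, 0.30, 0.39, 0.39, 0.60, 0.60, 0.72, 1.20]

low, high = min(prices), max(prices)
# Even number of values, so the median is the mean of the two middle ones.
median = statistics.median(prices)
print(low, median, high)  # 0.06 0.39 1.2
```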
Open-source ratio: 82% (9 open source, 2 proprietary).
**All z-ai models** (11 total)
| # | Model | Avg | ARC (AI2) | BBH | GSM8K | HellaSwag | LAMBADA | MMLU | GPQA Diamond | MATH Level 5 | OTIS Mock AIME | WeirdML | Winogrande | SimpleBench | Aider Polyglot | Lech Mazur | GSO-Bench | Fiction.LiveBench | SWE-bench | Terminal-Bench | FrontierMath | SimpleQA Verified | FrontierMath | Chess Puzzles | APEX-Agents | OSWorld | ARC-AGI-2 | HLE | TriviaQA | ScienceQA | PIQA | OpenBookQA | CadEval | Balrog | GeoBench | Cybench | ANLI | the agent … | VideoMME | ARC-AGI | deepresear… | VPCT | $/1M input | Context | Released |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | GLM 4.5 🇨🇳 z-ai (open) | 44.1 | - | - | - | - | - | - | - | - | - | 40.6 | - | - | - | 78.0 | - | - | 54.2 | - | - | - | - | - | - | - | - | 3.7 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | $0.60 | 131K | Jul 2025 |
| 2 | GLM 5 Turbo 🇨🇳 z-ai | 36.1 | - | - | - | - | - | - | 83.8 | - | 80.0 | 48.2 | - | 43.8 | - | - | - | - | - | - | 16.4 | - | 2.1 | 10.0 | - | - | 4.9 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | $1.20 | 203K | Mar 2026 |
| 3 | GLM 5 🇨🇳 z-ai (open) | 36.1 | - | - | - | - | - | - | 83.8 | - | 80.0 | 48.2 | - | 43.8 | - | - | - | - | - | - | 16.4 | - | 2.1 | 10.0 | - | - | 4.9 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | $0.72 | 80K | Feb 2026 |
| 4 | GLM 4.7 🇨🇳 z-ai (open) | 30.2 | - | - | - | - | - | - | 77.8 | - | 83.3 | - | - | 37.2 | - | - | - | - | - | - | 2.4 | 31.5 | 0.1 | 6.0 | 3.1 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | $0.39 | 203K | Dec 2025 |
| 5 | GLM 4.6 🇨🇳 z-ai (open) | 8.4 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 24.5 | 3.8 | - | 2.1 | - | 3.0 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | $0.39 | 205K | Sep 2025 |
Scores in % unless noted. Avg = unweighted mean across tested benchmarks.
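The Avg definition can be checked against the GLM 4.5 row: only the benchmarks the model was actually tested on enter the mean, and "-" cells are skipped rather than counted as zero. A minimal sketch (the benchmark-to-score mapping is my reading of the columns, not site output):

```python
import statistics

# Tested-benchmark scores from the GLM 4.5 row; untested "-" cells are omitted.
glm_4_5 = {"WeirdML": 40.6, "Lech Mazur": 78.0, "SWE-bench": 54.2, "HLE": 3.7}

# Unweighted mean over tested benchmarks only.
avg = statistics.mean(glm_4_5.values())
print(round(avg, 1))  # 44.1, matching the table's Avg column
```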