xAI
Model Categories
Pricing Range — $/1M input tokens
Open Source Ratio
All xAI Models14 total
| #▲ | Model | Avg | aider edit | aider poly? | ANLI? | APEX-Agents? | ARC AI2? | ARC-AGI? | ARC-AGI-2? | aa agentic? | aa coding ? | aa quality | seal audio | seal audio | seal audio | Balrog? | BBH? | hf bbh | C-Eval | CadEval? | charxiv re? | charxiv re? | arena elo | arena elo | chess puzz? | CMMLU | CSQA2 | Cybench? | deepresear? | EnigmaEval | fiction li? | Fortress | frontierma? | frontierma? | GeoBench? | GPQA | GPQA diamond? | graphwalks? | GSM8K? | GSO-Bench? | HellaSwag? | HELM · GPQA | helm ifeva | helm mmlu | helm omni | helm wildb | HLE? | hle tools | seal human | seal human | IFEval | jp jcommon | JHumanEval | JMMLU | JNLI | JSQuAD | LAMBADA? | lech mazur? | livebench | livebench | livebench | livebench | livebench | livebench | livebench | livebench | jp overall | MASK | MATH level 5? | MATH Level 5 | MCP Atlas | MMLU? | MMLU-PRO | MMMLU | mmmlu ar | mmmlu bn | mmmlu zh | mmmlu fr | mmmlu de | mmmlu hi | mmmlu id | mmmlu it | mmmlu ja | mmmlu ko | mmmlu pt | mmmlu es | mmmlu sw | mmmlu yo | seal multi | MultiNRC | MUSR | OpenBookQA? | oc aime202 | oc gpqa di | oc hle | oc ifeval | oc livecod | oc mmlu pr | OSWorld? | otis mock ? | PIQA? | posttrainb | seal pro r | seal pro r | seal prope | seal remot | ScienceQA? | SciPredict | SimpleBench? | simpleqa v? | seal swe a | seal swe a | swe bench | swe bench | swe bench | seal swe b | seal swe b | swe bench ? | swe bench ? | terminal b? | the agent ? | TriviaQA? | TutorBench | USAMO | VideoMME? | VISTA | seal visua | VPCT? | WeirdML? | Winogrande? | $/1M in | Context | Released |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 67.9 | - | 53.3 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 65.0 | 88.4 | 78.8 | 46.4 | 84.9 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | $3.00 | 131K | Apr 251y ago | |
| 2 | 62.2 | - | 79.6 | - | 15.2 | - | 66.7 | 16.0 | - | - | - | - | - | - | 43.6 | - | - | - | - | - | - | - | - | 28.0 | - | - | 43.0 | 47.9 | - | 94.4 | - | 19.7 | 2.1 | 45.0 | - | 82.7 | - | - | - | - | 72.6 | 94.9 | 85.1 | 60.3 | 79.7 | - | - | - | - | - | - | - | - | - | - | - | 80.7 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 84.0 | - | - | - | - | - | - | - | - | 52.6 | 47.9 | - | - | - | - | - | - | - | - | - | 27.2 | - | - | - | - | - | - | - | - | 45.7 | - | $3.00 | 256K | Jul 2510mo ago | |
| 3 | 53.1 | - | 49.3 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 1357.4 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 67.5 | 95.1 | 79.9 | 31.8 | 65.1 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | $0.30 | 131K | Apr 251y ago | |
| 4 | 51.6 | - | - | - | - | - | 48.5 | 5.3 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 30.0 | - | - | 94.4 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 81.1 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 42.9 | - | $0.20 | 2.0M | Sep 257mo ago | |
| 5 | 46.0 | - | 49.3 | - | - | - | 16.5 | 0.4 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 66.7 | - | 5.9 | - | - | - | 68.3 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 73.5 | - | - | - | - | - | - | - | - | - | - | 90.9 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 77.8 | - | - | - | - | - | - | - | - | - | 21.1 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 42.6 | - | $0.30 | 131K | Jun 2511mo ago | |
| 6 | 39.9 | - | 53.3 | - | - | - | 5.5 | 0.1 | - | - | - | - | - | - | 29.5 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 58.3 | - | 3.8 | 0.1 | - | - | 67.7 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 76.4 | - | - | - | - | - | - | - | - | - | - | 88.8 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 55.5 | - | - | - | - | - | - | - | - | 23.3 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 37.2 | - | $3.00 | 131K | Jun 2511mo ago | |
| 7 | 27.6 | 58.6 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 0.7 | - | - | - | 38.4 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 63.6 | - | - | - | - | - | - | - | - | - | - | 63.5 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 11.4 | - | - | - | - | - | - | - | - | 7.2 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 22.2 | - | N/A | - | Jan 242y ago |
About xAI
Quick answers · sourced from our data
How many models does xAI have?
BenchGecko tracks 14 models from xAI. Every entry is updated daily from live provider feeds.
What is the best model from xAI?
Grok 3 Beta is currently the highest scoring xAI model we track, with an average benchmark score of 69.5. Scores are computed across every public benchmark we have data for.
What is the cheapest xAI model?
The cheapest xAI model on BenchGecko starts at $0.20 per 1M input tokens. Pricing is pulled from OpenRouter and cross-checked against official provider rate cards.
How does xAI compare on benchmarks?
xAI models average 39.7 across the benchmarks we track · see the All Providers page for the full ranking by model count, open source ratio, and average score.
Where is xAI based?
xAI is headquartered in United States. BenchGecko groups providers by region to make it easy to compare US, EU, China, and Rest of World markets.
Is xAI open source?
xAI currently ships no open source models on BenchGecko · all 14 tracked models are proprietary.