Context: 33K tokens (~16 books)
Input $/1M: $0.05
Output $/1M: $0.08
Type: text
License: Open Source
Benchmarks: 1 tested
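The listed per-million-token prices ($0.05 input, $0.08 output) can be turned into a per-request cost estimate. A minimal sketch, assuming illustrative token counts (the function name and counts are not from this page):

```python
# Estimate request cost from the card's per-1M-token prices.
INPUT_PRICE_PER_M = 0.05   # USD per 1M input tokens (listed above)
OUTPUT_PRICE_PER_M = 0.08  # USD per 1M output tokens (listed above)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for one request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# An illustrative chat turn (~500 input, ~300 output tokens) costs a
# fraction of a cent, which is why "Cost / Message" rounds to ~$0.000.
print(f"${request_cost(500, 300):.6f}")  # → $0.000049
```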
About
Mistral Small 3 is a 24B-parameter language model optimized for low-latency performance across common AI tasks. Released under the Apache 2.0 license, it features both pre-trained and instruction-tuned versions designed...
Tested on 1 benchmark. Top score: Chatbot Arena Elo — Overall (1273.5). Note that Elo is a rating, not a percentage, so no percentage average is reported.
Benchmark Scores
Tested on 1 benchmark · Ranked in 1 category
[Chart: score distribution across all 233 models]
Chatbot Arena Elo — Overall
Score: 1273. Chatbot Arena overall Elo rating: a crowdsourced human preference ranking from blind head-to-head comparisons across all topics.
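The Elo update behind arena-style rankings can be sketched briefly. This is an illustration of the generic Elo formula only; Chatbot Arena's actual leaderboard uses a more involved fitting procedure that this does not reproduce:

```python
# Generic Elo rating update from one blind head-to-head vote.
def expected_score(rating_a: float, rating_b: float) -> float:
    """Probability that model A beats model B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

def elo_update(rating_a: float, rating_b: float,
               a_won: bool, k: float = 32.0) -> tuple[float, float]:
    """Return updated (rating_a, rating_b) after one comparison."""
    exp_a = expected_score(rating_a, rating_b)
    score_a = 1.0 if a_won else 0.0
    new_a = rating_a + k * (score_a - exp_a)
    new_b = rating_b + k * ((1.0 - score_a) - (1.0 - exp_a))
    return new_a, new_b
```

With equal starting ratings, a win moves the winner up and the loser down by the same amount; repeated votes converge toward ratings that reflect win probabilities.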
Links
Research
Documentation
Community
Source Code
BenchGecko API
mistral-small-24b-instruct-2501
Specifications
- Type: text
- Context: 33K tokens (~16 books)
- Released: Jan 2025
- License: Open Source
- Status: Active
- Cost / Message: ~$0.000
Available On
Learn More
Frequently Asked Questions
What is Mistral Small 3?
Mistral Small 3 is an open-source text AI model by Mistral AI, released in January 2025. Context window: 33K tokens.