Context: 128K tokens (~64 books)
Input $/1M: Free
Output $/1M: Free
Type: multimodal
License: Open Source
Benchmarks: 0 tested
About
Mistral Small 3.1 24B Instruct is an upgraded variant of Mistral Small 3 (2501), featuring 24 billion parameters with advanced multimodal capabilities. It provides state-of-the-art performance in text-based reasoning and vision tasks, including image analysis, programming, mathematical reasoning, and multilingual support across dozens of languages. Equipped with an extensive 128K-token context window and optimized for efficient local inference, it supports use cases such as conversational agents, function calling, long-document comprehension, and privacy-sensitive deployments. The updated version is [Mistral Small 3.2](mistralai/mistral-small-3.2-24b-instruct).
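The function-calling support mentioned above is commonly driven by an OpenAI-style `tools` schema; a minimal sketch of such a tool definition follows. The `get_weather` tool and its parameters are hypothetical, and whether a given host accepts exactly this schema is an assumption not confirmed by this page.

```python
# Hypothetical tool definition illustrating the function-calling style.
# The get_weather function and its parameters are made up for illustration.
def build_tools() -> list:
    """Return an OpenAI-style tools array describing one callable function."""
    return [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }]

tools = build_tools()
print(tools[0]["function"]["name"])
```

The model returns a tool-call message naming the function and its arguments; executing the function and feeding its result back is handled by the calling application, not the model.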
No benchmark data available yet.
Links
Research
Documentation
Community
Source Code
BenchGecko API
`mistral-small-3-1-24b-instruct-free`
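Assuming the BenchGecko API follows the common OpenAI-compatible chat-completions shape (an assumption; this page does not document the request format or endpoint URL), the slug above would be passed as the `model` field. A minimal sketch:

```python
import json

# Hypothetical endpoint: the real URL is not documented on this page.
API_URL = "https://example.com/v1/chat/completions"

def build_chat_request(prompt: str) -> dict:
    """Build an OpenAI-style chat-completions payload for the free
    Mistral Small 3.1 24B slug listed above."""
    return {
        "model": "mistral-small-3-1-24b-instruct-free",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }

payload = build_chat_request("Summarize this document in three bullet points.")
print(json.dumps(payload, indent=2))
```

Actually sending the payload (e.g. via `requests.post(API_URL, json=payload, headers=...)`) and the authentication scheme are deployment details this page does not specify.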
Specifications
- Type: multimodal
- Context: 128K tokens (~64 books)
- Released: Mar 2025
- License: Open Source
- Status: Active
- Cost / Message: ~$0.000
Frequently Asked Questions
Mistral Small 3.1 24B (free) is an open-source multimodal AI model by Mistral AI, released in March 2025. Context window: 128K tokens.