Mistral
French AI lab shipping both open-weight (Mistral 7B, Mixtral 8x7B, Mixtral 8x22B) and closed (Mistral Large) models.
Basic
Mistral AI was founded in 2023 in Paris by former Meta and DeepMind researchers. Their first release, Mistral 7B, set a quality-per-size benchmark for open-weight models. Mixtral 8x7B (Dec 2023) brought mixture-of-experts (MoE) models to the open-weight ecosystem. Mistral Large is their closed-API flagship. Mistral positions itself on EU data sovereignty and GDPR-native compliance, a competitive differentiator in European enterprise sales.
Deep
Mistral architecture: sliding-window attention, RoPE, Grouped-Query Attention (GQA). Mixtral uses MoE with 8 experts total, 2 active per token. Mistral Large and subsequent closed models scale further, with undisclosed sizes. Mistral's training infrastructure is primarily EU-based, with European data residency as a key feature. The Mistral Small / Medium / Large tiers differentiate on size and price. The lab has raised $500M+ at a $14B+ valuation as of 2026, becoming Europe's AI national champion.
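The 8-experts / 2-active MoE routing described above can be sketched in a few lines. This is a hypothetical numpy toy (tiny dimensions, random linear "experts"), not Mixtral's actual implementation, which uses learned FFN experts and a linear router:

```python
# Toy sketch of Mixtral-style top-2 MoE routing (sizes are illustrative).
import numpy as np

def top2_moe(x, gate_w, experts):
    """Route one token vector x to the top-2 of len(experts) experts.

    gate_w:  (hidden, n_experts) router weights
    experts: list of callables, each mapping (hidden,) -> (hidden,)
    """
    logits = x @ gate_w                      # (n_experts,) router scores
    top2 = np.argsort(logits)[-2:]           # indices of the 2 best experts
    probs = np.exp(logits[top2])
    probs /= probs.sum()                     # softmax over the selected pair
    # Only the 2 chosen experts execute; the other 6 are skipped entirely.
    return sum(p * experts[i](x) for p, i in zip(probs, top2))

rng = np.random.default_rng(0)
hidden, n_experts = 8, 8
gate_w = rng.standard_normal((hidden, n_experts))
experts = [(lambda W: (lambda x: x @ W))(rng.standard_normal((hidden, hidden)))
           for _ in range(n_experts)]
y = top2_moe(rng.standard_normal(hidden), gate_w, experts)
print(y.shape)  # (8,)
```

Because only the top-2 experts run per token, compute scales with active parameters (~13B for Mixtral 8x7B) while memory must still hold all 47B.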
Expert
Mistral 7B architecture: 32 layers, 4096 hidden dimension, sliding-window attention with a 4K window inside an 8K context. Mixtral 8x7B: 8 experts of 7B each, 2 active per token, for 47B total / 13B active parameters. Mixtral 8x22B: the same pattern scaled up. Mistral Large (closed) is believed to be a larger MoE; specifics are undisclosed. Training data: European-language emphasis, strong in French, German, Spanish, and Italian. Distribution: direct API, Azure AI Studio, AWS Bedrock, OCI. EU data residency is the default, not an option, a legal positioning that distinguishes Mistral from US frontier labs.
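The sliding-window attention pattern above amounts to a band-limited causal mask. A minimal sketch, with a window of 3 over 6 positions for readability (Mistral 7B uses a 4096-token window):

```python
# Sliding-window causal mask: each query attends only to itself and the
# (window - 1) most recent earlier positions.
import numpy as np

def sliding_window_mask(seq_len, window):
    """Boolean mask: True where query position q may attend to key k."""
    q = np.arange(seq_len)[:, None]
    k = np.arange(seq_len)[None, :]
    return (k <= q) & (q - k < window)   # causal AND within the window

mask = sliding_window_mask(6, 3)
print(mask.astype(int))
```

Each layer sees only a local window, but stacking 32 such layers lets information propagate much further, which is how a 4K window supports an 8K context.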
Mistral's EU-native positioning is its European enterprise wedge. The EU AI Act and GDPR favor Mistral over US labs for sovereignty-sensitive deployments.
Depending on why you're here
- Sliding-window attention, MoE (Mixtral), GQA
- Mixtral 8x7B was the open-weight MoE breakthrough
- Training on EU infrastructure for data residency
- Mixtral 8x22B for self-hosted high-capability workloads
- Mistral Large via API for closed-model quality
- Use Mistral if EU data residency is a hard requirement
- Europe's AI national champion, with political and regulatory tailwinds
- $14B valuation lags Anthropic/OpenAI, a compelling multiple
- GDPR + EU AI Act compliance is their moat
- French AI company
- Ships both free-download and paid-API models
- Default choice for European businesses that care about data privacy
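The Grouped-Query Attention mentioned in the architecture notes above can be sketched as query heads sharing a smaller pool of key/value heads. The numbers here are shrunk for readability (8 query heads sharing 2 KV heads; Mistral 7B uses 32 and 8), and the code is an illustrative toy, not Mistral's implementation:

```python
# Grouped-query attention for a single query position: each group of
# query heads reuses one shared KV head, cutting the KV cache size.
import numpy as np

def gqa(q, k, v, n_q_heads=8, n_kv_heads=2):
    """q: (n_q_heads, d); k, v: (n_kv_heads, seq, d)."""
    group = n_q_heads // n_kv_heads          # query heads per KV head
    out = []
    for h in range(n_q_heads):
        kv = h // group                      # which shared KV head to use
        scores = k[kv] @ q[h] / np.sqrt(q.shape[-1])
        w = np.exp(scores - scores.max())
        w /= w.sum()                         # softmax over seq positions
        out.append(w @ v[kv])
    return np.stack(out)                     # (n_q_heads, d)

rng = np.random.default_rng(0)
d, seq = 4, 5
y = gqa(rng.standard_normal((8, d)),
        rng.standard_normal((2, seq, d)),
        rng.standard_normal((2, seq, d)))
print(y.shape)  # (8, 4)
```

Sharing KV heads shrinks the KV cache by 4x here, which is the main inference-memory win of GQA.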
Mistral's bet is on a regulatory moat, not raw capability. In a world where EU AI Act compliance matters, that's a valid bet.