The daily record of artificial intelligence
Open Weights

Mistral releases 230B mixture-of-experts model under Apache 2.0

The Paris-based laboratory continues to anchor the open-weights frontier in Europe.

Saturday, May 16, 2026 · 2 min

Mistral has released a 230-billion-parameter mixture-of-experts model under the permissive Apache 2.0 licence, continuing the Paris laboratory's practice of anchoring the open-weights frontier in Europe.

The release includes evaluation harnesses for reasoning, code and multilingual benchmarks. Early third-party reproductions place the model within five points of the largest closed-weight systems on European-language tasks.

The download mirror at Hugging Face buckled under demand on Saturday, going offline for two hours.

— End —