Mistral has released a 230-billion-parameter mixture-of-experts model under the permissive Apache 2.0 licence, continuing the Paris laboratory's practice of anchoring the open-weights frontier in Europe.
The release includes evaluation harnesses for reasoning, code and multilingual benchmarks. Early third-party reproductions place the model within five points of the largest closed-weight systems on European-language tasks.
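For readers who want to try the checkpoint themselves, a minimal sketch of loading the weights and running a quick generation with the standard transformers API is given below. The repository id is a placeholder rather than the actual checkpoint name, and weights at this scale will in practice need multi-GPU hardware or offloading; treat this as an illustrative pattern, not official usage instructions.

```python
# Minimal sketch: pull the released weights from Hugging Face and run a
# short generation check. "mistralai/<model-name>" is a placeholder, not
# the real checkpoint id from the release.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "mistralai/<model-name>"  # placeholder: substitute the actual repository id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # MoE weights of this size need multiple GPUs or offloading
    device_map="auto",
)

prompt = "Summarise the Apache 2.0 licence in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```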
The download mirror on Hugging Face buckled under demand on Saturday, going down for two hours.