Mistral open-sources 600B-parameter mixture-of-experts model

The release sits between the open-weights giants and frontier closed models. The licensing fine print is the most permissive in its weight class.

Mistral on Wednesday released Mistral Medium 4, a 600-billion-parameter sparse mixture-of-experts model, under an Apache 2.0-compatible licence that imposes no usage restrictions on output and no revenue threshold for commercial use.

The model activates approximately 39 billion parameters per forward pass and was trained on roughly 18 trillion tokens, according to a technical report published alongside the weights. Public benchmark results place it within five points of GPT-5.1 on most reasoning suites and ahead on multilingual tasks, particularly French, Arabic, and Hindi.
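For readers unfamiliar with sparse mixture-of-experts designs, the gap between 600 billion total and roughly 39 billion active parameters comes from routing: each token is sent to only a few of the model's expert feed-forward blocks. The sketch below is a minimal, generic top-k router in PyTorch, purely for illustration; the expert count, dimensions, and top-2 routing are hypothetical and are not taken from Mistral's technical report.

    # Minimal sketch of generic sparse MoE routing (illustrative only;
    # all sizes are hypothetical, not Mistral Medium 4's configuration).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SparseMoE(nn.Module):
        def __init__(self, d_model=1024, d_ff=4096, n_experts=16, top_k=2):
            super().__init__()
            self.top_k = top_k
            self.router = nn.Linear(d_model, n_experts)  # gating network
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                              nn.Linear(d_ff, d_model))
                for _ in range(n_experts)
            )

        def forward(self, x):                       # x: (tokens, d_model)
            scores = self.router(x)                 # (tokens, n_experts)
            weights, idx = scores.topk(self.top_k, dim=-1)  # top-k experts per token
            weights = F.softmax(weights, dim=-1)    # normalise gate weights
            out = torch.zeros_like(x)
            for slot in range(self.top_k):
                for e in range(len(self.experts)):
                    mask = idx[:, slot] == e        # tokens routed to expert e
                    if mask.any():
                        out[mask] += weights[mask, slot, None] * self.experts[e](x[mask])
            return out

    tokens = torch.randn(8, 1024)
    print(SparseMoE()(tokens).shape)  # torch.Size([8, 1024]); only 2 of 16 experts run per token

In a model like Mistral Medium 4, this is the mechanism that lets a 600-billion-parameter network execute only about 39 billion parameters per forward pass: compute per token scales with the experts selected, not with the total parameter count.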

Why the licence matters

Meta's Llama 4 family, previously the reference point among open-weights releases, carries a licence clause requiring companies with more than 700 million monthly active users to obtain a separate commercial licence. Mistral's release has no equivalent.

Several enterprise customers told VirtueSig the looser terms made internal procurement materially simpler — a small but meaningful tailwind for arguments in favour of European AI sovereignty.
