Mistral on Wednesday released Mistral Medium 4, a 600-billion-parameter sparse mixture-of-experts model, under an Apache 2.0-compatible licence that imposes no usage restrictions on model outputs and no revenue threshold for commercial use.
The model activates approximately 39 billion parameters per forward pass and was trained on roughly 18 trillion tokens, according to a technical report published alongside the weights. Public benchmark results place it within five points of GPT-5.1 on most reasoning suites and ahead of it on multilingual tasks, particularly in French, Arabic, and Hindi.
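For readers unfamiliar with sparse mixture-of-experts accounting, the gap between 600 billion total parameters and roughly 39 billion active ones comes from routing each token through only a few of the feed-forward experts in each layer. The sketch below illustrates that arithmetic. Every dimension in it is hypothetical: Mistral has not published Medium 4's depth, hidden size, expert count, or routing top-k, so these values are chosen only to land near the reported totals, not to describe the actual architecture.

```python
# Back-of-the-envelope parameter accounting for a sparse MoE transformer.
# All dimensions are HYPOTHETICAL, picked only to land near the reported
# ~600B total / ~39B active figures; they are not Medium 4's real config.

N_LAYERS = 60        # assumed transformer depth
D_MODEL = 8192       # assumed hidden size
D_FF = 8192          # assumed per-expert FFN width (SwiGLU-style)
N_EXPERTS = 48       # assumed experts per MoE layer
TOP_K = 2            # assumed experts routed per token
VOCAB = 131_072      # assumed vocabulary size


def attention_params(d_model: int) -> int:
    """Q, K, V, and output projections: four d_model x d_model matrices.
    (Real models often use grouped-query attention, which shrinks K/V.)"""
    return 4 * d_model * d_model


def expert_params(d_model: int, d_ff: int) -> int:
    """One SwiGLU FFN expert: gate, up, and down projections."""
    return 3 * d_model * d_ff


attn = attention_params(D_MODEL)
expert = expert_params(D_MODEL, D_FF)
embeddings = 2 * VOCAB * D_MODEL  # input embedding + untied output head

# Total stores every expert; a forward pass only touches TOP_K of them.
total = embeddings + N_LAYERS * (attn + N_EXPERTS * expert)
active = embeddings + N_LAYERS * (attn + TOP_K * expert)

print(f"total parameters: {total / 1e9:.0f}B")   # ~598B
print(f"active per token: {active / 1e9:.0f}B")  # ~42B
```

The exact published figures depend on counting conventions, such as whether embeddings are included in the active count and which attention variant is used, which is why this toy config lands near rather than exactly on the reported numbers.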
Why the licence matters
Meta's Llama 4 family, previously the reference point for open-weights releases, includes a 700-million-monthly-active-user threshold above which a separate commercial licence from Meta is required. Mistral's release has no equivalent clause.
Several enterprise customers told VirtueSig that the looser terms made internal procurement materially simpler, a small but meaningful tailwind for European AI sovereignty arguments.