Chinese startup MiniMax enters the open-source LLM race
Chinese AI startup MiniMax has released MiniMax-M1, a new open-source language model.
Key features of MiniMax-M1:
- Designed to outperform DeepSeek's R1.
- Reasoning-focused model with a context window of up to one million tokens and a “thinking” budget of up to 80,000 tokens.
- Trained with an efficient reinforcement learning approach, which MiniMax says makes it cheaper to train and run than comparable open-source models.
- Available for free under the Apache-2.0 license on Hugging Face.
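Since the weights are published on Hugging Face, the model can in principle be loaded with the standard transformers API. The sketch below illustrates this under assumptions: the repository id `MiniMaxAI/MiniMax-M1-80k` is inferred from the announcement, and the full model is far too large for a single consumer GPU, so treat this as illustrative rather than a turnkey setup.

```python
# Minimal sketch: loading MiniMax-M1 via Hugging Face transformers.
# The repo id below is an assumption based on the announced release name;
# the full model requires multi-GPU hardware, so this is illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MiniMaxAI/MiniMax-M1-80k"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # custom architecture code may be required
    device_map="auto",       # shard weights across available GPUs
    torch_dtype="auto",
)

messages = [{"role": "user", "content": "Explain reinforcement learning in two sentences."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```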
In benchmarks, it outperforms other open models such as DeepSeek-R1-0528 and Qwen3-235B-A22B. Its performance on the OpenAI MRCR test, which measures a model's ability to track and recall information across very long, multi-turn contexts, comes close to the leading closed model, Gemini 2.5 Pro.