New Lightning Attention Architecture LLM // 1st production model

Up to 4M context // open source

sbagency
2 min readJan 15, 2025
https://x.com/Hailuo_AI/status/1879229798649856343
https://www.minimaxi.com/en/news/minimax-01-series-2
https://huggingface.co/MiniMaxAI/MiniMax-Text-01
https://filecdn.minimax.chat/_Arxiv_MiniMax_01_Report.pdf

We introduce the MiniMax-01 series, including MiniMax-Text-01 and MiniMax-VL-01, which are comparable to top-tier models while offering superior capabilities in processing longer contexts. The core lies in lightning attention and its efficient scaling. To maximize computational capacity, we integrate it with Mixture of Experts (MoE), creating a model with 32 experts and 456 billion total parameters, of which 45.9 billion are activated for each token. We develop an optimized parallel strategy and highly efficient computation-communication overlap techniques for MoE and lightning attention. This approach enables us to conduct efficient training and inference on models with hundreds of billions of parameters across contexts spanning millions of tokens. The context window of MiniMax-Text-01 can reach up to 1 million tokens during training and extrapolate to 4 million tokens during inference at an affordable cost. Our vision-language model, MiniMax-VL-01, is built through continued training with 512 billion vision-language tokens. Experiments on both standard and in-house benchmarks show that our models match the performance of state-of-the-art models like GPT-4o and Claude-3.5-Sonnet while offering a 20–32 times longer context window. We publicly release MiniMax-01 at https://github.com/MiniMax-AI.
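Lightning attention belongs to the linear-attention family: instead of the O(n²) softmax score matrix, it maintains a running key-value state so that cost grows linearly with sequence length, which is what makes million-token contexts feasible. The sketch below shows only that core kernel-trick recurrence in plain NumPy; it is a hedged illustration, not MiniMax's tiled, hardware-optimized implementation, and the feature map `phi` and all names are assumptions for illustration.

```python
import numpy as np

def linear_attention(Q, K, V):
    """Causal linear attention via a running KV state.

    Softmax attention recomputes an n x n score matrix (O(n^2));
    here each step only updates S_t = sum_{i<=t} k_i v_i^T and a
    normalizer z_t = sum_{i<=t} k_i, so cost is linear in n.
    """
    n, d = Q.shape
    phi = lambda x: np.maximum(x, 0.0) + 1e-6  # positive feature map (illustrative choice)
    S = np.zeros((d, V.shape[1]))              # running sum of outer products k v^T
    z = np.zeros(d)                            # running sum of keys, for normalization
    out = np.zeros_like(V)
    for t in range(n):
        q, k, v = phi(Q[t]), phi(K[t]), V[t]
        S += np.outer(k, v)                    # fold token t into the state
        z += k
        out[t] = (q @ S) / (q @ z + 1e-6)      # attend via the state, not all past tokens
    return out

# Tiny smoke test: output has the same shape as V and stays finite.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8, 4))
O = linear_attention(Q, K, V)
print(O.shape)
```

Because the per-token state has fixed size regardless of how many tokens precede it, inference memory does not grow with context length, which is the property the report leverages when extrapolating from a 1M-token training window to 4M tokens at inference.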

Figure 1 | Benchmark performance. (a) MiniMax-Text-01 on core text benchmarks. (b) MiniMax-VL-01 on core multimodal benchmarks. (c) MiniMax-Text-01 on the long-context RULER (Hsieh et al., 2024) benchmark. The performance of leading commercial and open-source models is presented for reference.
https://x.com/minchoi/status/1879262618608656805
