Linear-MoE: Linear Sequence Modeling Meets Mixture-of-Experts

Mar 7, 2025 · Weigao Sun, Disen Lan, Tong Zhu, Xiaoye Qu, Yu Cheng

PDF · Cite · Code