kimi k2 - MoE (mixture of experts) language model

Discussion in 'Science and Technology' started by abu Hasan, Jul 13, 2025.

  1. abu Hasan

    abu Hasan Administrator

    https://www.kimi.com/

    Kimi K2 is the latest open-source large-scale Mixture-of-Experts (MoE) language model released by Moonshot AI. It has 1 trillion total parameters with 32 billion activated per token, was trained on 15.5T tokens with the Muon optimizer, and achieves strong performance in frontier knowledge, reasoning, coding, and agentic tasks such as tool use and autonomous problem-solving. Two versions are available: Kimi-K2-Base, the foundation model for researchers and builders, and Kimi-K2-Instruct, the post-trained model for general chat and agentic experiences.
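    The gap between 1 trillion total parameters and 32 billion activated parameters comes from how MoE models work: a router selects only a few experts per token, so most of the network sits idle for any given input. A minimal NumPy sketch of top-k expert routing (illustrative only, with tiny hypothetical sizes; not Kimi K2's actual code or configuration):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    num_experts = 8   # total experts (toy number; real models use many more)
    top_k = 2         # experts activated per token
    d_model = 4       # hidden size (toy number)

    # Each expert is a simple linear layer; the gate scores experts per token.
    experts = [rng.standard_normal((d_model, d_model)) for _ in range(num_experts)]
    gate_w = rng.standard_normal((d_model, num_experts))

    def moe_forward(x):
        # Router: softmax over expert scores for this token
        logits = x @ gate_w
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        # Only the top-k experts' parameters are used ("activated")
        top = np.argsort(probs)[-top_k:]
        weights = probs[top] / probs[top].sum()
        return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

    y = moe_forward(rng.standard_normal(d_model))

    total_params = num_experts * d_model * d_model
    active_params = top_k * d_model * d_model
    print(f"total expert params: {total_params}, activated per token: {active_params}")
    ```

    Scaling the same idea up is what lets a 1T-parameter model run with roughly the compute cost of a 32B dense model: the ratio of activated to total parameters is set by top_k / num_experts.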
     
