July 13, 2025 - Aravind Srinivas, CEO of U.S. AI startup Perplexity, posted today that Moonshot AI's (Dark Side of the Moon's) Kimi K2 model performed well in testing, and that the company may follow up with post-training based on K2.

Live Mint reported in January that Perplexity had previously used DeepSeek R1 for model training.
Kimi K2, released just yesterday, is Moonshot AI's first trillion-parameter open-source model, emphasizing coding ability and general-purpose Agent task capabilities. It is a MoE-architecture base model specialized in generic Agent tasks, with 1T total parameters and 32B activated parameters.
1AI was officially informed by Moonshot AI that Kimi K2 has achieved SOTA scores among open-source models on benchmarks such as SWE-bench Verified, Tau2, and AceBench, demonstrating leading capabilities in coding, Agent, and mathematical reasoning tasks.