Perplexity CEO Praises Dark Side of the Moon, Wants Kimi K2-Based Post-Training

July 13, 2025 - U.S. AI startup Perplexity's CEO, Aravind Srinivas, posted today that Dark Side of the Moon's Kimi K2 model performed well in testing, and that the company may follow up with post-training based on K2.


Live Mint reported in January that Perplexity had previously used DeepSeek R1 for model training.

Kimi K2 is Dark Side of the Moon's first trillion-parameter open-source model, released just yesterday, with an emphasis on coding strength and general-purpose Agent task capabilities. It is a Mixture-of-Experts (MoE) base model specialized in generic Agent tasks, with 1T total parameters and 32B activated parameters.

1AI was officially informed by Dark Side of the Moon that Kimi K2 has achieved state-of-the-art (SOTA) scores among open-source models on benchmarks such as SWE-bench Verified, Tau2, and AceBench, demonstrating leading capabilities in coding, Agent, and mathematical reasoning tasks.
