{"id":31904,"date":"2025-03-30T19:32:54","date_gmt":"2025-03-30T11:32:54","guid":{"rendered":"https:\/\/www.1ai.net\/?p=31904"},"modified":"2025-03-30T19:32:54","modified_gmt":"2025-03-30T11:32:54","slug":"%e6%9d%8e%e5%bc%80%e5%a4%8d%ef%bc%9a%e9%9b%b6%e4%b8%80%e4%b8%87%e7%89%a9%e6%ad%a3%e5%9f%ba%e4%ba%8e-deepseek%ef%bc%8c%e6%89%93%e9%80%a0-ai-2-0-%e6%97%b6%e4%bb%a3%e7%9a%84-windows","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/31904.html","title":{"rendered":"Kai-Fu Lee: Zero-One Everything is Building Windows for the AI 2.0 Era Based on DeepSeek"},"content":{"rendered":"<p>On the afternoon of March 30, at the 2025 <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e4%b8%ad%e5%85%b3%e6%9d%91%e8%ae%ba%e5%9d%9b\" title=\"View articles with this tag\" target=\"_blank\" >Zhongguancun Forum<\/a> Annual Meeting, <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e9%9b%b6%e4%b8%80%e4%b8%87%e7%89%a9\" title=\"View articles with this tag\" target=\"_blank\" >Zero One Everything<\/a> CEO and Chairman of Innovation Works <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e6%9d%8e%e5%bc%80%e5%a4%8d\" title=\"View articles with this tag\" target=\"_blank\" >Kai-Fu Lee<\/a> said, \"The inference cost of large models is falling rapidly, roughly tenfold per year, which provides a very important condition for AI-First applications to explode.\"<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-31905\" title=\"71a3ca3dj00stxpdy009sd000fa00a8p\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/03\/71a3ca3dj00stxpdy009sd000fa00a8p.jpg\" alt=\"71a3ca3dj00stxpdy009sd000fa00a8p\" width=\"550\" height=\"368\" \/><\/p>\n<p>\"Models that didn't perform well enough two years ago are now good enough; <strong>models that were too expensive to run two years ago are now at 'cabbage prices'<\/strong>.\" According to Kai-Fu Lee, \"AI-First applications will soon take off, and 2025 will be the year when AI-First 
applications explode and 'landing is king' for large models.\"<\/p>\n<p>He noted that the Scaling Law is slowing down in the pre-training phase: the amount of data available for model training has hit a bottleneck, and computing power faces objective constraints. The good news is that the industry has found a new path forward. <strong>The Scaling Law is shifting from the pre-training phase to the inference phase, namely the slow-thinking mode<\/strong>. The pre-training Scaling Law meant that with more GPUs and more data, models would get smarter, but that growth is now slowing. The new slow-thinking Scaling Law means that the longer a model thinks, the better the results it produces.<\/p>\n<p>\"Model performance under the slow-thinking Scaling Law is currently improving very fast, and there is still plenty of room for growth,\" Kai-Fu Lee said. \"With these new technological innovations, model training now looks more like first raising a 'liberal arts student' that reads all the books, then training it as a 'science student' that can prove math theorems and write code, eventually producing a strong all-round 'arts and science' model.\"<\/p>\n<p>In Kai-Fu Lee's view, now that enterprises and users have gone through the market education of the \"DeepSeek Moment\", the Chinese market has truly awakened, clearing a major obstacle to the explosion of AI-First applications in China. 
This year, one focus of artificial intelligence should be Make AI Work: letting large models truly empower every industry.<\/p>\n<p>According to him, many sticking points remain to be overcome in bringing DeepSeek into enterprise productivity scenarios. <strong>Zero One Everything has made a strategic shift in the past few months and has fully embraced DeepSeek<\/strong>, devoting much of its effort to turning DeepSeek's high-quality foundation models into customized enterprise DeepSeek deployment solutions. By analogy, Zero One Everything is building the Windows of the AI 2.0 era, with DeepSeek as the kernel that drives it.<\/p>","protected":false},"excerpt":{"rendered":"<p>On the afternoon of March 30, at the 2025 Zhongguancun Forum Annual Meeting, Zero One Everything CEO and Chairman of Innovation Works Kai-Fu Lee said that \u201cthe inference cost of large models is falling rapidly, roughly tenfold per year, which provides a very important condition for AI-First applications to explode\u201d. 
\u201cModels that didn't perform well enough two years ago are now good enough; models that were too expensive to run two years ago are now at 'cabbage prices'.\u201d In his view, \"AI-First applications will soon take off, and 2025 will be the year when AI-First applications explode and 'landing is king' for large models.\" He noted that since the amount of data used in model training has hit a bottleneck and computing power faces objective constraints, pre-training<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[1446,1518,1519],"collection":[],"class_list":["post-31904","post","type-post","status-publish","format-standard","hentry","category-news","tag-1446","tag-1518","tag-1519"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/31904","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=31904"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/31904\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=31904"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=31904"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=31904"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=31904"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]
}}