The future is not about saving compute, but about consuming more of it

January 1 — According to Shinjimoto, Yuyang, a Presidential Young Professor at the National University of Singapore and founder of Sunshine Science, yesterday published an in-depth article on the bottlenecks of intelligence scaling, exploring the AI industry's current Scaling Law dilemma and the way forward.


According to the article, the technical essence of large AI models over the last 10 years has been to transform electrical energy, through computation, into reusable intelligence.

The core issue facing the industry is not that compute is running out, but that the existing computation paradigm — including model architectures, loss functions, and optimization algorithms — is declining in its ability to "digest" compute. In other words, current methods can no longer convert sustained growth in compute into proportional leaps in intelligence.
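The diminishing-returns dynamic described here can be sketched with a toy power-law scaling curve. The functional form and constants below are illustrative assumptions, not figures from the article:

```python
# Toy illustration of diminishing returns under a power-law scaling curve.
# Loss is assumed to fall as L(C) = a * C**(-b) with compute C; the
# constants a and b are made up for illustration.
def loss(compute, a=10.0, b=0.1):
    return a * compute ** (-b)

# Each 10x increase in compute buys a smaller absolute drop in loss.
drops = []
for exp in range(3):
    c = 10 ** exp
    drops.append(loss(c) - loss(10 * c))

# The marginal gain shrinks with every additional order of magnitude.
assert drops[0] > drops[1] > drops[2] > 0
```

Under any curve of this shape, scaling compute alone keeps helping, but each additional order of magnitude buys less — which is the "digestion" problem the article points at.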

This view echoes recent judgments from industry leaders.

Ilya Sutskever has publicly stated that simply stacking pre-training compute is entering a plateau.

Yann LeCun believes that the current large language model path cannot reach true AGI.

OpenAI CEO Sam Altman has also implicitly acknowledged that GPU growth alone will no longer deliver proportional gains in intelligence.

The article redefines the term "bottleneck," emphasizing the need to distinguish between "efficiency improvement" and the "intelligence ceiling."

The former — techniques such as pruning, distillation, and low-precision training — aims to achieve the same results with fewer resources and is essential for engineering deployment; the latter asks how to train more capable, more general models given the same total floating-point compute budget.
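The "efficiency" side of that distinction can be illustrated with a minimal sketch of symmetric int8 weight quantization, a close cousin of the low-precision techniques named above. The weight values are made up, and real quantization pipelines are far more involved:

```python
# Minimal sketch of symmetric int8 quantization: the same weights are
# stored in a quarter of the bits (float32 -> int8) at the cost of a
# small, bounded rounding error.  Weight values are hypothetical.
weights = [0.81, -0.42, 0.07, -0.93, 0.25]

# Map the largest magnitude onto the int8 range [-127, 127].
scale = max(abs(w) for w in weights) / 127

quantized = [round(w / scale) for w in weights]   # small integers
restored = [q * scale for q in quantized]         # dequantized floats

# Round-to-nearest keeps every value within half a quantization step.
error = max(abs(w - r) for w, r in zip(weights, restored))
assert error <= scale / 2 + 1e-9
```

This is "efficiency improvement" in the article's sense: the model does the same thing with fewer resources, but nothing about the exercise raises its intelligence ceiling.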

He noted that the GPU is essentially a parallel computer, and that Transformer won precisely because its parallel computation pattern fits the GPU hardware architecture perfectly.

However, as compute continues to grow, the real bottleneck is how an AI model can "consume" more compute per unit of time and turn it into intelligence, rather than simply seeking to save compute.

On the way forward, the article suggests several directions worth attention:

Exploring higher numerical precision (e.g., from FP16 to FP32 or even FP64)

Introducing higher-order optimizers that provide better parameter-update paths

Designing more expressive model architectures and loss functions, optimizing not for throughput alone but for the model's expressive power at the limit of its compute budget

Carrying out more thorough training and hyperparameter searches.
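One of the directions above — higher-order optimizers — can be sketched on a toy one-dimensional quadratic, where a Newton step uses curvature information to land on the minimum in a single update while plain gradient descent needs many. The function and step size are illustrative assumptions, not anything from the article:

```python
# Toy comparison on f(x) = (x - 3)**2, with gradient f'(x) = 2*(x - 3)
# and constant curvature f''(x) = 2.
def grad(x):
    return 2 * (x - 3)

# Second-order (Newton) update: x - f'(x) / f''(x).
# Dividing the gradient by the curvature reaches the minimum in one step.
x_newton = 10.0
x_newton -= grad(x_newton) / 2.0

# First-order gradient descent with step size 0.1 converges geometrically
# and needs dozens of updates to get within the same tolerance.
x_gd, steps = 10.0, 0
while abs(x_gd - 3) > 1e-6:
    x_gd -= 0.1 * grad(x_gd)
    steps += 1

assert abs(x_newton - 3) < 1e-12   # one Newton step suffices
assert steps > 50                  # many first-order steps needed
```

On real, non-quadratic loss surfaces the curvature must be estimated rather than read off, which is exactly why better parameter-update paths cost more compute per step — compute the article argues future models should be able to consume.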

In Yuyang's summary, if the central question of the last 10 years was "how to obtain more compute," the question of the next stage will be "how to truly turn that compute into intelligence."
