On March 12, according to reports, Nvidia plans to invest $26 billion over the next five years to develop open-weight AI models and complete systems spanning model development, computing infrastructure, talent, and ecosystem.

According to the report, the route Nvidia has chosen is not fully open source but an "open weight" strategy: model parameters are released openly without the model being bound by a full open-source license. This approach sits between OpenAI's closed systems and the fully open-source Meta Llama, and better matches demand for transparency and customization.
As concrete progress, Nvidia has released its strongest open-weight model to date, Nemotron 3 Super, with 128 billion parameters, which it claims surpasses OpenAI's GPT-OSS on multiple benchmark tests. The company also disclosed that pre-training has been completed for a 550-billion-parameter model intended to probe the limits of next-generation hardware architectures.
By contrast, the core models of OpenAI, Anthropic, and Google remain closed, offering only cloud access, and Meta has also signaled that its open-source strategy may tighten in the future.
Analysts believe that if Nvidia captures a 10% share of the foundation model market while maintaining its hardware advantage, it could generate an additional $50 billion per year over the next three years.