{"id":50850,"date":"2026-03-12T12:25:22","date_gmt":"2026-03-12T04:25:22","guid":{"rendered":"https:\/\/www.1ai.net\/?p=50850"},"modified":"2026-03-12T12:25:22","modified_gmt":"2026-03-12T04:25:22","slug":"%e8%8b%b1%e4%bc%9f%e8%be%be%e8%ae%a1%e5%88%92-5-%e5%b9%b4%e5%86%85%e6%8a%95-260-%e4%ba%bf%e7%be%8e%e5%85%83%e5%bc%80%e5%8f%91%e5%bc%80%e6%94%be%e6%9d%83%e9%87%8d-ai%e6%a8%a1%e5%9e%8b","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/50850.html","title":{"rendered":"Nvidia Plans to Invest $26 Billion Over Five Years to Develop Open-Weight AI Models"},"content":{"rendered":"<p>On March 12, according to a report by Wired, <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e8%8b%b1%e4%bc%9f%e8%be%be\" title=\"Look at the article with the label\" target=\"_blank\" >Nvidia<\/a> plans to invest $26 billion over the next five years to develop open-weight AI models and build a complete system covering model development, computing infrastructure, talent, and ecosystem.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-50851\" title=\"3168f3b6j00tbrqxe001vd000u00gwm\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2026\/03\/3168f3b6j00tbrqxe001vd000u000gwm.jpg\" alt=\"3168f3b6j00tbrqxe001vd000u00gwm\" width=\"1080\" height=\"608\" \/><\/p>\n<p>According to the report, the route Nvidia has chosen is not fully open source but an \"open weight\" strategy: the model parameters are released without fully complying with an open-source license. This approach sits between OpenAI's closed systems and Meta's fully open-source Llama, and better matches enterprise needs for transparency and customization.<\/p>\n<p>In terms of concrete progress, Nvidia has released its strongest open-weight model to date, Nemotron 3 Super, with 128 billion parameters, which it claims outperforms OpenAI's GPT-OSS in multiple benchmark tests.
The company also disclosed that pre-training of a 550-billion-parameter super-scale model has been completed, intended to probe the limits of next-generation hardware architectures.<\/p>\n<p>By contrast, the core models of OpenAI, Anthropic, and Google remain closed, offering only cloud access; Meta has also hinted that its future open-source strategy may tighten.<\/p>\n<p>Analysts believe that if Nvidia captures a 10% share of the foundation model market while maintaining its hardware advantage, it could generate an additional $50 billion in revenue per year over the next three years.<\/p>","protected":false},"excerpt":{"rendered":"<p>According to a March 12 report by Wired, Nvidia plans to invest $26 billion over the next five years to develop open-weight AI models and build a complete system covering model development, computing infrastructure, talent, and ecosystem. According to the report, the route Nvidia has chosen is not fully open source but an \"open weight\" strategy: the model parameters are released without fully complying with an open-source license. This approach sits between OpenAI's closed systems and Meta's fully open-source Llama, and better matches enterprise needs for transparency and customization. 
In terms of concrete progress, Nvidia has released its strongest open-weight model to date, Nemotron 3 Super, with 128 billion parameters and multiple benchmarks<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[167,239],"collection":[],"class_list":["post-50850","post","type-post","status-publish","format-standard","hentry","category-news","tag-ai","tag-239"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/50850","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=50850"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/50850\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=50850"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=50850"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=50850"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=50850"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}