{"id":45456,"date":"2025-10-31T12:13:54","date_gmt":"2025-10-31T04:13:54","guid":{"rendered":"https:\/\/www.1ai.net\/?p=45456"},"modified":"2025-10-31T12:13:54","modified_gmt":"2025-10-31T04:13:54","slug":"%e5%89%8d%e9%98%bf%e9%87%8c%e3%80%81%e5%ad%97%e8%8a%82%e5%a4%a7%e6%a8%a1%e5%9e%8b%e8%b4%9f%e8%b4%a3%e4%ba%ba%ef%bc%9aagi-%e4%b8%8d%e5%ba%94%e6%98%af%e7%ae%97%e5%8a%9b%e7%ab%9e%e8%b5%9b%ef%bc%8c","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/45456.html","title":{"rendered":"Former Alibaba and ByteDance Large-Model Head: AGI Should Not Be a Compute Race, but \u201cAll-People Collaboration\u201d"},"content":{"rendered":"<p>On October 31st, according to Zhidongxi, <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e6%9d%a8%e7%ba%a2%e9%9c%9e\" title=\"[View articles with this tag]\" target=\"_blank\" >Yang Hongxia<\/a>, the former head of large models at Alibaba and ByteDance, founded a new company, <a href=\"https:\/\/www.1ai.net\/en\/tag\/infix\" title=\"[View articles with the InfiX tag]\" target=\"_blank\" >InfiX<\/a>.ai, in July 2024 after leaving ByteDance, and revealed its latest progress in Hong Kong yesterday.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-45457\" title=\"7fe8d084j00t4zadg000yd000p000mxm\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/10\/7fe8d084j00t4zadg000yd000p000mxm.jpg\" alt=\"7fe8d084j00t4zadg000yd000p000mxm\" width=\"900\" height=\"825\" \/><\/p>\n<p>Yang Hongxia argued that large-model pre-training should not be a compute race among a handful of giants; through \u201cdecentralization\u201d, small and medium-sized enterprises, research institutions and even individuals should be able to take part.<\/p>\n<p>Yang Hongxia led the development of the M6 large model at Alibaba's DAMO Academy, and later continued to work on large models at ByteDance.<\/p>\n<p><strong>She noted that while the existing \u201ccentralized\u201d model can deliver technological breakthroughs, it has limitations in real-world deployment, especially in data-sensitive and localized scenarios, where post-training struggles to fill the knowledge gaps left at the pre-training stage, leading to frequent hallucinations.<\/strong><\/p>\n<p>InfiX.ai's core technical directions include:<\/p>\n<p>FP8 training framework InfiR2: speeds up training by up to 22% and cuts GPU memory consumption by 14%, with almost no loss of performance<\/p>\n<p>Model fusion technology InfiFusion: fuses \u201cexpert models\u201d from different domains, avoiding redundant training and expanding knowledge coverage<\/p>\n<p>Medical model InfiMed: demonstrates reasoning capability beyond comparable models on complex medical tasks such as cancer<\/p>\n<p>Multi-agent technology: automatically decomposes complex tasks, reducing development costs.<\/p>\n<p>Yang Hongxia stressed that in the future every enterprise will have its own domain-specific large model, and that integrating knowledge across domains and regions through model fusion will form the basis of globalization. She believes that artificial general intelligence (<a href=\"https:\/\/www.1ai.net\/en\/tag\/agi\" title=\"[View articles with the AGI tag]\" target=\"_blank\" >AGI<\/a>) should not be limited to top institutions, but will evolve into \u201call-people collaboration\u201d.<\/p>","protected":false},"excerpt":{"rendered":"<p>On October 31st, according to Zhidongxi, Yang Hongxia, the former head of large models at Alibaba and ByteDance, founded a new company, InfiX.ai, in July 2024 after leaving ByteDance, and revealed its latest progress in Hong Kong yesterday. Yang Hongxia argued that large-model pre-training should not be a compute race among a handful of giants; through \u201cdecentralization\u201d, small and medium-sized enterprises, research institutions and even individuals should be able to take part. Yang Hongxia led the development of the M6 large model at Alibaba's DAMO Academy, and later continued to work on large models at ByteDance. 
She noted that while the existing \u201ccentralized\u201d model provides technological breakthroughs, it has limitations in real-world deployment, particularly in data-sensitive and localized scenarios, where post-training struggles to fill the knowledge gaps left at the pre-training stage, leading to hallucinations<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[151,7811,7812],"collection":[],"class_list":["post-45456","post","type-post","status-publish","format-standard","hentry","category-news","tag-agi","tag-infix","tag-7812"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/45456","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=45456"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/45456\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=45456"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=45456"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=45456"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=45456"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}