{"id":31446,"date":"2025-03-24T19:12:29","date_gmt":"2025-03-24T11:12:29","guid":{"rendered":"https:\/\/www.1ai.net\/?p=31446"},"modified":"2025-03-24T19:12:29","modified_gmt":"2025-03-24T11:12:29","slug":"%e6%b6%88%e6%81%af%e7%a7%b0%e8%9a%82%e8%9a%81%e9%9b%86%e5%9b%a2%e9%87%87%e7%94%a8%e9%98%bf%e9%87%8c%e3%80%81%e5%8d%8e%e4%b8%ba%e7%ad%89%e5%9b%bd%e4%ba%a7%e8%8a%af%e7%89%87%e8%ae%ad%e7%bb%83-ai","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/31446.html","title":{"rendered":"Ant Group adopts Ali, Huawei and other domestic chips to train AI: performance rivals NVIDIA H800, cost reduction 20%"},"content":{"rendered":"<p>March 24 (Bloomberg) -- According to Bloomberg today, people familiar with the matter revealed that<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e8%9a%82%e8%9a%81%e9%9b%86%e5%9b%a2\" title=\"[Sees articles with labels]\" target=\"_blank\" >Ant Group<\/a>developing AI model training techniques using chips made in China.<strong>This will result in lower costs 20%<\/strong>.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-31447\" title=\"7cc781b1j00stmkft001vd000p100isp\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/03\/7cc781b1j00stmkft001vd000p100isp.jpg\" alt=\"7cc781b1j00stmkft001vd000p100isp\" width=\"901\" height=\"676\" \/><\/p>\n<p>Ant Group used domestic chips including Alibaba Group Holding Ltd. and Huawei Technologies Co. to employ a mixed expert model (note: MoE, Mixture of experts) machine learning approach, the report said.<\/p>\n<p>The report also mentions that<strong>The results of the training were similar to the<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e8%8b%b1%e4%bc%9f%e8%be%be\" title=\"Look at the article with the label\" target=\"_blank\" >Nvidia<\/a>Company H800 Chip Match<\/strong>. 
One of the people familiar with the matter revealed that Ant Group still uses Nvidia chips for AI development, but now relies mainly on alternatives, including chips from AMD and domestic suppliers.<\/p>\n<p>The report suggests that this highlights Chinese companies' attempts to use local chips as an alternative to state-of-the-art NVIDIA semiconductors. Ant Group released a study this month saying its model outperformed Meta's models in some benchmarks; if it works as expected, Ant's platform could mark another step forward for AI development in China.<\/p>\n<p>With major companies pouring money into AI, MoE models have become a popular option, gaining recognition through adoption by companies such as Google and DeepSeek. The technique divides a task among smaller specialized sub-models, much like a team of experts each focusing on one part of the job, making the process more efficient. Ant Group declined to comment in an emailed statement.<\/p>\n<p>The report also mentions that Ant Group has been working on ways to train LLMs more efficiently and to sidestep the constraints imposed by high-performance chips. The title of its paper makes the goal explicit: scaling models \"<strong>without high-end GPUs<\/strong>\".<\/p>\n<p>This runs contrary to NVIDIA's philosophy: CEO Jen-Hsun Huang believes that compute demand will keep growing even as more efficient models such as DeepSeek R1 emerge, and that major companies need better chips to generate more revenue, not cheaper chips to cut costs. He is sticking with the strategy of building larger GPUs with more processing cores and transistors.<\/p>","protected":false},"excerpt":{"rendered":"<p>March 24 (Bloomberg) -- Ant Group is using Chinese-made chips to develop AI model training technology, which will reduce costs by 20%, according to Bloomberg today. Ant Group is using domestic chips including Alibaba Group Holding Ltd. and Huawei Technologies Co. 
to develop AI model training technology using a Mixture-of-Experts (MoE) machine learning method. The report also mentioned that the training results rival NVIDIA's H800 chip. One of the people familiar with the matter revealed that Ant Group still uses NVIDIA chips for AI development, but now relies mainly on alternatives including AMD and domestic chips. The report suggests that this highlights Chinese companies' attempts to use local chips to replace<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[6061,239,1030],"collection":[],"class_list":["post-31446","post","type-post","status-publish","format-standard","hentry","category-news","tag-6061","tag-239","tag-1030"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/31446","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=31446"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/31446\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=31446"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=31446"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=31446"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=31446"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}