{"id":15299,"date":"2024-07-11T09:09:10","date_gmt":"2024-07-11T01:09:10","guid":{"rendered":"https:\/\/www.1ai.net\/?p=15299"},"modified":"2024-07-11T09:09:10","modified_gmt":"2024-07-11T01:09:10","slug":"grok2%e5%8d%b3%e5%b0%86%e5%8f%91%e5%b8%83-xai%e5%8a%a0%e9%80%9fai%e7%ab%9e%e8%b5%9b%ef%bc%9a10%e4%b8%87gpu%e8%b6%85%e7%ae%97%e6%9c%ac%e6%9c%88%e5%ba%95%e4%ba%a4%e4%bb%98","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/15299.html","title":{"rendered":"Grok2 coming soon as xAI accelerates the AI race: 100,000-GPU supercomputer to be delivered by the end of this month"},"content":{"rendered":"<p><a href=\"https:\/\/www.1ai.net\/en\/tag\/%e9%a9%ac%e6%96%af%e5%85%8b\" title=\"[View articles tagged with [Musk]]\" target=\"_blank\" >Musk<\/a> announced on July 9 that his artificial intelligence company <a href=\"https:\/\/www.1ai.net\/en\/tag\/xai\" title=\"[View articles tagged with [xAI]]\" target=\"_blank\" >xAI<\/a> is building a supercomputer with 100,000 Nvidia H100 <a href=\"https:\/\/www.1ai.net\/en\/tag\/gpu\" title=\"[View articles tagged with [GPU]]\" target=\"_blank\" >GPU<\/a>s, which is expected to be delivered and begin training by the end of this month. The move marks the end of xAI&#039;s talks with Oracle to expand their existing agreement to lease more Nvidia chips.<\/p>\n<p>Musk stressed that this would be \u201cthe most powerful training cluster in the world, with a large lead\u201d. 
He stated that the core competitiveness of xAI lies in speed: \u201cit is the only way to close the gap\u201d.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-15300\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/07\/6385620081410482239411871.png\" alt=\"\" width=\"661\" height=\"472\" \/><\/p>\n<p>Prior to this, xAI had rented the computing power of 24,000 H100 chips from Oracle to train <a href=\"https:\/\/www.1ai.net\/en\/tag\/grok2\" title=\"[View articles tagged with [Grok2]]\" target=\"_blank\" >Grok2<\/a>. Musk revealed that Grok2 is currently in the final polishing stage and is expected to be released as early as next month.<\/p>\n<p>Despite ending the expanded cooperation, Musk praised Oracle as \u201ca great company\u201d. He stressed that when the fate of the company depends on speed, \u201cwe had to take the wheel ourselves and not just sit in the back seat\u201d.<\/p>\n<p>It is worth noting that in May this year, xAI and Oracle were reported to be close to reaching an expanded cooperation agreement worth approximately $10 billion. This announcement, however, suggests that xAI is shifting to a strategy of building its own AI infrastructure.<\/p>\n<p>The decision reflects the fierce competition in the AI field and the key role that top-tier computing resources play in it. As xAI&#039;s new supercomputer is deployed, the industry will be watching closely to see how much it improves the performance of its AI models.<\/p>","protected":false},"excerpt":{"rendered":"<p>Musk announced on July 9 that his artificial intelligence company xAI is building a supercomputer with 100,000 Nvidia H100 GPUs, which is expected to be delivered and begin training by the end of this month. The move marks the end of xAI's negotiations with Oracle to expand their existing agreement and lease more Nvidia chips. 
Musk stressed that this would be \u201cthe most powerful training cluster in the world, with a large lead\u201d. He stated that the core competitiveness of xAI lies in speed: \u201cit is the only way to close the gap\u201d. Prior to this, xAI had leased 24,000 H100 chips from Oracle to train Grok2. Musk says Grok2 is in the final polishing stage<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[415,3433,356,355],"collection":[],"class_list":["post-15299","post","type-post","status-publish","format-standard","hentry","category-news","tag-gpu","tag-grok2","tag-xai","tag-355"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/15299","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=15299"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/15299\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=15299"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=15299"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=15299"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=15299"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}