{"id":14562,"date":"2024-07-03T09:21:19","date_gmt":"2024-07-03T01:21:19","guid":{"rendered":"https:\/\/www.1ai.net\/?p=14562"},"modified":"2024-07-03T09:21:19","modified_gmt":"2024-07-03T01:21:19","slug":"%e9%a9%ac%e6%96%af%e5%85%8b%ef%bc%9axai-%e8%ae%ad%e7%bb%83-grok-3-%e5%a4%a7%e6%a8%a1%e5%9e%8b%e7%94%a8%e4%ba%86-10-%e4%b8%87%e5%9d%97%e8%8b%b1%e4%bc%9f%e8%be%be-h100-%e8%8a%af%e7%89%87","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/14562.html","title":{"rendered":"Musk: xAI used 100,000 Nvidia H100 chips to train the Grok-3 large model"},"content":{"rendered":"<p data-vmark=\"9ae5\"><a href=\"https:\/\/www.1ai.net\/en\/tag\/%e9%a9%ac%e6%96%af%e5%85%8b\" title=\"View articles tagged with Musk\" target=\"_blank\" >Musk<\/a> has announced that Grok-2, the large language model from his artificial intelligence startup <a href=\"https:\/\/www.1ai.net\/en\/tag\/xai\" title=\"View articles tagged with xAI\" target=\"_blank\" >xAI<\/a>, will launch in August and will bring even more advanced AI capabilities. While Grok-2 has yet to be unveiled, Musk has already begun building momentum for <a href=\"https:\/\/www.1ai.net\/en\/tag\/grok-3\" title=\"View articles tagged with Grok-3\" target=\"_blank\" >Grok-3<\/a>.<\/p>\n<p data-vmark=\"0ad6\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-14563\" title=\"d37f51b3-98e5-4fee-a706-31f4135bbccc\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/07\/d37f51b3-98e5-4fee-a706-31f4135bbccc.png\" alt=\"d37f51b3-98e5-4fee-a706-31f4135bbccc\" width=\"596\" height=\"696\" \/><\/p>\n<p data-vmark=\"6c85\">Musk said that training AI chatbots requires datasets, and that cleaning existing data for large language models (LLMs) involves a great deal of work. 
He also noted several issues with training on the outputs of OpenAI models.<\/p>\n<p data-vmark=\"c548\">He revealed that xAI's Grok-3 was trained on 100,000 <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e8%8b%b1%e4%bc%9f%e8%be%be\" title=\"View articles tagged with Nvidia\" target=\"_blank\" >Nvidia<\/a> H100 chips, is expected to be released by the end of the year, and is believed to be \"very special\".<\/p>\n<p data-vmark=\"ddbe\">The H100 is an AI chip developed by Nvidia to process large language model (LLM) data. The price per Nvidia H100 artificial intelligence chip is estimated to be in the range of $30,000\u201340,000 (currently around RMB 219,000\u2013292,000), with a possible discount on bulk purchases.<\/p>\n<p data-vmark=\"de82\">A simple calculation shows that xAI's 100,000 Nvidia H100 chips are worth $3\u20134 billion. Musk said that Tesla's purchases from Nvidia this year are also estimated to be in the range of $3\u20134 billion, so it is reasonable to suspect that xAI may be training with the Nvidia chips Tesla bought.<\/p>","protected":false},"excerpt":{"rendered":"<p>Musk has announced that Grok-2, a large language model from his AI startup xAI, will launch in August and will bring even more advanced AI capabilities. While Grok-2 has yet to be unveiled, Musk has already started building momentum for his Grok-3. Musk said that training AI chatbots requires datasets and that cleaning existing data for large language models (LLMs) is a lot of work. He also noted several issues with training on the outputs of OpenAI models. He revealed that xAI's Grok-3, which was trained on 100,000 NVIDIA H100 chips and is expected to be released by the end of the year, is believed to be \"very special\". 
The H100 is an NVIDIA-developed chip specifically designed to handle large language models.<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[3311,356,239,355],"collection":[],"class_list":["post-14562","post","type-post","status-publish","format-standard","hentry","category-news","tag-grok-3","tag-xai","tag-239","tag-355"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/14562","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=14562"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/14562\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=14562"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=14562"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=14562"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=14562"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}