<h1>The first GH200 workstation is now on sale: starting at 47,500 euros, built specifically for AI</h1>
<p>Published 2024-02-13 · <a href="https://www.1ai.net/en/3889.html">1ai.net</a></p>
<p><img loading="lazy" decoding="async" class="alignnone size-full wp-image-3890" title="78310a55b319ebc45d6850d2f69844f11f1716b0" src="https://www.1ai.net/wp-content/uploads/2024/02/78310a55b319ebc45d6850d2f69844f11f1716b0.jpeg" alt="78310a55b319ebc45d6850d2f69844f11f1716b0" width="600" height="312" /></p>
<p><a href="https://www.1ai.net/en/tag/nvidia" target="_blank">NVIDIA</a>'s new-generation GH200 Grace Hopper superchip platform is officially the world's first to use HBM3e high-bandwidth memory, and the first to integrate the CPU and GPU on a single board, targeting the world's most demanding generative AI workloads.</p>
<p>Now the first GH200 Grace Hopper <a href="https://www.1ai.net/en/tag/pc" target="_blank">PC</a> has appeared at a German store, in a roughly 19-inch form factor.</p>
<p>In terms of positioning, it is clearly not suited to gaming or traditional workstation use. 
A more accurate description is an <a href="https://www.1ai.net/en/tag/ai%e5%b7%a5%e4%bd%9c%e7%ab%99" target="_blank">AI workstation</a> focused on running large language models locally.</p>
<p><strong>The machine is equipped with 480GB of LPDDR5X memory, plus the 96GB of HBM3 memory in the GH200 (or 144GB of HBM3e in the higher-end version), for a total of 576GB to 624GB.</strong></p>
<p>On pricing, the 96GB HBM3 version costs 47,500 euros (about 370,000 yuan) and is available now. The 144GB HBM3e version is expected in the second quarter of 2024, priced at 59,500 euros (about 460,000 yuan).</p>
<p>Specifically, the workstation is built around an NVIDIA GH200 Grace Hopper processor with 72 Arm cores (a 144-core version also exists) and its integrated NVIDIA H100 accelerator. Accordingly, it comes with dual 2000W power supplies and very large storage capacity, and supports a variety of expansion options (including NVIDIA BlueField/ConnectX).</p>
<p>The cooling system uses Noctua fans, and the configuration includes optional NVIDIA BlueField-3 and ConnectX-7 network cards, an 8TB SSD, a 30TB HDD, a mouse and keyboard, and even an RTX 4060 (presumably for display output, which the GH200 itself lacks).</p>
<p>On performance, the German seller claims this AI workstation delivers 67 teraFLOPS of FP64, 989 teraFLOPS of TF32, 1,979 teraFLOPS of FP16, 3,958 teraFLOPS of FP8, and 3,958 TOPS of INT8 compute.</p>
<p><strong>In 23 software benchmarks run against Intel's Emerald Rapids and AMD's Bergamo processors, the conclusion is that the GH200's Grace CPU performs comparably to the Intel Xeon Platinum 8592 Emerald Rapids processor.</strong></p>
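Since the pitch is running large language models locally, the headline memory figures can be turned into a rough capacity check. The sketch below is my own illustration, not from the article: the helper name is hypothetical, and it counts model weights only (KV cache and activations add real overhead in practice).

```python
# Rough capacity check: do a model's weights alone fit in the machine's
# combined CPU+GPU memory? Ignores KV cache and activation overhead.

def fits_in_memory(params_billion: float, bytes_per_param: float, total_gb: float) -> bool:
    """Return True if the model weights alone fit in total_gb of memory."""
    weights_gb = params_billion * bytes_per_param  # 1B params at 1 byte/param ~= 1 GB
    return weights_gb <= total_gb

BASE_TOTAL_GB = 480 + 96  # LPDDR5X + HBM3 on the 47,500-euro version = 576 GB

print(fits_in_memory(70, 2, BASE_TOTAL_GB))    # 70B model in FP16 (~140 GB): True
print(fits_in_memory(300, 2, BASE_TOTAL_GB))   # 300B model in FP16 (~600 GB): False
```

By this crude measure, even a 300B-parameter model that overflows the base configuration in FP16 would fit after quantizing to one byte per parameter.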
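As a sanity check on the seller's claimed compute figures (my own sketch, not from the article): on Hopper-class Tensor Cores, each halving of precision roughly doubles throughput, and the quoted numbers follow that pattern.

```python
# The article's claimed throughput per precision (teraFLOPS; FP8 quoted in TOPS too).
claimed = {"TF32": 989, "FP16": 1979, "FP8": 3958}

# Each step down in precision should roughly double throughput.
ratio_fp16_over_tf32 = claimed["FP16"] / claimed["TF32"]
ratio_fp8_over_fp16 = claimed["FP8"] / claimed["FP16"]
print(round(ratio_fp16_over_tf32, 2), round(ratio_fp8_over_fp16, 2))  # both ~2.0
```

The consistent 2x steps suggest the figures are the GPU's standard per-precision Tensor Core ratings rather than independent measurements.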