{"id":38506,"date":"2025-06-30T12:03:36","date_gmt":"2025-06-30T04:03:36","guid":{"rendered":"https:\/\/www.1ai.net\/?p=38506"},"modified":"2025-06-30T12:03:36","modified_gmt":"2025-06-30T04:03:36","slug":"%e5%8d%8e%e4%b8%ba%e5%ae%a3%e5%b8%83%e5%bc%80%e6%ba%90%e7%9b%98%e5%8f%a4-7b-%e7%a8%a0%e5%af%86%e5%92%8c-72b-%e6%b7%b7%e5%90%88%e4%b8%93%e5%ae%b6%e6%a8%a1%e5%9e%8b","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/38506.html","title":{"rendered":"Huawei Announces Open Source Pangu 7B Dense and 72B Hybrid Expert Models"},"content":{"rendered":"<p>June 30th.<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%8d%8e%e4%b8%ba\" title=\"_Other Organiser\" target=\"_blank\" >Huawei<\/a>Today's official announcement<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%bc%80%e6%ba%90\" title=\"[View articles tagged with [open source]]\" target=\"_blank\" >Open Source<\/a>Pangu (creator of the universe in Chinese mythology)\u00a0<strong>Dense model with 7 billion parameters, Pangu Pro MoE 72 billion parameters<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e6%b7%b7%e5%90%88%e4%b8%93%e5%ae%b6%e6%a8%a1%e5%9e%8b\" title=\"[Sees articles with labels of [mixed expert model]\" target=\"_blank\" >hybrid expert model<\/a>and Rise-based model inference techniques<\/strong>.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-38507\" title=\"5f6ec7caj00synhx7001zd000lh00dxp\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/06\/5f6ec7caj00synhx7001zd000lh00dxp.jpg\" alt=\"5f6ec7caj00synhx7001zd000lh00dxp\" width=\"773\" height=\"501\" \/><\/p>\n<p>Huawei said, \"This move is another key initiative for Huawei to practice the Rise ecological strategy, to promote the research and innovative development of big model technology, and to accelerate the application and value creation of AI in thousands of industries.\"<\/p>\n<ul>\n<li>Pangu Pro MoE 72B model weights, underlying inference code, is now officially available on the open source 
platform.<\/li>\n<li>Ascend-based inference code for ultra-large-scale MoE models has been officially launched on the open source platform.<\/li>\n<li>The Pangu 7B model weights and related inference code will be available on the open source platform in the near future.<\/li>\n<\/ul>\n<p>1AI attaches the open source address:<\/p>\n<p>https:\/\/gitcode.com\/ascend-tribe<\/p>","protected":false},"excerpt":{"rendered":"<p>On June 30, Huawei officially announced the open-sourcing of its Pangu 7-billion-parameter dense model, the Pangu Pro MoE 72-billion-parameter mixture-of-experts model, and Ascend-based model inference technology. \u201cThis move is another key initiative for Huawei to implement its Ascend ecosystem strategy, promote research and innovation in large-model technology, and accelerate the adoption and value creation of AI across thousands of industries.\u201d The Pangu Pro MoE 72B model weights and base inference code are now officially available on the open source platform. Ascend-based inference code for ultra-large-scale MoE models has been officially launched on the open source platform. The Pangu 7B model weights and related inference code will be available on the open source platform in the near future. 1AI attaches the open source address: https:\/\/g<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[1117,219,5783],"collection":[],"class_list":["post-38506","post","type-post","status-publish","format-standard","hentry","category-news","tag-1117","tag-219","tag-5783"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/38506","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=38506"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/38506\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=38506"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=38506"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=38506"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=38506"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}