{"id":6965,"date":"2024-04-02T10:10:02","date_gmt":"2024-04-02T02:10:02","guid":{"rendered":"https:\/\/www.1ai.net\/?p=6965"},"modified":"2024-04-02T10:10:02","modified_gmt":"2024-04-02T02:10:02","slug":"%e6%98%86%e4%bb%91%e4%b8%87%e7%bb%b4%e5%ae%a3%e5%b8%83-4-%e6%9c%88-17-%e6%97%a5%e5%8f%91%e5%b8%83%e5%b9%b6%e5%bc%80%e6%ba%90%e5%a4%a9%e5%b7%a5%e5%a4%a7%e6%a8%a1%e5%9e%8b-3-0%ef%bc%9a","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/6965.html","title":{"rendered":"Kunlun Wanwei announced the release and open source of &quot;Tiangong Model 3.0&quot; on April 17: 400 billion parameters, claimed to have better performance than Grok 1.0"},"content":{"rendered":"<p data-vmark=\"44ac\"><a href=\"https:\/\/www.1ai.net\/en\/tag\/%e6%98%86%e4%bb%91%e4%b8%87%e7%bb%b4\" title=\"[Sees articles with [Konlen] tags]\" target=\"_blank\" >Kunlun Wanwei<\/a>The group recently announced through its official WeChat account that<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%a4%a7%e6%a8%a1%e5%9e%8b\" title=\"[View articles tagged with [large models]]\" target=\"_blank\" >Large Model<\/a>On the first anniversary of its release, Tiangong Model 3.0 will be officially launched on April 17<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%85%ac%e6%b5%8b\" title=\"[See articles with [public] labels]\" target=\"_blank\" >Public Beta<\/a>, and will select<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%bc%80%e6%ba%90\" title=\"[View articles tagged with [open source]]\" target=\"_blank\" >Open Source<\/a>&quot;Tiangong 3.0&quot; uses 400 billion-level parameter MoE (mixed expert model).<span class=\"accentTextColor\">Officially stated that it is one of the MoE models with the largest model parameters and the strongest performance in the world<\/span>, performance exceeds Grok 1.0.<\/p>\n<p data-vmark=\"b699\">It is reported that compared with the previous generation &quot;Tiangong 2.0&quot; MoE large model, &quot;Tiangong 3.0&quot; has &quot;amazing&quot; performance improvements in the fields of model semantic understanding, logical reasoning, versatility, generalization, uncertainty knowledge, and learning ability.<span class=\"accentTextColor\">Its model technology knowledge and ability has increased by more than 20%, and its mathematics\/reasoning\/coding\/cultural and creative abilities have increased by more than 30%.<\/span>.<\/p>\n<p data-vmark=\"803d\">&quot;Tiangong 3.0&quot; also adds<span class=\"accentTextColor\">Enhanced search, research mode, code calling and chart drawing, multiple calls to online search, etc.<\/span>, and specifically trained the model&#039;s Agent capabilities. It can independently complete planning, calling, and combining external tools and information, and can complete various complex needs such as industry analysis and product comparison.<\/p>\n<p data-vmark=\"4189\">\u201cTiangong 3.0\u201d<span class=\"accentTextColor\">Claimed to be the world&#039;s first multimodal &quot;super model&quot;<\/span>It integrates multiple capabilities including AI search, AI writing, AI long text reading, AI dialogue, AI speech synthesis, AI image generation, AI comic creation, AI image recognition, AI music generation, AI code writing, and AI table generation. 
Compared with the previous-generation "Tiangong 2.0" MoE model, "Tiangong 3.0" reportedly delivers "amazing" gains in semantic understanding, logical reasoning, versatility, generalization, uncertainty knowledge, and learning ability: its technical knowledge capability has improved by more than 20%, and its mathematics, reasoning, coding, and creative-writing abilities by more than 30%.

"Tiangong 3.0" also adds enhanced search, a research mode, code invocation and chart drawing, multiple online-search calls per query, and other features, and its Agent capabilities have been specifically trained: the model can independently plan, invoke, and combine external tools and information to handle complex tasks such as industry analysis and product comparison.

"Tiangong 3.0" is further claimed to be the world's first multimodal "super model," integrating AI search, AI writing, AI long-text reading, AI dialogue, AI speech synthesis, AI image generation, AI comic creation, AI image recognition, AI music generation, AI code writing, and AI table generation. The official description calls it a "super application for the era of large models."

In October last year, Kunlun Wanwei open-sourced the "Skywork" (Tiangong) Skywork-13B series of tens-of-billions-parameter large language models, together with a 600 GB, 150B-token open-source Chinese dataset.

The Skywork-13B series currently includes two models with 13 billion parameters each, Skywork-13B-Base and Skywork-13B-Math, available at the addresses below (a minimal loading sketch follows the list):

- Skywork-13B download (ModelScope): https://modelscope.cn/organization/skywork
- Skywork-13B download (GitHub): https://github.com/SkyworkAI/Skywork
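For completeness, a minimal sketch of loading one of these open-sourced checkpoints with Hugging Face transformers. The model ID "Skywork/Skywork-13B-base" and the `trust_remote_code` flag are assumptions based on common practice for this release; verify both against the GitHub and ModelScope pages above before use.

```python
# Minimal sketch: load Skywork-13B-Base for text generation.
# Assumption: the checkpoint is mirrored on the Hugging Face Hub as
# "Skywork/Skywork-13B-base"; confirm the exact ID (and whether
# trust_remote_code is required) against the repositories linked above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Skywork/Skywork-13B-base"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 13B weights: roughly 26 GB in bf16
    device_map="auto",           # shard across available GPUs (needs accelerate)
    trust_remote_code=True,
)

prompt = "Beijing is the capital of"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Note this is a base (non-chat) model, so it continues text rather than following instructions; the Math variant would be loaded the same way with its own model ID.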