{"id":1206,"date":"2023-11-11T18:18:21","date_gmt":"2023-11-11T10:18:21","guid":{"rendered":"https:\/\/www.1ai.net\/?p=1206"},"modified":"2023-11-11T18:18:21","modified_gmt":"2023-11-11T10:18:21","slug":"%e6%98%8e%e5%b9%b4%e5%bc%80%e6%ba%90%e5%85%a8%e9%83%a8%e5%ba%95%e5%b1%82%e4%bb%a3%e7%a0%81%ef%bc%81%e7%94%b5%e4%bf%a1%e5%8f%91%e5%b8%83%e5%8d%83%e4%ba%bf%e5%8f%82%e6%95%b0%e5%a4%a7%e6%a8%a1%e5%9e%8b","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/1206.html","title":{"rendered":"All underlying code will be open source next year! Telecom releases a large model with hundreds of billions of parameters, &quot;Xingchen Semantics&quot;"},"content":{"rendered":"<p>November 10 news: today, China <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e7%94%b5%e4%bf%a1\" title=\"[View articles tagged with [Telecom]]\" target=\"_blank\" >Telecom<\/a>&#039;s 2023 Digital Technology Ecosystem Conference and 2023 Digital Technology Ecosystem Exhibition officially kicked off in Guangzhou, where several senior Telecom executives took turns releasing a number of product, platform, and technology updates.<\/p>\n<p class=\"article-content__img\"><img decoding=\"async\" src=\"https:\/\/img1.mydrivers.com\/img\/20231110\/S440a03f3-e54e-4467-8deb-d927ba792b43.jpg\" alt=\"All underlying code will be open source next year! Telecom releases a large model with hundreds of billions of parameters, &quot;Xingchen Semantics&quot;\" \/><\/p>\n<p>Among them, He Zhongjiang, General Manager of China Telecom Artificial Intelligence Technology, officially released the \u201c<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e6%98%9f%e8%be%b0%e8%af%ad%e4%b9%89\" title=\"[View articles tagged with [Xingchen Semantics]]\" target=\"_blank\" >Xingchen Semantics<\/a>\u201d <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%a4%a7%e6%a8%a1%e5%9e%8b\" title=\"[View articles tagged with [large models]]\" target=\"_blank\" >large model<\/a> with hundreds of billions of parameters.<\/p>\n<p>It is reported that the Xingchen Semantics large model is an upgrade of China Telecom&#039;s self-developed large model, with the parameter count increased from millions to hundreds of billions and all capabilities significantly improved.<\/p>\n<p>He Zhongjiang said that Xingchen Semantics draws on more than 1.2 billion items of style data; its training GPU memory usage is reduced by 50% and its inference speed is increased 4.5 times. Its Chinese image understanding and generation capabilities are improved by 30%, and its fine-grained semantic generation quality is improved by 25%.<\/p>\n<p>In terms of creative efficiency, Xingchen Semantics cuts production time by 92% and design costs by 95% compared with previous production tools.<\/p>\n<p>\u201cOverall, the performance of the hundred-billion-parameter model has improved significantly. Next, we will use quantization and distillation to make the model commercially available at low cost,\u201d he said.<\/p>\n<p>Finally, he added that China Telecom&#039;s AI team will also embrace <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%bc%80%e6%ba%90\" title=\"[View articles tagged with [open source]]\" target=\"_blank\" >open source<\/a>: the 10-billion-parameter model will be open-sourced by the end of this year, the 100-billion-parameter model will be open-sourced in April next year, and all underlying code will be open-sourced. 
At the same time, more than 1TB of high-quality cleaned data, along with various toolchains built on the Xingchen large-model base, will be opened up to meet the needs of different users.<\/p>","protected":false},"excerpt":{"rendered":"<p>November 10 news: today, China Telecom's 2023 Digital Technology Ecosystem Conference and 2023 Digital Technology Ecosystem Exhibition officially opened in Guangzhou, where several Telecom executives took turns releasing a number of product, platform, and technology updates. Among them, He Zhongjiang, General Manager of China Telecom Artificial Intelligence Technology, officially released \u201cXingchen Semantics\u201d, a large model with hundreds of billions of parameters. According to reports, the Xingchen Semantics model is an upgrade of China Telecom's self-developed model, with parameters increased from millions to hundreds of billions and all capabilities significantly improved. He Zhongjiang said Xingchen Semantics has more than 1.2 billion items of style data; training memory usage is reduced by 50% and inference is 4.5 times faster; Chinese image understanding and generation capabilities are improved by 30%, and fine-grained semantic generation is improved by 25%. 
In terms of creative efficiency, the production time of Xingchen Semantics compared with the previous generation of<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[216,219,397,396],"collection":[],"class_list":["post-1206","post","type-post","status-publish","format-standard","hentry","category-news","tag-216","tag-219","tag-397","tag-396"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/1206","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=1206"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/1206\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=1206"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=1206"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=1206"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=1206"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}