{"id":13188,"date":"2024-06-15T09:11:19","date_gmt":"2024-06-15T01:11:19","guid":{"rendered":"https:\/\/www.1ai.net\/?p=13188"},"modified":"2024-06-15T09:11:19","modified_gmt":"2024-06-15T01:11:19","slug":"%e5%b8%88%e8%80%85ai%e6%90%ba%e6%89%8b%e6%91%a9%e5%b0%94%e7%ba%bf%e7%a8%8b%e5%ae%8c%e6%88%9070%e4%ba%bf%e5%8f%82%e6%95%b0%e5%a4%a7%e6%a8%a1%e5%9e%8b%e8%ae%ad%e7%bb%83","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/13188.html","title":{"rendered":"Shizhe AI and Moore Thread complete the training of a large model with 7 billion parameters"},"content":{"rendered":"<p>recently,<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e6%91%a9%e5%b0%94%e7%ba%bf%e7%a8%8b\" title=\"_Other Organiser\" target=\"_blank\" >Moore Thread<\/a>&quot;The AI Big Model for All-Subject Education&quot;<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%b8%88%e8%80%85ai\" title=\"[SEES ARTICLES WITH [TEACHER AI] LABEL]\" target=\"_blank\" >Teacher AI<\/a>&quot;The two sides jointly announced that they have completed<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%a4%a7%e6%a8%a1%e5%9e%8b%e8%ae%ad%e7%bb%83\" title=\"[Sees articles with labels of the Great Model Training]\" target=\"_blank\" >Large model training<\/a>Test. 
Relying on Moore Threads&#039; Kua&#039;e (KUAE) kilocard (thousand-GPU) intelligent computing cluster, Shizhe AI completed the high-intensity training of a large model with 7 billion parameters in one week, with training efficiency meeting expectations, fully demonstrating the capability of this domestic full-featured GPU kilocard training platform.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-13189\" title=\"202405161743228569_18\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/06\/202405161743228569_18.jpg\" alt=\"202405161743228569_18\" width=\"1000\" height=\"666\" \/><\/p>\n<p>Source note: the image was generated by AI and is licensed by Midjourney<\/p>\n<p>It is understood that &quot;Shizhe AI&quot; was founded in 2020. Its core team comes from Tsinghua University, and it focuses on a large education model covering all school subjects. Since internal testing opened, it has gained more than 25,000 users, supports more than 30 subjects, and covers more than 2,000 textbooks.<\/p>\n<p>This training test verified the strong performance of Moore Threads&#039; KUAE kilocard intelligent computing cluster in large model training, laying the foundation for future innovation in educational AI large models. The two parties will continue adaptation work on large model inference and optimize the technology to meet high-frequency inference needs.<\/p>\n<p>Liu Chunjiang, CEO of Shizhe AI, said: &quot;This training test demonstrated the strong performance of the Kua&#039;e kilocard intelligent computing cluster, and we are fully confident in domestic computing power. 
In the future, Shizhe AI will continue to run more of its core business on the Kua&#039;e kilocard intelligent computing cluster to provide users with efficient and stable computing services.&quot;<\/p>","protected":false},"excerpt":{"rendered":"<p>Recently, Moore Threads and Shizhe AI, an AI large model for all-subject education, jointly announced that they have completed a large model training test. Relying on Moore Threads' KUAE kilocard (thousand-GPU) cluster, Shizhe AI completed the high-intensity training of the 7-billion-parameter large model in one week, with training efficiency meeting expectations, fully demonstrating the capability of the domestic full-featured GPU kilocard training platform. Image source note: image generated by AI and licensed by Midjourney. It is understood that \"Shizhe AI\" was founded in 2020; its core team comes from Tsinghua University and focuses on an all-subject education large model. Since internal testing opened, it has more than 25,000 users, supports more than 30 subjects, and covers more than 2,000 textbooks. 
The training test successfully verified Moore Threads'<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[3082,3080,3081],"collection":[],"class_list":["post-13188","post","type-post","status-publish","format-standard","hentry","category-news","tag-3082","tag-ai","tag-3081"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/13188","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=13188"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/13188\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=13188"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=13188"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=13188"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=13188"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}