{"id":52776,"date":"2026-05-09T11:37:08","date_gmt":"2026-05-09T03:37:08","guid":{"rendered":"https:\/\/www.1ai.net\/?p=52776"},"modified":"2026-05-09T11:37:08","modified_gmt":"2026-05-09T03:37:08","slug":"%e7%99%be%e5%ba%a6%e5%8f%91%e5%b8%83%e6%96%87%e5%bf%83%e5%a4%a7%e6%a8%a1%e5%9e%8b-5-1%ef%bc%9a%e6%90%9c%e7%b4%a2%e8%83%bd%e5%8a%9b%e4%bd%8d%e5%b1%85%e5%9b%bd%e5%86%85%e9%a6%96%e4%bd%8d%ef%bc%8c","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/52776.html","title":{"rendered":"100-DEGREE RELEASE LARGE-SCALE MODEL 5.1: SEARCH CAPABILITY IS THE HIGHEST IN THE COUNTRY, WITH PRE-TRAINING COSTS ONLY FOR INDUSTRY 6%"},"content":{"rendered":"<p>May 9 News.<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e7%99%be%e5%ba%a6\" title=\"[Sees articles containing [100 degrees] labels]\" target=\"_blank\" >Baidu<\/a>A new generation of large models was released -- <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e6%96%87%e5%bf%83%e5%a4%a7%e6%a8%a1%e5%9e%8b\" title=\"[Sees articles with labels]\" target=\"_blank\" >Wenxin Large Model<\/a> 5.1. At present, Mansion 5.1 is on line in the 100-degree thousand-show model square and the Mansion Network, open to business users and developers\u3002<\/p>\n<p>According to the 100-degree official presentation, the model uses \u201cmulti-dimensional elastic pre-training\u201d technology, with only the pre-training costs of the industry-size model of approximately 6%, leading to basic results and making it to LMArena's first and fourth in the world\u3002<\/p>\n<p>5.1 THE SIGNIFICANT INCREASE IN INTEGRATED CAPABILITIES BENEFITS FROM KEY TECHNOLOGIES SUCH AS \u201cMULTI-DIMENSIONAL ELASTIC PRE-TRAINING\u201d. THE TECHNOLOGY WAS PRESENTED AT THE TIME OF ITS LAUNCH, AND A TRAINING EXERCISE WAS ACHIEVED TO GENERATE MULTI-SCALE MODELS. 
As a phased result of this technology, 5.1 fully inherits the knowledge of Wenxin 5.0 while compressing total parameters to about one third and activated parameters to about one half, achieving its base-model results with only about 6% of the pre-training cost of industry models of comparable scale.<\/p>\n<p>According to Baidu, the model performs well on a number of authoritative industry benchmarks. The gain in Agent capability is most evident, surpassing DeepSeek-V4-Pro; creative writing is comparable to Gemini 3.1 Pro; and reasoning approaches the industry's leading closed-source models.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-52777\" title=\"b646445ej00ter3cd005ed000v90lqp\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2026\/05\/b646445ej00ter3cd005ed000v900lqp.jpg\" alt=\"b646445ej00ter3cd005ed000v90lqp\" width=\"1125\" height=\"782\" \/><\/p>\n<p>The latest ranking from LMArena, the authoritative international large model arena, shows Wenxin 5.1 scoring 1223, placing it first among domestic models and fourth worldwide as the only domestically developed model at that level.<\/p>\n<p>According to the introduction, search capability refers to a large model's ability to quickly retrieve, integrate, and generate multi-source information and to produce more consistent and reliable output. This means a model with strong search capability can take on the role of information integration and processing in complex business scenarios, with broad application potential in content generation, intelligent assistants, enterprise knowledge management, and Agent applications.<\/p>\n<p>Previously, the Wenxin 5.0 series had topped LMArena's text and visual-understanding leaderboards on multiple occasions, holding steady in the first tier of large models. 
Earlier, on April 30, Wenxin 5.1 Preview ranked first on the LMArena text leaderboard with a score of 1476, surpassing mainstream domestic and international models such as GPT-55 and DeepSeek-V4-Pro, and was the only domestically developed model in the top 15.<\/p>\n<p>According to LMArena's platform rules, the leaderboard is built from blind tests on real user experience data worldwide, and Elo scores are determined directly by user votes, giving the rankings high industry credibility.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-52778\" title=\"1fe6c512j00ter3d60055d000v90rp\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2026\/05\/1fe6c512j00ter3d60055d000v900rrp.jpg\" alt=\"1fe6c512j00ter3d60055d000v90rp\" width=\"1125\" height=\"999\" \/><\/p>\n<p>The Create 2026 Baidu AI Developer Conference will be held from May 13 to 14 at Phase II of the Beijing National Convention Center, where Robin Li (Li Yanhong) will deliver a speech. Further technical details of Wenxin Large Model 5.1 and its commercialization plans are expected to be disclosed there.<\/p>","protected":false},"excerpt":{"rendered":"<p>On May 9, Baidu released a new generation of foundation models, Wenxin Large Model 5.1. Wenxin 5.1 is now live on the Baidu Qianfan Model Plaza and the Wenxin website, open to enterprise users and developers. According to Baidu's official introduction, the model adopts \u201cmulti-dimensional elastic pre-training\u201d technology, delivering industry-leading base-model results with only about 6% of the pre-training cost of industry models of comparable scale, and ranking first in China and fourth in the world on LMArena. The marked improvement in 5.1's overall capabilities benefits from key technologies such as \u201cmulti-dimensional elastic pre-training\u201d. Baidu presented the technology at the launch, noting that a single training run can produce models at multiple scales. 
As a phased result of this technology, 5.1 fully inherits 5<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[272,234],"collection":[],"class_list":["post-52776","post","type-post","status-publish","format-standard","hentry","category-news","tag-272","tag-234"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/52776","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=52776"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/52776\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=52776"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=52776"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=52776"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=52776"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}