{"id":42185,"date":"2025-09-02T10:53:25","date_gmt":"2025-09-02T02:53:25","guid":{"rendered":"https:\/\/www.1ai.net\/?p=42185"},"modified":"2025-09-02T11:04:00","modified_gmt":"2025-09-02T03:04:00","slug":"%e5%9c%a8%e5%9b%bd%e9%99%85%e6%af%94%e8%b5%9b%e4%b8%ad%e6%8b%bf%e4%b8%8b-30-%e4%b8%aa%e7%ac%ac-1-%e5%90%8d%ef%bc%8c%e8%85%be%e8%ae%af%e6%b7%b7%e5%85%83%e5%bc%80%e6%ba%90%e8%bd%bb%e9%87%8f%e7%ba%a7","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/42185.html","title":{"rendered":"Tencent open-sources lightweight translation model Hunyuan-MT-7B, which won 30 1st places in international competitions"},"content":{"rendered":"<p>September 1 news: <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e8%85%be%e8%ae%af%e6%b7%b7%e5%85%83\" title=\"[View articles tagged with [Tencent Hybrid]]\" target=\"_blank\" >Tencent Hunyuan<\/a> announced that its international <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e7%bf%bb%e8%af%91%e6%a8%a1%e5%9e%8b\" title=\"[View articles tagged with [translation model]]\" target=\"_blank\" >translation model<\/a> is now <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%bc%80%e6%ba%90\" title=\"[View articles tagged with [open source]]\" target=\"_blank\" >open source<\/a>, and the model is available for developers to download and deploy for free. 
It is understood that the model, named Hunyuan-MT-7B, has a total of only 7B parameters and supports translation among 33 languages as well as 5 Chinese ethnic minority languages\/dialects, making it a lightweight translation model with comprehensive capabilities.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-42186\" title=\"eb298419j00t1xxc9000sd000u000hlp\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/09\/eb298419j00t1xxc9000sd000u000hlp.jpg\" alt=\"eb298419j00t1xxc9000sd000u000hlp\" width=\"1080\" height=\"633\" \/><\/p>\n<p>According to <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e8%85%be%e8%ae%af\" title=\"[View articles tagged with [Tencent]]\" target=\"_blank\" >Tencent<\/a> Hunyuan, Hunyuan-MT-7B (entry name: Shy-hunyuan-MT) took an absolute lead in the WMT2025 competition of the Association for Computational Linguistics (ACL), which ended at the end of August, winning 1st place in 30 of the 31 language tracks. These tracks cover not only common languages such as Chinese, English, and Japanese, but also smaller languages such as Czech, Marathi, Estonian, and Icelandic. WMT25 places strict limits on the parameter size of participating models, requires systems to be open source, and allows training only on public data; under these constraints, Hunyuan-MT-7B beat many models with much larger parameter counts.<\/p>\n<p>Tencent Hunyuan also claims that Hunyuan-MT-7B performs excellently on Flores200, a dataset commonly used in the industry to evaluate translation quality, significantly leading models of the same size while remaining competitive with super-large models. 
For translation scenarios, Tencent Hunyuan proposes a complete training paradigm for translation models, covering the entire chain from pre-training to continued pre-training (CPT), supervised fine-tuning, translation reinforcement, and ensemble reinforcement, which it says gives the model the best translation quality in the industry.<\/p>\n<p>1AI noted that Tencent simultaneously open-sourced a translation ensemble model, Hunyuan-MT-Chimera-7B (Chimera), the industry's first of its kind. Given the source text and the differing outputs of multiple translation models, it generates a single, more refined translation. It not only supports Hunyuan-MT-7B natively but also supports plugging in DeepSeek and other models, providing more accurate results for users and scenarios with professional translation needs.<\/p>\n<p>Currently, the Tencent Hunyuan translation model has been integrated into a number of Tencent products, including Tencent Meeting, WeCom (Enterprise WeChat), QQ Browser, Tencent Translator, and Tencent's overseas customer service translation, helping to improve the product experience.<\/p>\n<p>The Hunyuan-MT-7B model is now available on the official Tencent Hunyuan website and can be downloaded from open-source communities such as HuggingFace and GitHub; the corresponding technical reports and papers have also been made public in the open-source community.<\/p>\n<p><strong>Experience address<\/strong>:<\/p>\n<section>https:\/\/hunyuan.tencent.com\/modelSquare\/home\/list<\/section>\n<section>GitHub: https:\/\/github.com\/Tencent-Hunyuan\/Hunyuan-MT\/<\/section>\n<section>HuggingFace: https:\/\/huggingface.co\/collections\/tencent\/hunyuan-mt-68b42f76d473f82798882597<\/section>\n<section>AngelSlim compression tool: https:\/\/github.com\/Tencent\/AngelSlim<\/section>","protected":false},"excerpt":{"rendered":"<p>September 1 
news: Tencent Hunyuan announced that its international translation model is open source and available for developers to download and deploy for free. It is understood that the model, named Hunyuan-MT-7B, has a total of only 7B parameters and supports translation among 33 languages as well as 5 Chinese ethnic minority languages \/ dialects, making it a lightweight translation model with comprehensive capabilities. According to Tencent Hunyuan, in the Association for Computational Linguistics (ACL) WMT2025 competition that ended at the end of August, its Hunyuan-MT-7B (entry name: Shy-hunyuan-MT) took 1st place in 30 of the 31 language tracks, an absolute lead. These 31 languages include not only common languages such as Chinese, English, and Japanese, but also Czech<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[219,7510,323,2657],"collection":[],"class_list":["post-42185","post","type-post","status-publish","format-standard","hentry","category-news","tag-219","tag-7510","tag-323","tag-2657"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/42185","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=42185"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/42185\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=42185"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=42185"},{"taxonomy":"post_tag","
embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=42185"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=42185"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}