{"id":52374,"date":"2026-04-23T11:27:29","date_gmt":"2026-04-23T03:27:29","guid":{"rendered":"https:\/\/www.1ai.net\/?p=52374"},"modified":"2026-04-23T11:40:36","modified_gmt":"2026-04-23T03:40:36","slug":"openai-%e5%8f%91%e5%b8%83-chatgpt-%e5%9b%a2%e9%98%9f%e5%b7%a5%e4%bd%9c%e6%b5%81-ai-%e6%99%ba%e8%83%bd%e4%bd%93%ef%bc%8c%e8%87%aa%e5%8a%a8%e5%8c%96%e5%a4%84%e7%90%86%e5%a4%8d%e6%9d%82%e4%bb%bb%e5%8a%a1","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/52374.html","title":{"rendered":"Qwen3.6-27B Declares open source: 27 billion argument dense models with programming capabilities exceeding 15 times the size of MoE models"},"content":{"rendered":"<p>Message of April 23rd, Aliyun<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e9%80%9a%e4%b9%89%e5%8d%83%e9%97%ae\" title=\"[View articles tagged with [Tongyi Thousand Questions]]\" target=\"_blank\" >Thousand Questions on Tongyi<\/a>The team announced yesterday<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%bc%80%e6%ba%90\" title=\"[View articles tagged with [open source]]\" target=\"_blank\" >Open Source<\/a>Model family greets new members \u2014 Qwen 3.6-27B. 
It is a 27-billion-parameter dense <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%a4%9a%e6%a8%a1%e6%80%81%e6%a8%a1%e5%9e%8b\" title=\"[View articles tagged with [multimodal model]]\" target=\"_blank\" >multimodal model<\/a>, and the model size most requested by the community\u3002<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-52390\" title=\"b28de40cj00txgug00h9d000uwp\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2026\/04\/b28de40cj00tdxgug00h9d000u000gwp.jpg\" alt=\"b28de40cj00txgug00h9d000uwp\" width=\"1080\" height=\"608\" \/><\/p>\n<p>Following the earlier releases of Qwen3.6-Plus and Qwen3.6-35B-A3B, this open-source 27B version retains the advantages of a dense architecture while comprehensively upgrading agentic coding and multimodal reasoning\u3002<\/p>\n<p>According to the official introduction, Qwen3.6-27B supports both thinking and non-thinking modes and achieves flagship-level performance in agentic coding, surpassing the previous open-source flagship Qwen3.5-397B-A17B, a mixture-of-experts (MoE) model with 397 billion total parameters and 17 billion activated parameters. 
As a dense architecture, Qwen3.6-27B deploys without MoE routing, making it an ideal option for developers who want top-tier coding capability at a practical, widely deployable scale\u3002<\/p>\n<p>In natural-language and coding benchmark tests, Qwen3.6-27B, with only 27 billion parameters, surpassed the 15-times-larger Qwen3.5-397B-A17B on all major coding benchmarks\u3002<\/p>\n<p>Specifically, it scores 77.2 on SWE-bench Verified (previously 76.2), 53.5 on SWE-bench Pro (previously 50.9), 59.3 on Terminal-Bench 2.0 (previously 52.5), and 48.2 on SkillsBench (previously 30.0)\u3002<\/p>\n<p>On reasoning tasks, Qwen3.6-27B achieved 87.8 on GPQA Diamond, comparable to models several times its size\u3002<\/p>\n<p>On the visual-language side, the model is natively multimodal: it can process mixed image, video, and text input and supports tasks such as visual reasoning, document understanding, and visual question answering, on par with Qwen3.6-35B-A3B\u3002<\/p>\n<p>1AI notes that the open-source weights for Qwen3.6-27B are available on the Hugging Face and ModelScope platforms for developers to download and deploy locally. 
At the same time, users can chat with the model directly on Qwen Studio (chat.qwen.ai)\u3002<\/p>\n<p>In addition, the ARI platform will soon support calling the model via API, with a \"preserve_thinking\" option that retains all prior thinking content in messages; it is officially recommended for agentic tasks\u3002<\/p>\n<p>The model can also be seamlessly integrated into popular third-party coding assistants, including OpenClaw, Claude Code, and Qwen Code, simplifying the development process and delivering an efficient, context-aware coding experience\u3002<\/p>","protected":false},"excerpt":{"rendered":"<p>News on April 23: the Alibaba Cloud Tongyi Qianwen team announced yesterday that its open-source model family has a new member \u2014 Qwen3.6-27B. This is a 27-billion-parameter dense multimodal model, and the model size most requested by the community. Following the earlier releases of Qwen3.6-Plus and Qwen3.6-35B-A3B, this open-source 27B version retains the advantages of a dense architecture while comprehensively upgrading agentic coding and multimodal reasoning. 
According to the official introduction, Qwen3.6-27B supports both thinking and non-thinking modes, achieving flagship-level performance in agentic coding and surpassing the previous open-source flagship Qwen3.5-397B-A17B --<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[1096,219,331],"collection":[],"class_list":["post-52374","post","type-post","status-publish","format-standard","hentry","category-news","tag-1096","tag-219","tag-331"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/52374","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=52374"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/52374\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=52374"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=52374"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=52374"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=52374"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}