{"id":3118,"date":"2024-01-23T09:46:38","date_gmt":"2024-01-23T01:46:38","guid":{"rendered":"https:\/\/www.1ai.net\/?p=3118"},"modified":"2024-01-23T09:46:38","modified_gmt":"2024-01-23T01:46:38","slug":"%e5%82%85%e7%9b%9b%e5%8f%91%e5%b8%83%e7%8c%8e%e6%88%b7%e6%98%9f%e7%a9%ba%e5%a4%a7%e6%a8%a1%e5%9e%8borion-14b-%e6%8b%a5%e6%9c%89140%e4%ba%bf%e5%8f%82%e6%95%b0%e8%a7%84%e6%a8%a1","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/3118.html","title":{"rendered":"Fu Sheng releases Orion-14B, the Orion Star large model with 14 billion parameters"},"content":{"rendered":"<p>On January 21, Orion Star released the Orion-14B model at the <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%82%85%e7%9b%9b\" title=\"View articles tagged with Fu Sheng\" target=\"_blank\" >Fu Sheng<\/a> 2024 Kickoff AI Class and <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e7%8c%8e%e6%88%b7%e6%98%9f%e7%a9%ba%e5%a4%a7%e6%a8%a1%e5%9e%8b\" title=\"View articles tagged with Orion Star Large Model\" target=\"_blank\" >Orion Star Large Model<\/a> launch event. 
It is a pre-trained multilingual <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%a4%a7%e8%af%ad%e8%a8%80%e6%a8%a1%e5%9e%8b\" title=\"View articles tagged with large language model\" target=\"_blank\" >large language model<\/a> with 14 billion parameters that covers common languages and professional terminology, and it has achieved <span class=\"spamTxt\">best-in-class<\/span> results on multiple third-party test sets.<\/p>\n<p>The features of the Orion Star large model include: support for ultra-long texts of up to 320K tokens; an inference speed of 31 tokens\/s on an entry-level (thousand-yuan) graphics card; strong multilingual capability, especially in Japanese and Korean; and a model size reduced by 70% through quantization with almost no loss in performance.<\/p>\n<p class=\"article-content__img\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-3119\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/01\/6384159856687961384128317.jpg\" alt=\"\" width=\"665\" height=\"363\" \/><\/p>\n<p>To meet the application needs of enterprises, Orion Star has also launched a suite of fine-tuned models, including RAG (retrieval-augmented generation) and Agent fine-tuned models. The RAG suite can quickly integrate an enterprise&#039;s own knowledge base to build customized applications; the Agent suite can call the tool best suited to the user&#039;s problem to solve more complex tasks.<\/p>\n<p>In addition to the large model and fine-tuned models, Orion Star has also launched applications such as the Juyan Human Resources Assistant, Juyan Cloud Asset Assistant, and Juyan Creative Assistant to help companies improve operational efficiency and decision-making.<\/p>\n<p>At the press conference, Fu Sheng also emphasized that enterprises need not only large models, but also large-model applications that solve pain points within their business processes. 
Orion Star helps enterprises achieve AI-assisted decision-making by providing one-stop AI large-model consulting and services.<\/p>\n<p>The release of the Orion Star large model is one result of its years of tracking the evolution of AI technology and heavy investment in research and development. Its <span class=\"spamTxt\">top<\/span> team of algorithm scientists and its experience operating applications serving 2 billion users worldwide have accumulated a large amount of user and token data, providing a solid foundation for the development and optimization of its models.<\/p>\n<p>Orion Star is currently training a mixture-of-experts (MoE) model, and its next milestone is a model with tens of billions of parameters.<\/p>\n<p><strong>Open source address:<\/strong><\/p>\n<p>https:\/\/github.com\/OrionStarAI\/Orion<\/p>\n<p>https:\/\/huggingface.co\/OrionStarAI<\/p>","protected":false},"excerpt":{"rendered":"<p>On January 21, Orion Star released the Orion Star large model (Orion-14B) at the Fu Sheng 2024 Kickoff AI Class and Orion Star Large Model launch. It is a pre-trained multilingual large language model developed by Orion Star with 14 billion parameters, covering common languages and specialized terminology, and it has achieved the best results in its class on multiple third-party test sets. The features of the Orion Star large model include: support for ultra-long text of up to 320K tokens; an inference speed of 31 tokens\/s on an entry-level (thousand-yuan) graphics card; strong multilingual capability, especially in Japanese and Korean; and a model size reduced by 70% through quantization with virtually no loss in performance. 
In order to meet the application needs of enterprises, Orion Star also launched the<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[991,706,992],"collection":[],"class_list":["post-3118","post","type-post","status-publish","format-standard","hentry","category-news","tag-991","tag-706","tag-992"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/3118","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=3118"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/3118\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=3118"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=3118"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=3118"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=3118"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}