{"id":1151,"date":"2023-11-08T15:40:45","date_gmt":"2023-11-08T07:40:45","guid":{"rendered":"https:\/\/www.1ai.net\/?p=1151"},"modified":"2023-11-08T15:40:45","modified_gmt":"2023-11-08T07:40:45","slug":"%e6%b6%88%e6%81%af%e7%a7%b0%e4%ba%9a%e9%a9%ac%e9%80%8a%e6%8a%95%e8%b5%84%e6%95%b0%e7%99%be%e4%b8%87%e7%be%8e%e5%85%83%e5%9f%b9%e8%ae%ad%e5%b7%a8%e5%a4%a7ai%e6%a8%a1%e5%9e%8bolympus","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/1151.html","title":{"rendered":"Amazon reportedly invests millions of dollars to train giant AI model &#039;Olympus&#039;"},"content":{"rendered":"<p><a href=\"https:\/\/www.1ai.net\/en\/tag\/%e4%ba%9a%e9%a9%ac%e9%80%8a\" title=\"[View articles tagged with [Amazon]]\" target=\"_blank\" >Amazon<\/a> is reportedly investing millions of dollars in training an ambitious <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%a4%a7%e5%9e%8b%e8%af%ad%e8%a8%80%e6%a8%a1%e5%9e%8b\" title=\"[View articles tagged with [large-scale language model]]\" target=\"_blank\" >large language model<\/a>, codenamed &quot;<a href=\"https:\/\/www.1ai.net\/en\/tag\/olympus\" title=\"_Other Organiser\" target=\"_blank\" >Olympus<\/a>&quot;, in the hope of competing with top models from OpenAI and Alphabet. 
The news has attracted outside attention, but Amazon declined to comment, according to two people familiar with the matter who spoke to Reuters.<\/p>\n<p class=\"article-content__img\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-1152\" title=\"202308171550207014_1\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2023\/11\/202308171550207014_1.jpg\" alt=\"202308171550207014_1\" width=\"673\" height=\"459\" \/><\/p>\n<p>Note: the image was generated by AI and is licensed via Midjourney.<\/p>\n<p>According to the people familiar with the matter, &quot;Olympus&quot; has 2 trillion parameters, which would make it one of the largest models currently in training; OpenAI&#039;s GPT-4, considered one of the best, reportedly has 1 trillion parameters. Although the details have not been made public, Amazon&#039;s goal is clear: by training &quot;Olympus&quot;, it hopes to offer more competitive artificial intelligence services.<\/p>\n<p>The training effort is led by Rohit Prasad, who previously headed Alexa and now reports directly to Amazon CEO Andy Jassy. As Amazon&#039;s chief AI scientist, Prasad has brought together researchers from Alexa AI and Amazon&#039;s science team to train the model.<\/p>\n<p>Amazon has already trained smaller models, such as &quot;Titan,&quot; and has partnered with <a href=\"https:\/\/www.1ai.net\/en\/tag\/ai%e6%a8%a1%e5%9e%8b\" title=\"[View articles tagged with [AI models]]\" target=\"_blank\" >AI model<\/a> startups such as Anthropic and AI21 Labs to make their models available to users of Amazon Web Services (AWS). 
Amazon believes that having its own models can make its AWS services more attractive, because corporate customers want access to models that perform well.<\/p>\n<p>Large language models (LLMs) are the underlying technology for AI tools: they learn from vast data sets to generate human-like responses. Training larger AI models is more expensive, however, because it requires enormous computing power. In an earnings call in April this year, Amazon executives said the company would increase investment in LLMs and generative AI while cutting fulfillment and transportation costs in its retail business.<\/p>","protected":false},"excerpt":{"rendered":"<p>Amazon is investing millions of dollars in training an ambitious large language model, codenamed \"Olympus,\" that it hopes will compete with top models from OpenAI and Alphabet. Amazon declined to comment on the news, according to two people familiar with the matter who spoke to Reuters. Note: image generated by AI and licensed via Midjourney. According to the people familiar with the matter, \"Olympus\" has 2 trillion parameters, which would make it one of the largest models currently in training, while OpenAI's GPT-4 model, considered one of the best, has only 1 trillion parameters. 
While the details have not yet been made public, Amazon's goal is clear: it wants \"Olympus\" to make its offerings more competitive.<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[167,372,370,371],"collection":[],"class_list":["post-1151","post","type-post","status-publish","format-standard","hentry","category-news","tag-ai","tag-olympus","tag-370","tag-371"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/1151","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=1151"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/1151\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=1151"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=1151"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=1151"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=1151"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}