{"id":4777,"date":"2024-03-02T09:14:58","date_gmt":"2024-03-02T01:14:58","guid":{"rendered":"https:\/\/www.1ai.net\/?p=4777"},"modified":"2024-03-02T09:14:58","modified_gmt":"2024-03-02T01:14:58","slug":"koala-ai%e6%a8%a1%e5%9e%8b%e9%97%ae%e4%b8%96%ef%bc%9a8gb-%e5%86%85%e5%ad%98%e5%b0%b1%e8%83%bd%e8%bf%90%e8%a1%8c%ef%bc%8c2-%e7%a7%92%e5%86%85%e7%94%9f%e6%88%90%e9%ab%98%e8%b4%a8%e9%87%8f%e5%9b%be","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/4777.html","title":{"rendered":"KOALA AI model is released: it can run with 8GB of memory and generate high-quality images within 2 seconds"},"content":{"rendered":"<p data-vmark=\"4a3b\">A South Korean research team recently developed a new AI image generation model called <a href=\"https:\/\/www.1ai.net\/en\/tag\/koala\" title=\"View articles tagged with KOALA\" target=\"_blank\" >KOALA<\/a>, which significantly reduces hardware requirements<strong> and can generate high-quality images within 2 seconds.<\/strong><\/p>\n<p data-vmark=\"a2af\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-4778\" title=\"b6ba6586-18b1-4fba-b815-e9e390094415\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/03\/b6ba6586-18b1-4fba-b815-e9e390094415.jpg\" alt=\"b6ba6586-18b1-4fba-b815-e9e390094415\" width=\"1024\" height=\"869\" \/><\/p>\n<p data-vmark=\"24b0\">The key to the model is a technique called &quot;knowledge distillation&quot;, which greatly compresses the size of the open-source image generation tool Stable Diffusion XL.<\/p>\n<p data-vmark=\"9013\">Stable Diffusion XL currently has 2.56 billion parameters in total; using knowledge distillation, the team reduced that count to 700 million.<\/p>\n<p data-vmark=\"8e32\">As a result, the KOALA model does not require high-end graphics processors or specialized equipment to run smoothly. 
It needs only 8GB of memory to generate images, and generation time is cut to under 2 seconds.<\/p>\n<p data-vmark=\"8f13\">In essence, knowledge distillation transfers the knowledge of a large model into a smaller one without sacrificing quality or performance, allowing the smaller model to generate high-quality images faster.<\/p>\n<p data-vmark=\"781e\">In the team\u2019s tests, given the same prompt, \u201ca picture of an astronaut reading a book under the moon on Mars\u201d, the KOALA model took 1.6 seconds to generate an image, while OpenAI\u2019s DALL-E 3 model took 13.7 seconds and the DALL-E 2 model took 12.3 seconds.<\/p>","protected":false},"excerpt":{"rendered":"<p>A South Korean research team recently developed a new artificial intelligence image generation model called KOALA, which dramatically reduces hardware requirements and can generate high-quality images in less than 2 seconds. The key to the model is a technique called \"knowledge distillation,\" which dramatically reduces the size of the open-source image generation tool Stable Diffusion XL. Stable Diffusion XL currently has 2.56 billion parameters in total, which the team was able to reduce to 700 million using knowledge distillation. 
As a result, the KOALA model runs smoothly without the need for high-end graphics processors or sophisticated equipment, requiring only 8GB of memory.<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[167,1447],"collection":[],"class_list":["post-4777","post","type-post","status-publish","format-standard","hentry","category-news","tag-ai","tag-koala"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/4777","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=4777"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/4777\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=4777"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=4777"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=4777"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=4777"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}