{"id":28538,"date":"2025-02-17T09:33:01","date_gmt":"2025-02-17T01:33:01","guid":{"rendered":"https:\/\/www.1ai.net\/?p=28538"},"modified":"2025-02-11T21:35:42","modified_gmt":"2025-02-11T13:35:42","slug":"stable-diffusion%e6%80%8e%e4%b9%88%e7%94%a8%ef%bc%9fstable-diffusion-lora%e6%a8%a1%e5%9e%8b%e6%80%8e%e4%b9%88%e7%94%a8%ef%bc%8clora%e6%a8%a1%e5%9e%8b%e4%bd%bf%e7%94%a8%e6%94%bb%e7%95%a5%e6%95%99","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/28538.html","title":{"rendered":"How to use Stable Diffusion? How to use Stable Diffusion Lora models: a Lora usage tutorial"},"content":{"rendered":"<p>Today we're going to dive into one of the key concepts in AI painting: small models. In this section we will learn the definition of small models, how to download them, where to install them, and how to use them. We hope that through this article you will gain a more comprehensive grasp of small-model techniques.<\/p>\n<p>I. Origin of small models<\/p>\n<p>In the early days of AI painting, only large models existed, and their results were not ideal. To improve the capabilities of large models, people tried fine-tuning them, but this was costly. Thus small-model techniques were born, which let a large model learn new knowledge without being retrained.<\/p>\n<p>II. Classification of small models<\/p>\n<p>Small models fall into three main categories: <strong>Lora<\/strong>, <strong>Embedding<\/strong> and <strong>Hypernetwork<\/strong>.<\/p>\n<p>Over time, Lora has become the mainstream choice thanks to its strong results and broad capabilities.<\/p>\n<p>Embedding is liked for its small size and ease of use, but it is less adjustable than Lora. Hypernetwork is gradually being phased out.<\/p>\n<p>III. 
Definition and role of Lora<\/p>\n<p>Lora is a kind of small model. Formally, LoRA (Low-Rank Adaptation) is a fine-tuning technique, originally proposed for large language models, that reduces the number of trainable parameters and the amount of computation while improving training efficiency and generation quality. Simply put, Lora is like fitting an attachment to a large AI painting model so that it can learn a new style or concept.<\/p>\n<p>IV. How to download Lora<\/p>\n<p>We can download Lora in two ways:<\/p>\n<p><strong>Civitai download<\/strong>: filter and download the Lora models you like on Civitai (civitai.com, known in the Chinese community as Station C).<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-28539\" title=\"7b695fb2j00srithz00fpd000u000esm\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/02\/7b695fb2j00srithz00fpd000u000esm.jpg\" alt=\"7b695fb2j00srithz00fpd000u000esm\" width=\"1080\" height=\"532\" \/><\/p>\n<p><strong>Launcher download<\/strong>: download and automatically install Lora models from the Model Manager of the launcher.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-28540\" title=\"9cf3a16bj00srithz0082d000u000g9m\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/02\/9cf3a16bj00srithz0082d000u000g9m.jpg\" alt=\"9cf3a16bj00srithz0082d000u000g9m\" width=\"1080\" height=\"585\" \/><\/p>\n<p>Bilibili uploader @ReliableXuanXuan provides a one-stop resource pack for newcomers, which can also be downloaded here: https:\/\/pan.quark.cn\/s\/218e0e20a915<\/p>\n<p>V. Where Lora is installed<\/p>\n<p>If you downloaded Lora through the launcher, it is installed for you automatically. 
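<\/p>\n<p>As a minimal sketch of the manual install described below, assuming the WebUI lives at ~\/stable-diffusion-webui and the downloaded file is blindbox_v1_mix.safetensors in ~\/Downloads (both paths and the filename are placeholders for illustration, not from the original post):<\/p>

```shell
# Sketch: install a downloaded Lora file into the WebUI's models/Lora folder.
# WEBUI_DIR and LORA_FILE are placeholder assumptions; adjust them to your setup.
WEBUI_DIR="$HOME/stable-diffusion-webui"
LORA_FILE="$HOME/Downloads/blindbox_v1_mix.safetensors"

mkdir -p "$WEBUI_DIR/models/Lora"             # create the target folder if missing
if [ -f "$LORA_FILE" ]; then
  cp "$LORA_FILE" "$WEBUI_DIR/models/Lora/"   # the WebUI lists it after a refresh
fi
```

<p>A model installed this way is then invoked from the WebUI prompt as &lt;lora:blindbox_v1_mix:0.8&gt;, where 0.8 is the weight.<\/p>\n<p>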
If you downloaded it from Civitai or from a cloud drive, you need to copy the files into the following folder under the <a href=\"https:\/\/www.1ai.net\/en\/tag\/stable-diffusion\" title=\"_Other Organiser\" target=\"_blank\" >Stable Diffusion<\/a> root directory, and then invoke them through the additional networks panel of the WebUI:<\/p>\n<p>models\/Lora<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-28541\" title=\"f1a924d5j00srithz000vd000u0004om\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/02\/f1a924d5j00srithz000vd000u0004om.jpg\" alt=\"f1a924d5j00srithz000vd000u0004om\" width=\"1080\" height=\"168\" \/><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-28542\" title=\"ffef11e6j00srithz0010d000u0007em\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/02\/ffef11e6j00srithz0010d000u0007em.jpg\" alt=\"ffef11e6j00srithz0010d000u0007em\" width=\"1080\" height=\"266\" \/><\/p>\n<p>VI. Using Lora<\/p>\n<p>The five steps to using Lora are summarized below:<\/p>\n<ol>\n<li><strong>Read the author's notes<\/strong>: knowing which large model a Lora pairs with is key to getting good results.<\/li>\n<li><strong>Open the additional networks panel<\/strong>: invoke the Lora from the prompt and fill in an appropriate weight.<\/li>\n<li><strong>Use trigger words<\/strong>: some Loras require specific prompt words to take effect.<\/li>\n<li><strong>Generate with the author's parameters<\/strong>: fill in the various generation parameters by referring to the author's page.<\/li>\n<li><strong>Adjust based on the results<\/strong>: tune the weight as needed; the larger the weight, the stronger Lora's influence on the image, while a negative weight reduces its influence.<\/li>\n<\/ol>\n<p>VII. 
Lora case study: making blind-box figures<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-28543\" title=\"a3cd71e4j00srithz00bod000u000esm\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/02\/a3cd71e4j00srithz00bod000u000esm.jpg\" alt=\"a3cd71e4j00srithz00bod000u000esm\" width=\"1080\" height=\"532\" \/><\/p>\n<p>A blind-box Lora produces different results with different large models; it is best to use the officially recommended large model (revAnimated) and to keep your settings consistent with the author's sample images.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-28544\" title=\"b4b0771ej00srithz00byd000u0006bm\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/02\/b4b0771ej00srithz00byd000u0006bm.jpg\" alt=\"b4b0771ej00srithz00byd000u0006bm\" width=\"1080\" height=\"227\" \/><\/p>\n<p>VIII. Summary<\/p>\n<p>We hope this article helps you better understand and use small models to enhance your AI painting.<\/p>\n<p>You are welcome to browse Civitai, download the Lora models you like, and practice with them.<\/p>","protected":false},"excerpt":{"rendered":"<p>Today we will explore in depth one of the key concepts in AI painting: small models. In this section, we will learn the definition of small models, how to download them, where to install them and how to use them. We hope that, through this article, you will gain a more comprehensive grasp of small-model techniques. First, the origin of small models: in the early days of AI painting, only large models existed, and their results were not ideal. To enhance the capabilities of large models, people tried to fine-tune them, but this was costly. As a result, small-model technology emerged, which lets a large model learn new knowledge without retraining. Small models fall into three main categories: Lora, Embedding and Hypernetwork. 
Over time<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[149,144],"tags":[2328,197,198],"collection":[262],"class_list":{"0":"post-28538","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"hentry","6":"category-jiaocheng","7":"category-baike","8":"tag-ai","9":"tag-stable-diffusion","11":"collection-stablediffusion"},"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/28538","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=28538"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/28538\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=28538"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=28538"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=28538"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=28538"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}