{"id":13282,"date":"2024-06-15T11:58:46","date_gmt":"2024-06-15T03:58:46","guid":{"rendered":"https:\/\/www.1ai.net\/?p=13282"},"modified":"2024-06-15T12:00:14","modified_gmt":"2024-06-15T04:00:14","slug":"%e4%bb%80%e4%b9%88%e6%98%af%e8%84%b1%e8%a1%a3ai%ef%bc%9f%e7%bb%99%e7%88%b6%e6%af%8d%e5%92%8c%e7%85%a7%e9%a1%be%e8%80%85%e7%9a%84%e6%8c%87%e5%af%bc","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/13282.html","title":{"rendered":"What is undressing AI? What is AI &quot;one-click undressing&quot;?"},"content":{"rendered":"<h2><strong><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-13283\" title=\"what-is-undress-ai-feature.webp\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/06\/what-is-undress-ai-feature.webp.jpg\" alt=\"what-is-undress-ai-feature.webp\" width=\"820\" height=\"431\" \/><\/strong><\/h2>\n<h2><strong>What is &quot;undress AI&quot;?<\/strong><\/h2>\n<p><a href=\"https:\/\/www.1ai.net\/en\/tag\/undress-ai\" title=\"_Other Organiser\" target=\"_blank\" >Undress AI<\/a> describes tools that use artificial intelligence to remove a person&#039;s clothing from an image.<\/p>\n<p>While each app or website works a little differently, they all offer a similar service. Although the manipulated image does not show the victim\u2019s actual body, it can imply that it does.<\/p>\n<p>Perpetrators who use undress AI tools may keep the images for themselves or share them more widely. They may use these images for sexual coercion (sextortion), bullying and abuse, or as a form of revenge porn.<\/p>\n<p>Children and young people face additional harm if someone uses this technology to &quot;undress&quot; them. A report by the Internet Watch Foundation (IWF) found more than 11,000 AI-generated, potentially criminal images of children on a dark web forum dedicated to child sexual abuse material (CSAM). 
They assessed approximately 3,000 of these images as criminal.<\/p>\n<p>The IWF said it also found &quot;many examples of AI-generated images that include known victims and famous children.&quot; Generative AI can only create convincing images if it learns from accurate source material; essentially, AI tools that generate CSAM need to learn from real images of child abuse.<\/p>\n<h2>Risks to watch out for<\/h2>\n<p>Undress AI tools use suggestive language to draw users in, and children may follow their curiosity in response to it.<\/p>\n<p>Children and teenagers may not yet understand the law, so they may find it hard to tell harmful tools apart from those that promote harmless fun.<\/p>\n<h2>Inappropriate content and conduct<\/h2>\n<p>The curiosity and novelty of undress AI tools could expose children to inappropriate content. Because the tools don\u2019t show \u201creal\u201d nudity, children may think it\u2019s OK to use them. If they then share an image with a friend \u201cfor a laugh,\u201d they could be breaking the law without even knowing it.<\/p>\n<p>Without intervention from a parent or caregiver, they may continue the behavior, even if it hurts others.<\/p>\n<h2>Privacy and security risks<\/h2>\n<p>Many legitimate generative AI tools require a fee or subscription to create images. So if a deepnude site is free, it may produce low-quality images or have lax security. If a child uploads a clothed picture of themselves or a friend, the site or app could misuse it, including any deepnudes it creates.<\/p>\n<p>Children using these tools are unlikely to read the terms of service or privacy policy, so they may expose themselves to risks they don\u2019t understand.<\/p>\n<h2>Production of Child Sexual Abuse Material (CSAM)<\/h2>\n<p>The IWF also reports that \u201cself-generated\u201d CSAM circulating online increased by 417% from 2019 to 2022. 
Note that the term \u201cself-generated\u201d is imperfect, because in most cases abusers force children to create these images.<\/p>\n<p>However, with undress AI tools, children could unknowingly create AI-generated CSAM. If they upload a clothed photo of themselves or another child, someone could \u201cnudify\u201d that image and share it more widely.<\/p>\n<h2>Cyberbullying, abuse and harassment<\/h2>\n<p>As with other types of deepfakes, people can use undress AI tools, or &quot;deepnudes&quot;, to bully others. This could include claiming that a peer sent a nude image of themselves when they didn&#039;t, or using AI to create a nude bearing a victim&#039;s likeness and then mocking them with it.<\/p>\n<p>It&#039;s important to remember that sharing nude images of peers is both illegal and abusive.<\/p>\n<h2>How common is &quot;deepnude&quot; technology?<\/h2>\n<p>Research shows that the use of these AI tools is increasing, especially against female victims.<\/p>\n<p>One undress AI website says its technology \u201cdoes not work with male subjects\u201d. This is because the tool was trained on images of women, as is true of most such tools. Of the AI-generated CSAM investigated by the Internet Watch Foundation, 99.6% featured female children.<\/p>\n<p>Graphika\u2019s research highlights that referral link spam for undress AI services increased by more than 2,000% in 2023. The report also found that 34 of these providers received more than 24 million unique visitors to their sites in a single month. 
They predict that this will lead to more instances of online harm, including sextortion and CSAM.<\/p>\n<p>Offenders are likely to continue targeting girls and women more than boys and men, especially while these tools learn primarily from images of women.<\/p>\n<h2>What does UK law say?<\/h2>\n<p>Until recently, those who created explicit deepfake images were not breaking the law unless the images were of children.<\/p>\n<p>However, the Ministry of Justice has announced new laws that will change that. Under them, those who create sexually explicit deepfake images of adults without their consent will face prosecution, and those convicted will also face an &quot;unlimited fine.&quot;<\/p>\n<p>This contrasts with a statement published in early 2024 that the creation of deepfake intimate images was &quot;not sufficiently harmful or culpable to constitute a criminal offence&quot;.<\/p>\n<p>Until last year, perpetrators could create and share these images of adults without breaking the law. However, under the Online Safety Act 2023, it is now an offence to share intimate images generated by artificial intelligence without consent.<\/p>\n<p>Generally speaking, the law covers any image of a sexual nature, including those featuring full or partial nudity.<\/p>\n<p>One caveat is that the law relies on intent to cause harm: the person creating the sexually explicit deepfake must do so to humiliate or otherwise harm the victim. Because proving intent is fairly difficult, it may be hard to actually prosecute those who create sexually explicit deepfakes.<\/p>","protected":false},"excerpt":{"rendered":"<p>What is \"Undress AI\"? Undress AI describes a tool that uses artificial intelligence to remove a person's clothes from an image. While each app or website may work differently, they all offer a similar service. 
Although the processed image does not show the victim's actual body, it can imply that it does. Offenders who use undress AI tools may keep these images for themselves, or may share them more widely. They could use these images for sexual coercion (sextortion), bullying\/abuse or as a form of revenge porn. Children and young people face additional harm if they are 'undressed' using this technology. A report by the Internet Watch Foundation (IWF) found thousands of AI-generated images of children on a dark web forum dedicated to child sexual abuse material.<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[144],"tags":[3097,3096],"collection":[],"class_list":["post-13282","post","type-post","status-publish","format-standard","hentry","category-baike","tag-undress-ai","tag-ai"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/13282","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=13282"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/13282\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=13282"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=13282"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=13282"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=13282"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}