{"id":15545,"date":"2024-07-14T09:52:07","date_gmt":"2024-07-14T01:52:07","guid":{"rendered":"https:\/\/www.1ai.net\/?p=15545"},"modified":"2024-07-14T09:52:07","modified_gmt":"2024-07-14T01:52:07","slug":"%e7%be%8e%e5%9b%bd%e5%8f%82%e8%ae%ae%e9%99%a2%e6%96%b0ai%e7%9b%b8%e5%85%b3%e6%b3%95%e6%a1%88%ef%bc%9a%e4%b8%ba%e9%98%b2ai%e6%8a%84%e8%a2%ad%e4%be%b5%e6%9d%83-%e7%a6%81%e6%ad%a2%e9%9d%9e%e6%b3%95","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/15545.html","title":{"rendered":"New AI-related bill in the U.S. Senate: Prohibiting illegal removal of digital watermarks to prevent AI plagiarism and infringement"},"content":{"rendered":"<p>As artificial intelligence technology develops rapidly, ensuring that content creators\u2019 work is not used unlawfully has become a global concern. A bipartisan group of lawmakers in the <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e7%be%8e%e5%9b%bd\" title=\"USA\" target=\"_blank\" >U.S.<\/a> <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%8f%82%e8%ae%ae%e9%99%a2\" title=\"Senate\" target=\"_blank\" >Senate<\/a> recently proposed a new <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e6%b3%95%e6%a1%88\" title=\"bill\" target=\"_blank\" >bill<\/a>, the COPIED Act, which aims to simplify the verification and detection of AI-generated content and to protect journalists and artists from having their work used by AI models without permission.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-15546\" title=\"202005261133094200_2\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/07\/202005261133094200_2.jpg\" alt=\"202005261133094200_2\" width=\"600\" height=\"400\" \/><\/p>\n<p>Note: Image generated by AI; licensed via Midjourney<\/p>\n<p>The COPIED Act, formally the &quot;Content Origin Protection and Integrity from Edited and Deepfaked Media Act,&quot; requires the National Institute of Standards and Technology (NIST) to develop 
standards and guidelines that help prove the origin of content and detect synthetic content, for example through watermarking. The bill would also require the agency to create security measures to prevent tampering, and would require AI tools used for creative or news content to let users attach provenance information to their work and to prohibit the removal of that information. Such content also could not be used to train AI models, according to the bill.<\/p>\n<p>Content owners, including broadcasters, artists and newspapers, would be able to sue companies they believe are using their material without permission or tampering with certification marks. State attorneys general and the Federal Trade Commission could also enforce the bill, which supporters say would prohibit anyone from &quot;removing, disabling or tampering with content provenance information,&quot; with certain exceptions for security research purposes.<\/p>\n<p>The COPIED Act is the latest step in the Senate\u2019s efforts to understand and regulate AI technology. Senate Majority Leader Chuck Schumer (D-NY) has led an effort to develop an AI roadmap for the Senate but has made clear that new laws will be drafted in committees. The COPIED Act has the support of a powerful committee leader, Senate Commerce Committee Chairwoman Maria Cantwell (D-WA). 
Senate AI Task Force member Martin Heinrich (D-NM) and Commerce Committee member Marsha Blackburn (R-TN) are also pushing the bill.<\/p>\n<p>Several publishing and artist groups, including the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA), the Recording Industry Association of America, the News\/Media Alliance and the Artists Rights Coalition, issued statements welcoming the introduction of the bill.<\/p>\n<p>\u201cAI\u2019s ability to generate stunningly accurate digital performers poses a real and imminent threat to our members\u2019 economic and reputational well-being, and autonomy,\u201d said Duncan Crabtree-Ireland, SAG-AFTRA national executive director and chief negotiator, in a statement. \u201cWe need a fully transparent and accountable supply chain for generative AI and the content it creates to protect everyone\u2019s fundamental right to control the use of their face, voice, and personality.\u201d<\/p>","protected":false},"excerpt":{"rendered":"<p>With AI technology advancing at a rapid pace, the question of how to ensure that content creators' work is not used illegally has become a global concern. A bipartisan group of lawmakers in the U.S. Senate recently introduced a new bill, the COPIED Act, which aims to streamline the verification and detection of AI-generated content and to protect journalists' and artists' work from unauthorized use by AI models. Note: Image generated by AI; licensed via Midjourney. The COPIED Act, formally the Content Origin Protection and Integrity from Edited and Deepfaked Media Act, would require the National Institute of Standards and Technology (NIST) to develop standards and guidelines that would help prove the source of content and detect synthetic content, for example through watermarking techniques. 
The bill also requires the agency to create security measures to prevent the tampering of<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[3487,3488,3486,849,236],"collection":[],"class_list":["post-15545","post","type-post","status-publish","format-standard","hentry","category-news","tag-ai","tag-3488","tag-3486","tag-849","tag-236"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/15545","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=15545"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/15545\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=15545"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=15545"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=15545"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=15545"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}