{"id":28144,"date":"2025-02-05T18:30:42","date_gmt":"2025-02-05T10:30:42","guid":{"rendered":"https:\/\/www.1ai.net\/?p=28144"},"modified":"2025-02-05T18:30:42","modified_gmt":"2025-02-05T10:30:42","slug":"%e5%8a%a0%e5%b7%9e%e6%8b%9f%e5%87%ba%e6%96%b0%e8%a7%84%ef%bc%9aai-%e5%85%ac%e5%8f%b8%e5%bf%85%e9%a1%bb%e5%ae%9a%e6%9c%9f%e6%8f%90%e9%86%92%e5%84%bf%e7%ab%a5%e8%81%8a%e5%a4%a9%e6%9c%ba%e5%99%a8","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/28144.html","title":{"rendered":"California Proposes New Rule: AI Companies Must Regularly Remind Children That Chatbots Aren't Human"},"content":{"rendered":"<p>February 5 news: A new bill (SB 243) has been introduced in California, <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e7%be%8e%e5%9b%bd\" title=\"_Other Organiser\" target=\"_blank\" >USA<\/a>, that would require artificial intelligence (<a href=\"https:\/\/www.1ai.net\/en\/tag\/ai\" title=\"[View articles tagged with [AI]]\" target=\"_blank\" >AI<\/a>) companies to regularly remind children that <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e8%81%8a%e5%a4%a9%e6%9c%ba%e5%99%a8%e4%ba%ba\" title=\"[View articles tagged with [chatbot]]\" target=\"_blank\" >chatbots<\/a> are AI, not humans. 
The bill, introduced by California State Senator Steve Padilla, <strong>aims to protect children from the \"addictive, isolating, and influential\" aspects of AI<\/strong>.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-28145\" title=\"93aa51a6j00sr7h5u001sd000st00dbp\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/02\/93aa51a6j00sr7h5u001sd000st00dbp.jpg\" alt=\"93aa51a6j00sr7h5u001sd000st00dbp\" width=\"1037\" height=\"479\" \/><\/p>\n<p>1AI notes that, <strong>in addition to restricting companies' use of \"addictive modes of interaction,\" the bill would require AI companies to file annual reports with the state Department of Health Care Services<\/strong>, detailing the number of times their platforms detected suicidal ideation in children and the number of times the chatbot raised the topic. The bill would also require companies to inform users that their chatbots may not be suitable for certain children.<\/p>\n<p>Last year, a parent filed a wrongful death lawsuit against Character.AI, alleging that the company's customized AI chatbot was \"unreasonably dangerous\" after her child died by suicide following prolonged conversations with it. Another lawsuit accused the company of sending \"harmful content\" to teenagers. Character.AI subsequently announced that it was working on parental controls and had developed a new AI model for teenage users that blocks \"sensitive or suggestive\" output.<\/p>\n<p>At a press conference, Senator Padilla said, \"<strong>Our children are not guinea pigs for tech companies to experiment on at the cost of their mental health<\/strong>. We need common-sense protections for chatbot users to prevent developers from employing tactics that they know are addictive and predatory.\"<\/p>\n<p>AI chatbots may soon be the next regulatory target for lawmakers as U.S. 
states and the federal government ramp up their efforts to regulate the safety of social media platforms.<\/p>","protected":false},"excerpt":{"rendered":"<p>February 5, 2025 - A new bill (SB 243) has been introduced in California that would require artificial intelligence (AI) companies to regularly remind children that chatbots are AI and not human. The bill, introduced by California State Senator Steve Padilla, aims to protect children from the \"addictive, isolating, and influential\" aspects of AI. 1AI notes that in addition to restricting companies' use of \"addictive modes of interaction,\" the bill would require AI companies to submit an annual report to the state Department of Health Care Services detailing the number of times the platforms detected suicidal ideation in children, as well as the number of times the chatbot raised the topic. In addition, the bill would require companies to inform users that their chatbots may not be appropriate for certain children.<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[411,236,275],"collection":[],"class_list":["post-28144","post","type-post","status-publish","format-standard","hentry","category-news","tag-ai","tag-236","tag-275"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/28144","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=28144"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/28144\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=28144"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=28144"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=28144"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=28144"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}