{"id":3798,"date":"2024-02-09T11:46:07","date_gmt":"2024-02-09T03:46:07","guid":{"rendered":"https:\/\/www.1ai.net\/?p=3798"},"modified":"2024-02-09T11:46:29","modified_gmt":"2024-02-09T03:46:29","slug":"openai%e7%bb%84%e5%bb%ba%e5%84%bf%e7%ab%a5%e5%ae%89%e5%85%a8%e5%9b%a2%e9%98%9f-%e9%98%b2%e6%ad%a2ai%e5%b7%a5%e5%85%b7%e8%a2%ab%e8%af%af%e7%94%a8","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/3798.html","title":{"rendered":"OpenAI forms child safety team to prevent misuse of AI tools"},"content":{"rendered":"<p>Under scrutiny from parents and activists, a well-known artificial intelligence company<a href=\"https:\/\/www.1ai.net\/en\/tag\/openai\" title=\"[View articles tagged with [OpenAI]]\" target=\"_blank\" >OpenAI<\/a>A new child safety team has been formed to look at ways to prevent<a href=\"https:\/\/www.1ai.net\/en\/tag\/ai%e5%b7%a5%e5%85%b7\" title=\"[SEE ARTICLES WITH [AI TOOL] LABELS]\" target=\"_blank\" >AI Tools<\/a>quilt<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e6%9c%aa%e6%88%90%e5%b9%b4%e4%ba%ba\" title=\"[Sees articles with [minor] labels]\" target=\"_blank\" >Minors<\/a>Methods of misuse or abuse.<\/p>\n<p>According to the new recruitment information on the OpenAI career page, the team will work with policy, legal and investigative teams within the company and external partners to manage \u201cprocesses, events and reviews\u201d related to underage users. 
The team is urgently seeking a child safety enforcement specialist to apply OpenAI's policies on AI-generated content and to review \u201csensitive\u201d content related to children.<\/p>\n<p class=\"article-content__img\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-3799\" title=\"202306301133123475_1\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/02\/202306301133123475_1.jpg\" alt=\"202306301133123475_1\" width=\"1000\" height=\"666\" \/><\/p>\n<p>Source Note: The image is AI-generated and licensed from Midjourney<\/p>\n<p>Large technology companies generally invest heavily in complying with regulations such as the Children&#039;s Online Privacy Protection Act, which governs the online content children can access and the data that may be collected about them. It is therefore not surprising that OpenAI is hiring child safety experts, especially given the potentially large number of underage users in the future. (OpenAI currently requires 13- to 18-year-olds to obtain parental permission and prohibits children under 13 from using its services.)<\/p>\n<p>The creation of the new team also reflects OpenAI\u2019s caution about potential policy violations. The company previously worked with Common Sense Media to develop child-friendly AI guidelines and has won its <span class=\"spamTxt\">first<\/span> education industry customers.<\/p>\n<p>As more children and teenagers turn to artificial intelligence for help with academic and personal problems, concerns about negative effects are growing. One survey found that about 30% of children use ChatGPT to cope with anxiety, some use it to mend interpersonal relationships, and others worry it could be used for harmful purposes such as creating convincing false information.<\/p>\n<p>Last year, some schools banned the use of ChatGPT, though the bans were later lifted. 
But calls for policies governing children's use of AI are growing, including UNESCO's call for governments to regulate the use of AI in education, with measures such as age restrictions and data protection.<\/p>\n<p>Clearly, companies like OpenAI still face major challenges in ensuring that underage users can use AI tools safely. Establishing a child safety team will undoubtedly help the company and the industry create a safer and more responsible AI environment for children.<\/p>","protected":false},"excerpt":{"rendered":"<p>Under scrutiny from parents and activists, the well-known artificial intelligence company OpenAI recently formed a new child safety team to study ways of preventing its tools from being misused or abused by minors. According to a new job listing on the OpenAI careers page, the team will work with policy, legal and investigative teams within the company, as well as external partners, to manage \u201cprocesses, incidents and reviews\u201d related to underage users. The team is urgently seeking a child safety enforcement specialist to apply OpenAI's policies on AI-generated content and to review \u201csensitive\u201d content related to children. 
Source Note: The image is AI-generated and licensed from Midjourney<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[387,190,1208],"collection":[],"class_list":["post-3798","post","type-post","status-publish","format-standard","hentry","category-news","tag-ai","tag-openai","tag-1208"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/3798","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=3798"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/3798\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=3798"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=3798"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=3798"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=3798"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}