In recent years, with the rapid development of artificial intelligence (AI) technology, Japanese enterprises and some government departments have ventured into the field, hoping to improve efficiency and address problems such as labor shortages through technological means. However, a national project to build an AI system for protecting children from abuse has been declared a failure due to technical flaws.

According to the Yomiuri Shimbun, Japan's Children & Families Agency (CFA) had invested about 1 billion yen (about 48.5 million yuan at current exchange rates) in developing an artificial intelligence-based system to detect child abuse. The system was not intended to replace human experts; rather, it was meant to assist specialists responsible for the temporary custody of children in judging whether it is appropriate to return them to their parents. The system was trained on 5,000 confirmed cases of child abuse, and new cases are scored against 91 data points (including the child's injuries and the parents' attitude) to estimate the likelihood of abuse.
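The article does not describe the CFA system's internals, but the approach it outlines, scoring a new case from 91 structured data points using a model trained on labeled historical cases, resembles a standard supervised risk classifier. The sketch below is an assumption-laden illustration of that general idea, not the agency's actual implementation: the use of logistic regression, the synthetic data, and the 0-100 rescaling are all choices made here for clarity.

```python
# Minimal sketch of a risk-scoring classifier of the kind described above.
# NOT the CFA's actual system: the model choice, feature encoding, and
# synthetic data are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

N_CASES = 5000      # reported size of the training set (confirmed cases)
N_FEATURES = 91     # reported number of data points per case

# Synthetic stand-in for case records: each row is one case encoded as
# 91 numeric indicators (e.g. injury severity, parental attitude scores).
X = rng.normal(size=(N_CASES, N_FEATURES))
y = rng.integers(0, 2, size=N_CASES)   # 1 = abuse confirmed, 0 = not

model = LogisticRegression(max_iter=1000)
model.fit(X, y)

# Scoring a new case: the predicted probability is rescaled to a
# 0-100 risk score, matching the scale mentioned in the article.
new_case = rng.normal(size=(1, N_FEATURES))
risk_score = 100 * model.predict_proba(new_case)[0, 1]
print(f"Estimated abuse risk: {risk_score:.1f} / 100")
```

A real system of this kind lives or dies on the quality of its features and training data; if verbal testimony is poorly encoded relative to physical evidence, as the experts cited below suggest, the model will systematically underrate such cases.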
However, the system performed poorly in practical tests. In an analysis of 100 cases where the risk of abuse had already been identified, the system correctly flagged only 38; the remaining 62 were judged to be at "very low risk". In one case, a child said her mother had beaten her "half to death" by repeatedly hitting her head on the ground, yet the system rated the case at only 2 to 3 out of 100. Experts noted that the system relied too heavily on physical evidence and underweighted verbal testimony, and that 5,000 cases is far too little data to adequately train such a system. In addition, the agency acknowledged deficiencies in data collection, such as the inability to accurately record critical information like a child's weight loss or injuries.
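In evaluation terms, the reported figures amount to a recall of only 38% on cases already known to involve abuse risk. The short calculation below uses only the numbers quoted above; the metric names are standard, not terms used by the agency.

```python
# Rough calculation from the figures reported in the article:
# 100 cases with a previously identified abuse risk, of which the
# system flagged 38 and rated 62 as "very low risk".
flagged = 38
missed = 62
total = flagged + missed

recall = flagged / total               # share of real risk cases detected
false_negative_rate = missed / total   # share of real risk cases missed

print(f"Recall: {recall:.0%}")                            # 38%
print(f"False negative rate: {false_negative_rate:.0%}")  # 62%
```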
In light of these issues, the Children & Families Agency ultimately decided to drop the program.
As Japan faces the twin challenges of an aging population and labor shortages, child protection agencies are also facing accusations from victim advocates that Japan puts children at risk by being too accommodating to parents in abuse cases. With the rise of generative AI technologies such as large language models (LLMs), many companies are adding "AI" capabilities to their software systems, but the Children & Families Agency's experience suggests that a lack of high-quality, high-volume data remains a common bottleneck for machine learning and AI applications.