{"id":5102,"date":"2024-03-08T09:44:11","date_gmt":"2024-03-08T01:44:11","guid":{"rendered":"https:\/\/www.1ai.net\/?p=5102"},"modified":"2024-03-08T09:44:11","modified_gmt":"2024-03-08T01:44:11","slug":"%e6%b5%8b%e8%af%95%e6%98%be%e7%a4%baopenai-gpt%e5%9c%a8%e7%ae%80%e5%8e%86%e6%8e%92%e5%ba%8f%e4%b8%ad%e5%ad%98%e5%9c%a8%e7%a7%8d%e6%97%8f%e5%81%8f%e8%a7%81","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/5102.html","title":{"rendered":"Tests show OpenAI GPT has racial bias in resume sorting"},"content":{"rendered":"<p>Bloomberg&#039;s experiment shows that <a href=\"https:\/\/www.1ai.net\/en\/tag\/openai\" title=\"View articles tagged with OpenAI\" target=\"_blank\" >OpenAI<\/a> <a href=\"https:\/\/www.1ai.net\/en\/tag\/gpt\" title=\"View articles tagged with GPT\" target=\"_blank\" >GPT<\/a>-3.5 exhibits a clear racial bias when ranking resumes bearing fictitious names. The experiment randomly assigned equally qualified resumes names that, according to voter and census data, are associated with a particular race or ethnicity at least 90% of the time.<\/p>\n<p>When these resumes were ranked 1,000 times, GPT-3.5 favored names from certain ethnic groups more often than others, violating the benchmark used to assess job discrimination against protected groups. Of the four positions covered by the experiment (HR Business Partner, Senior Software Engineer, Retail Manager, and Financial Analyst), names associated with Black Americans were the least likely to be ranked by GPT as the top candidate for the Financial Analyst and Software Engineer roles.<\/p>\n<p>The experiment also showed that GPT had different gender and racial preferences for different positions. 
Although GPT did not consistently favor any single group, it picked different winners and losers depending on the situation. In addition, obvious bias was also found when testing with the less widely used GPT-4.<\/p>\n<p>In response to Bloomberg&#039;s detailed questions, OpenAI said that results from using GPT models &quot;out of the box&quot; may not reflect how its customers use them. Companies usually take further measures to mitigate bias when using its technology, including fine-tuning the software&#039;s responses and managing system messages.<\/p>\n<p>While generative AI techniques have attracted widespread attention for their use in the HR field, this experiment highlights the serious risk of automated discrimination when these techniques are used for recruiting and hiring. Adjusting AI models for bias remains a major challenge for AI companies and researchers, and automated recruiting systems could further complicate corporate diversity efforts.<\/p>","protected":false},"excerpt":{"rendered":"<p>Experiments by Bloomberg have shown that OpenAI GPT-3.5 has a significant racial bias when using fictitious names for resume sorting. The experiment extracted names associated with a specific race or ethnicity at least 90% of the time from voter and census data and randomly assigned them to equally qualified resumes. When these resumes were sorted 1,000 times, GPT-3.5 tended to favor names of certain ethnicities more frequently, violating the benchmark for assessing job discrimination against protected groups. Of the four positions covered by the experiment (HR Business Partner, Senior Software Engineer, Retail Manager, and Financial Analyst), names associated with Black Americans were the least likely to be ranked as top candidates by GPT in the Financial Analyst and Software Engineer roles. 
The experiment also showed that GPT exhibited different gender and racial preferences across positions.<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[271,190],"collection":[],"class_list":["post-5102","post","type-post","status-publish","format-standard","hentry","category-news","tag-gpt","tag-openai"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/5102","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=5102"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/5102\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=5102"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=5102"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=5102"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=5102"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}