{"id":11326,"date":"2024-05-27T09:22:15","date_gmt":"2024-05-27T01:22:15","guid":{"rendered":"https:\/\/www.1ai.net\/?p=11326"},"modified":"2024-05-27T09:22:15","modified_gmt":"2024-05-27T01:22:15","slug":"%e8%b0%b7%e6%ad%8c-ceo-%e6%89%bf%e8%ae%a4-ai-%e6%91%98%e8%a6%81%e5%8a%9f%e8%83%bd%e5%ad%98%e5%9c%a8%e5%b9%bb%e8%a7%89%e9%97%ae%e9%a2%98%ef%bc%9a%e5%b0%9a%e6%97%a0%e8%a7%a3%e5%86%b3","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/11326.html","title":{"rendered":"Google CEO admits AI summary feature has &quot;hallucination&quot; problem: no solution yet"},"content":{"rendered":"<p data-vmark=\"022d\"><a href=\"https:\/\/www.1ai.net\/en\/tag\/%e8%b0%b7%e6%ad%8c\" title=\"[View articles tagged with [Google]]\" target=\"_blank\" >Google<\/a>'s new <a href=\"https:\/\/www.1ai.net\/en\/tag\/ai%e6%91%98%e8%a6%81\" title=\"[See articles tagged with [AI Summary]]\" target=\"_blank\" >AI Summary<\/a> feature in Search, &quot;AI Overviews&quot;, has come under fire recently because it often returns seriously wrong information. For example, it once suggested that users use glue to keep the cheese on a pizza from sliding off.<\/p>\n<p data-vmark=\"bc0f\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-11327\" title=\"71e6a1ae-eb10-40f3-9c81-bbb8854267e1\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/05\/71e6a1ae-eb10-40f3-9c81-bbb8854267e1.jpg\" alt=\"71e6a1ae-eb10-40f3-9c81-bbb8854267e1\" width=\"433\" height=\"650\" \/><\/p>\n<p>Image source: Pexels<\/p>\n<p data-vmark=\"190c\">Earlier this week, according to The Verge, Google CEO Sundar Pichai admitted in an interview that the &quot;hallucinations&quot; produced by the &quot;AI Overviews&quot; feature are &quot;inherent flaws&quot; of large language models (LLMs), the core technology behind the feature. 
Pichai said, <span class=\"accentTextColor\">&quot;This problem is still unsolved.&quot;<\/span><\/p>\n<p data-vmark=\"5c09\">This means that even as Google engineers work hard to fix the various strange and seriously wrong answers produced by the &quot;AI Summary&quot; feature, <span class=\"accentTextColor\">such problems will continue to occur.<\/span><\/p>\n<p data-vmark=\"c5a2\">However, Pichai seemed to downplay the severity of the errors. &quot;Just because the AI Summary feature sometimes gets it wrong doesn&#039;t mean it&#039;s not useful. I don&#039;t think that&#039;s the right way to look at it,&quot; he said. &quot;Have we made progress? Yes, we have. We&#039;ve made a lot of progress on our factual accuracy metrics compared to last year. The industry is improving, but the problem is not solved yet.&quot;<\/p>\n<p data-vmark=\"1e45\">Although Pichai is optimistic about the usefulness of the &quot;AI Summary&quot; feature, its errors have caused an uproar online, with many users posting examples of the wrong information it generates. This further damages the credibility of Google&#039;s search engine, which has already been criticized for serving users junk results.<\/p>\n<p data-vmark=\"c5a3\">AI consultant and SEO expert Britney Muller wrote on social media: &quot;People expect AI to be far more accurate than traditional methods, but this is not always the case! Google is taking a risky gamble in search to outperform competitors Perplexity and OpenAI, but they could be using AI for larger, more valuable use cases.&quot;<\/p>","protected":false},"excerpt":{"rendered":"<p>Google's newly introduced \"AI Overviews\" summary feature in Search has recently come under fire because it often returns seriously wrong information. For example, it has recommended that users use glue to keep cheese from sliding off pizza. 
Earlier this week, according to the tech outlet The Verge, Google CEO Sundar Pichai acknowledged in an interview that the \u201challucinations\u201d produced by the \u201cAI Overviews\u201d feature are \u201cinherent flaws\u201d of large language models (LLMs), the core technology behind the feature. According to Pichai, there is no solution to this problem yet. That means...<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[1095,281],"collection":[],"class_list":["post-11326","post","type-post","status-publish","format-standard","hentry","category-news","tag-ai","tag-281"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/11326","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=11326"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/11326\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=11326"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=11326"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=11326"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=11326"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}