{"id":19165,"date":"2024-09-02T10:43:39","date_gmt":"2024-09-02T02:43:39","guid":{"rendered":"https:\/\/www.1ai.net\/?p=19165"},"modified":"2024-09-02T10:43:39","modified_gmt":"2024-09-02T02:43:39","slug":"%e5%be%ae%e8%bd%af%e5%89%af%e6%80%bb%e8%a3%81-vik-singh%ef%bc%9aai%e8%81%8a%e5%a4%a9%e6%9c%ba%e5%99%a8%e4%ba%ba%e9%9c%80%e5%ad%a6%e4%bc%9a%e6%b1%82%e5%8a%a9%e8%80%8c%e9%9d%9e","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/19165.html","title":{"rendered":"Microsoft Vice President Vik Singh: AI chatbots need to &quot;learn to ask for help&quot; rather than &quot;create illusions&quot;"},"content":{"rendered":"<p>On September 1, local time, according to AFP,<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%be%ae%e8%bd%af\" title=\"[View articles tagged with [Microsoft]]\" target=\"_blank\" >Microsoft<\/a>&quot;Frankly, what&#039;s missing from (generative AI) today is the ability to say &#039;yes, yes ...<strong>Hey I&#039;m not sure and I need help<\/strong>&#039;. &quot;<\/p>\n<p>Since last year, Microsoft, Google, and their competitors have been rapidly deploying generative AI applications such as ChatGPT and Gemini, which can generate a variety of content on demand and give users the illusion of &quot;omniscience.&quot; Despite the progress made in the development of generative AI, they still<strong>Hallucinations or made-up answers<\/strong>.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-19166\" title=\"9ff7e173j00sj5zjv000td000q400hfm\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/09\/9ff7e173j00sj5zjv000td000q400hfm.jpg\" alt=\"9ff7e173j00sj5zjv000td000q400hfm\" width=\"940\" height=\"627\" \/><\/p>\n<p>Vik Singh insists that \u201creally smart people\u201d are working on finding ways to make chatbots more useful in the workplace.<strong>Admit and ask for help when you don\u2019t know the right answer<\/strong>.<\/p>\n<p>Meanwhile, Marc Benioff, CEO of cloud software giant Salesforce, said 
last week that he has seen many customers grow increasingly frustrated with <strong>Microsoft Copilot\u2019s misleading performance<\/strong>.<\/p>\n<p>In recent years, artificial intelligence has flourished, and applications such as chatbots have become increasingly popular. People can get information from these chatbots (such as ChatGPT) through simple instructions. However, these chatbots are still prone to &quot;hallucination&quot; problems, that is, providing wrong answers and sometimes even dangerous information. Among the causes of &quot;hallucination&quot; are <strong>inaccurate training data, insufficient generalization ability, and side effects in the data collection process<\/strong>.<\/p>","protected":false},"excerpt":{"rendered":"<p>On September 1, local time, according to AFP, Vik Singh, a Microsoft corporate vice president, said in an interview, \"Frankly, the capability that's really missing today [in generative AI] is the ability to proactively say, when the model isn't sure [whether its own answer is accurate], 'Hey, I'm not sure, I need help! '.\" Since last year, Microsoft, Google, and their competitors have been rapidly deploying generative AI apps such as ChatGPT and Gemini, which generate content on demand and give users the illusion of knowing everything. Despite advances in the development of generative AI, they can still \"hallucinate\" or fabricate answers.
Vik Singh insists that \"really smart people\" are trying to find the answers.<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[270,280],"collection":[],"class_list":["post-19165","post","type-post","status-publish","format-standard","hentry","category-news","tag-ai","tag-280"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/19165","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=19165"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/19165\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=19165"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=19165"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=19165"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=19165"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}