{"id":4293,"date":"2024-02-24T08:20:32","date_gmt":"2024-02-24T00:20:32","guid":{"rendered":"https:\/\/www.1ai.net\/?p=4293"},"modified":"2024-02-24T08:20:32","modified_gmt":"2024-02-24T00:20:32","slug":"%e4%ba%9a%e9%a9%ac%e9%80%8a%e8%ad%a6%e5%91%8a%e5%91%98%e5%b7%a5%ef%bc%9a%e5%b7%a5%e4%bd%9c%e4%b8%ad%e7%a6%81%e6%ad%a2%e4%bd%bf%e7%94%a8%e7%ac%ac%e4%b8%89%e6%96%b9%e7%94%9f%e6%88%90%e5%bc%8f-ai","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/4293.html","title":{"rendered":"Amazon warns employees not to use third-party generative AI tools at work"},"content":{"rendered":"<p data-vmark=\"be34\">According to Business Insider, an internal <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e4%ba%9a%e9%a9%ac%e9%80%8a\" title=\"[View articles tagged with [Amazon]]\" target=\"_blank\" >Amazon<\/a> email warned employees against using third-party generative <a href=\"https:\/\/www.1ai.net\/en\/tag\/ai%e5%b7%a5%e5%85%b7\" title=\"[SEE ARTICLES WITH [AI TOOL] LABELS]\" target=\"_blank\" >AI tools<\/a> for work purposes.<\/p>\n<p data-vmark=\"5e30\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-4294\" title=\"daf3450b-37b6-4b8b-a2a1-39f798d4b6c7\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/02\/daf3450b-37b6-4b8b-a2a1-39f798d4b6c7.png\" alt=\"daf3450b-37b6-4b8b-a2a1-39f798d4b6c7\" width=\"767\" height=\"625\" \/><\/p>\n<p data-vmark=\"7ac6\">According to internal documents obtained by Business Insider, Amazon warned employees in an email that \"while generative AI tools may seem to make our jobs easier, <span class=\"accentTextColor\">we must ensure that we don't use them for any work involving Amazon's confidential information<\/span>.\" The email also emphasized, \"When using third-party generative AI tools, never share any Amazon, customer, or employee confidential data. 
Typically, confidential data is defined as non-public data.\"<\/p>\n<p data-vmark=\"345d\">Amazon's internal policy document on third-party generative AI use and interactions states that companies providing generative AI services, such as OpenAI with its ChatGPT tool, may claim license rights to or ownership of employee input. The policy clarifies, \"This means that Generative AI owners may extract, review, use, and distribute any output, including emails, FAQs, internal wiki pages, code, confidential information, documents, pre-releases, and strategic materials. As a result, all Amazon employees must adhere to Amazon's standard policies regarding confidential information and security when using Generative AI.\"<\/p>\n<p data-vmark=\"3fe7\">Around the same time last year, an Amazon corporate lawyer informally warned employees not to provide OpenAI's ChatGPT with any confidential Amazon information, including Amazon code under development. He said there had been cases where ChatGPT's responses were similar to internal Amazon data. According to internal sources, Amazon employees were already using the AI tool as a software \"coding assistant\".<\/p>\n<p data-vmark=\"2c3e\">As generative AI tools become more prevalent, questions such as how confidential information should be handled and who owns the input and output remain largely unresolved. Amazon may be particularly sensitive to this, as its main competitor, Microsoft, has not only invested heavily in OpenAI but also offers its own generative AI products.<\/p>\n<p data-vmark=\"b634\">Although Amazon had previously banned employees from using Microsoft software, it has recently eased that restriction. 
According to Business Insider, Amazon recently signed a five-year, $1 billion (IT Home note: currently about 7.2 billion yuan) Microsoft 365 licensing agreement with Microsoft.<\/p>\n<p data-vmark=\"4246\">Amazon's internal generative AI policy states that employees can use third-party models for work purposes, subject to approval from supervisors and the legal department, as well as compliance with relevant security reviews. The recent email also notes that some employees have access to Amazon's internal tool Bedrock, which is considered a \"safer alternative.\"<\/p>\n<p data-vmark=\"99a4\">Amazon spokesman Adam Montgomery said the company has been developing generative AI and <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%a4%a7%e5%9e%8b%e6%9c%ba%e5%99%a8%e5%ad%a6%e4%b9%a0%e6%a8%a1%e5%9e%8b\" title=\"[Sees articles with tags]\" target=\"_blank\" >large machine learning models<\/a>, and that employees use its AI models on a daily basis: \"We have put in place appropriate security measures for employee use of these technologies, including guidance on accessing third-party generative AI services and protecting confidential information.\"<\/p>","protected":false},"excerpt":{"rendered":"<p>An internal Amazon.com email warned employees against using third-party generative AI tools for work purposes, Business Insider reports. According to internal documents obtained by Business Insider, Amazon warned employees in an email that \"While generative AI tools may seem to make our jobs easier, we must ensure that we don't use them for any work that contains confidential Amazon information.\" The email also emphasized, \"When using third-party generative AI tools, never share any Amazon, customer, or employee confidential data. 
Typically, confidential data is defined as non-public data.\" Amazon's internal policy document for third-party generative AI use and interactions states that providing generative<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[387,370,1320],"collection":[],"class_list":["post-4293","post","type-post","status-publish","format-standard","hentry","category-news","tag-ai","tag-370","tag-1320"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/4293","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=4293"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/4293\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=4293"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=4293"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=4293"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=4293"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}