{"id":4284,"date":"2024-02-24T08:15:41","date_gmt":"2024-02-24T00:15:41","guid":{"rendered":"https:\/\/www.1ai.net\/?p=4284"},"modified":"2024-02-24T08:16:00","modified_gmt":"2024-02-24T00:16:00","slug":"%e5%be%ae%e8%bd%af%e5%8f%91%e5%b8%83-pyrit-%e5%b7%a5%e5%85%b7%ef%bc%8c%e5%b8%ae%e4%b8%93%e5%ae%b6%e5%92%8c%e5%b7%a5%e7%a8%8b%e5%b8%88%e8%af%86%e5%88%ab%e7%94%9f%e6%88%90%e5%bc%8f-ai-%e6%a8%a1%e5%9e%8b","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/4284.html","title":{"rendered":"Microsoft releases PyRIT tool to help experts and engineers identify risks in generative AI models"},"content":{"rendered":"<p data-vmark=\"f72e\"><a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%be%ae%e8%bd%af\" title=\"[View articles tagged with [Microsoft]]\" target=\"_blank\" >Microsoft<\/a>Recently released an open source automation framework <a href=\"https:\/\/www.1ai.net\/en\/tag\/pyrit\" title=\"_Other Organiser\" target=\"_blank\" >PyRIT<\/a>, a Python risk identification toolkit,<strong>It mainly helps security experts and machine learning engineers identify the risks of generative AI and prevent their artificial intelligence systems from getting out of control.<\/strong><\/p>\n<p data-vmark=\"2429\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-4286\" title=\"32864ca7-b277-4b43-b3f1-26e19f1b5ec1\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/02\/32864ca7-b277-4b43-b3f1-26e19f1b5ec1.jpg\" alt=\"32864ca7-b277-4b43-b3f1-26e19f1b5ec1\" width=\"624\" height=\"351\" \/><\/p>\n<p data-vmark=\"d45f\">Microsoft\u2019s AI Red Team is already using the tool to examine risks in generative AI systems, including Copilot.<\/p>\n<p data-vmark=\"3c52\">Microsoft emphasized that by making its internal tools available to the public and sharing other investments in the AI Red Team, its goal is to democratize AI safety.<\/p>\n<p data-vmark=\"b497\">Red Team is a group that plays the role of the enemy or competitor in military exercises, 
cybersecurity exercises, and the like; those who play the friendly side are called the Blue Team. A Red Team is usually defined as an adversarial force that improves a product's security by attacking it.<\/p>\n<p data-vmark=\"7d60\">Microsoft's AI Red Team has set up a cross-disciplinary group of security experts to manage complex attack exercises. The PyRIT framework works as follows:<\/p>\n<ul class=\"list-paddingleft-2\">\n<li>\n<p data-vmark=\"0040\">The PyRIT agent sends malicious prompts to the target generative AI system; when it receives a response, it forwards that response to the PyRIT scoring engine.<\/p>\n<\/li>\n<li>\n<p data-vmark=\"7fc6\">The scoring engine sends its assessment back to the PyRIT agent; the agent then crafts new prompts based on the scoring engine&#039;s feedback.<\/p>\n<\/li>\n<li>\n<p data-vmark=\"900e\">This automated process continues until the security expert achieves the desired result.<\/p>\n<\/li>\n<\/ul>\n<p data-vmark=\"b740\">Microsoft has hosted the relevant code on\u00a0<a href=\"https:\/\/github.com\/Azure\/PyRIT\">GitHub<\/a>; interested users can read further there.<\/p>","protected":false},"excerpt":{"rendered":"<p>Microsoft recently released the open-source automation framework PyRIT, a Python risk identification toolkit that focuses on helping security experts and machine learning engineers identify risks in generative AI and stop their AI systems from getting out of control. Microsoft's AI Red Team has used the tool to check for risks in generative AI systems, including Copilot. Microsoft has emphasized its goal of democratizing AI safety by making internal tools available to the public, as well as sharing its other investments in the AI Red Team. A Red Team is a group of people who play the role of an enemy or competitor in military exercises, cybersecurity drills, and the like, while those who play the friendly side are called the Blue Team. 
<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[1317,280],"collection":[],"class_list":["post-4284","post","type-post","status-publish","format-standard","hentry","category-news","tag-pyrit","tag-280"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/4284","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=4284"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/4284\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=4284"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=4284"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=4284"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=4284"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}